As part of the Data & Analytics team, you will be responsible for designing, developing, and supporting the infrastructure that collects, analyzes, and transforms large volumes of business data into enterprise reporting and analytics solutions. You will also optimize data flow and collection for cross-functional teams. The Data Engineer will support our software engineers, database architects, and analysts on data initiatives and will ensure that an optimal data delivery architecture remains consistent across ongoing projects. You must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products.
These are some things we will expect:
- Develop core data relationship, data ingestion, and data transformation services, as well as search capabilities.
- Maintain all facets of data infrastructure, including backups/disaster recovery, software deployment/configuration, security best practices, performance tuning, and troubleshooting.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS "big data" technologies.
- Review project objectives and determine best technology for implementation. Implement best practice standards for development, build and deployment automation.
- Design, maintain and optimize highly distributed analytics data stores
- Write well-abstracted, reusable and efficient code
- Analyze large data sets and extract patterns that lead to greater customer insight, optimized performance, enhanced engagement, improved retention, increased revenue and decreased cost
- Work with business partners and management teams to ensure the collection and analysis of appropriate data and metrics to facilitate improvements in processes and profit.
- Work with executives and leadership teams to formulate hypotheses and design research aimed at improving key performance indicators
- Utilize expertise in data modeling, ETL architecture, and report design for department initiatives, and produce detailed documentation including data flow diagrams, logical diagrams, and physical diagrams as needed
- Design and build automated algorithms to find patterns in data and create predictive models
What we need from you:
- 2+ years of relevant Data Engineering technical experience
- Bachelor's degree in a technology-based discipline preferred
- Advanced SQL knowledge and experience with relational databases, including query authoring, as well as working familiarity with a variety of database systems
- Advanced working knowledge of event-driven architectures such as Event Sourcing and CQRS
- Experience building processes that support data transformation, data structures, metadata, dependency management, and workload management
- Experience with AWS cloud services: Lambda, S3, Glue, Redshift, and Athena, or their open-source equivalents (e.g., Zeppelin, Presto)
- Experience building data pipelines with AWS Data Pipeline, Airflow, or Luigi
- Experience with object-oriented scripting languages: Python development, Python unit testing (e.g., with tox), and PySpark
- Experience with Node.js
- Up-to-date on latest industry trends; able to articulate trends and potential clearly and confidently
- Understanding of best practices within the development process
- Experience in providing clients with reports, dashboards, and analytics
- Strong organizational skills and the ability to manage a diverse workload in a fast-paced environment
What we offer:
- Competitive wages.
- Medical, dental and vision benefits after 90 days.
- Free lunch every two weeks.