TITLE: Data Engineer (Mid-level)
REQUIREMENTS: Master’s Degree in Computer Science, Computer Engineering, Mathematics, or a related field, plus 2 years of related experience with each of the following:
- Python: coding and scripting with a focus on data engineering
- SQL (complex queries): data analysis
And 1 year of experience with each of the following:
- AWS (including advanced features): AWS services
- ETL pipelines: building and maintaining ETL processes
- Data acquisition and processing: managing data acquisition and processing, and improving data reliability
- Amazon QuickSight and reporting: creating specialized reports
- Agile methodologies (including source control, testing, and CI/CD)
JOB DUTIES:
- Manage intricate data pipelines using advanced tools including Python, Elasticsearch, MySQL, MongoDB, and AWS.
- Ensure optimal performance of data pipelines, providing critical information for strategic assessment and development prioritization.
- Enhance data pipelines for scalability to meet unique business and technical requirements.
- Design and construct reports presenting key NLP performance metrics for business insights.
- Apply a deep understanding of complex business requirements to programming, analysis, and system logic.
- Collaborate with various departments to ensure work aligns with standards, norms, and goals.
- Provide solutions to programming issues across functional and technological areas.
- Lead the design and deployment of new enterprise systems, ensuring compatibility and interoperability.
- Develop infrastructure to support advanced data analysis.
- Deliver high-standard software solutions in a collaborative team environment.
SALARY: $130,000 to $150,000 per year.
CONTACT: Send resume to:
Ecotrak Facility Management Software
http://ecotrak.com/careers
Jobsite: Irvine, CA (Full-Time Position)
100% telecommuting is permitted.