About Veloce Energy
Veloce is a fast-growing leader in the electrification of transportation and the transformation of the grid edge. We are dedicated to and passionate about the electrification of everything as a means to fight climate change and air pollution while creating a just transition that empowers everyone. While we each bring our own strengths, we work collaboratively toward our shared goals. Our co-founders have helped shape the solar, storage, e-Mobility, and retail energy industries. We are seeking diverse, passionate, driven people to lead us forward at all levels. Join us if you want to shape a just, clean energy future.
Responsibilities
• Collaborate with the data engineering team to design, develop, and implement scalable data solutions on the AWS platform.
• Build and maintain data pipelines, ensuring smooth data flow from various sources into AWS data storage systems.
• Implement data transformation processes to cleanse, validate, and enrich data using AWS Glue, AWS Lambda, or other suitable tools.
• Develop and maintain ETL (Extract, Transform, Load) processes using AWS services like AWS Glue, AWS Data Pipeline, or AWS Step Functions.
• Optimize data storage and retrieval performance by utilizing AWS services such as Amazon S3, Amazon Redshift, or Amazon DynamoDB.
• Collaborate with data scientists and analysts to enable efficient data access, retrieval, analysis, and machine learning model deployment in AWS.
• Implement and maintain data security and access controls in compliance with industry best practices and company policies.
• Utilize AWS MLOps tools such as Amazon SageMaker, AWS Glue, AWS Step Functions, and AWS Lambda to automate and streamline machine learning workflows.
• Monitor and troubleshoot data pipelines, infrastructure, and machine learning workflows to ensure data quality, system reliability, and model performance.
• Stay up to date with the latest AWS tools, technologies, and best practices in data engineering and MLOps, and actively explore opportunities for leveraging new capabilities to enhance data engineering and machine learning processes.
• Document technical specifications, data flow diagrams, machine learning workflows, and other relevant documentation to ensure knowledge sharing and maintainable systems.
Qualifications
• Bachelor’s degree in computer science, information systems, or a related field.
• Solid understanding of data engineering principles and concepts.
• Strong knowledge of AWS services and tools, including AWS Glue, AWS Lambda, Amazon S3, Amazon Redshift, and AWS Data Pipeline.
• Experience with ETL processes and data transformation using AWS Glue, AWS Data Pipeline, or similar technologies.
• Proficiency in programming languages such as Python, including experience with data manipulation, transformation, and machine learning libraries (e.g., Pandas, NumPy, scikit-learn, TensorFlow, PyTorch). C++ experience is a plus.
• Working knowledge of SQL and database technologies.
• Familiarity with data modeling and database design principles.
• Strong analytical and problem-solving skills, with attention to detail.
• Excellent communication and collaboration skills to work effectively in a team environment.
• AWS certifications (e.g., AWS Certified Big Data – Specialty, AWS Certified Machine Learning – Specialty) are a plus.
• Experience with AWS MLOps tools such as Amazon SageMaker, AWS Glue, AWS Step Functions, or similar tools is a plus.
$75,000 – $85,000 a year
Diversity, Equity, and Inclusion
We know that a diverse, equitable, and inclusive workplace will make Veloce a stronger and more flexible organization, better able to create technological and social change. We believe diversity in age, gender identity, race, sexual orientation, physical and mental ability, ethnicity, and perspective all drive innovation. So we’re building a culture where difference is valued and creating an environment where everyone, from any background, can do their best work. We are rapidly growing our company and will help people manage their careers while we create the future together.