Collaborate on data workflows and platforms, including BigQuery and Airflow, ensuring backend systems scale alongside a growing data platform and data lake. Experience working with data platforms and pipelines, including data warehousing, Airflow, and SQL/NoSQL databases, is required.
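As a rough sketch of the Airflow-to-BigQuery pattern this role touches on (table and column names here are hypothetical; the only API detail assumed is BigQuery's `table$YYYYMMDD` partition decorator), a daily load that targets exactly one partition stays idempotent across retries and backfills:

```python
from datetime import date

def partition_target(table: str, run_date: date) -> str:
    """BigQuery partition decorator (`table$YYYYMMDD`): writing to this
    target with WRITE_TRUNCATE replaces exactly one daily partition, so a
    retried or backfilled Airflow task produces the same result each time."""
    return f"{table}${run_date.strftime('%Y%m%d')}"

def build_load_query(source_table: str, run_date: date) -> str:
    # Hypothetical table/column names: select one day of events to load.
    return (
        f"SELECT * FROM `{source_table}` "
        f"WHERE DATE(event_ts) = DATE '{run_date.isoformat()}'"
    )
```

A task templated on the DAG's logical date can then rerun any historical day without double-counting rows.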
Strong programming fundamentals, including data structures, algorithms, object-oriented programming, and design patterns. Design, develop, refactor, test, and maintain new features and improve the existing code base using Java/J2EE, JavaScript, SQL, HTML, CSS, Ext JS, React, JUnit, and Eclipse.
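To illustrate the design-pattern fundamentals this posting asks for, here is a minimal strategy-pattern sketch (in Python rather than the posting's Java, purely for brevity; class names are invented): the ordering policy is injected, so new orderings require no change to the consumer.

```python
from abc import ABC, abstractmethod

class SortStrategy(ABC):
    @abstractmethod
    def sort(self, items):
        ...

class CaseInsensitive(SortStrategy):
    def sort(self, items):
        return sorted(items, key=str.lower)

class ByLength(SortStrategy):
    def sort(self, items):
        return sorted(items, key=len)

class Report:
    # The strategy is injected, so adding a new ordering never touches Report.
    def __init__(self, strategy: SortStrategy):
        self._strategy = strategy

    def render(self, items):
        return ", ".join(self._strategy.sort(items))
```

The same shape maps directly onto Java interfaces and constructor injection.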
Knowledge of CI/CD principles and experience building deployment pipelines. Our differentiated architecture seamlessly integrates hardware, software, and system-level technologies to maximize the efficiency of GPU-, CPU-, and accelerator-based compute clusters at any scale.
The ideal candidate will be proficient in Java and the J2EE stack, experienced in AWS cloud services, and skilled in CI/CD pipelines and software design patterns. We bring together healthcare providers, manufacturers, and distributors in North America and Europe who rely on smart, secure healthcare.
We are able to build high-quality datasets at petabyte-scale and low cost through a tight integration of infrastructure, engineering, and research work. Collaborate with others on the AI Team and Speechify Leadership to craft the AI Team’s dataset roadmap to power Speechify’s next-generation con
A leading energy transformation firm is seeking a Lead Java Software and Data Integration Engineer to join their SaaS team. This remote position involves designing and developing data integration solutions and APIs for enterprise systems. Ideal candidates will have 8+ years of software development experience.
As a software engineer on the team, you'll collaborate on an end-to-end data analytics and MLOps solution composed of popular open-source machine learning tools such as Kubeflow, MLflow, DVC, and Feast. Python and Kubernetes specialist engineers focus on data, AI/ML, and analytics solutions.
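To make the experiment-tracking side of such an MLOps stack concrete, here is a tiny in-memory sketch of the pattern tools like MLflow implement (this is not MLflow's API; the real tool persists runs to a tracking store, and all names below are illustrative):

```python
import uuid

class RunTracker:
    """Minimal in-memory sketch of experiment tracking: each run gets an id,
    params are recorded once, metrics are append-only."""
    def __init__(self):
        self.runs = {}

    def start_run(self, name: str) -> str:
        run_id = uuid.uuid4().hex
        self.runs[run_id] = {"name": name, "params": {}, "metrics": {}}
        return run_id

    def log_param(self, run_id, key, value):
        self.runs[run_id]["params"][key] = value

    def log_metric(self, run_id, key, value):
        # Append-only so a full training curve can be reconstructed later.
        self.runs[run_id]["metrics"].setdefault(key, []).append(value)

    def best_metric(self, run_id, key):
        return min(self.runs[run_id]["metrics"][key])
```

The append-only metric log is the key design choice: it lets dashboards replay training curves rather than only showing final values.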
The ideal candidate will have extensive Python and cybersecurity experience to build and secure data pipelines for a threat intelligence platform. Responsibilities include developing ETL processes, setting up lab pipelines, and collaborating with various teams. Additional benefits include medical insurance.
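A minimal sketch of the kind of ETL step this role describes, assuming a threat-intel feed of indicator-of-compromise (IOC) records (field names are hypothetical): normalize the indicators, then load them into a deduplicating store.

```python
import sqlite3

def transform(raw_events):
    """Normalize raw indicator records: trim and lowercase the IOC,
    drop entries that have none."""
    return [
        {"ioc": e["ioc"].strip().lower(), "source": e.get("source", "unknown")}
        for e in raw_events
        if e.get("ioc")
    ]

def load(events, conn):
    # PRIMARY KEY + INSERT OR IGNORE deduplicates indicators across feeds.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS iocs (ioc TEXT PRIMARY KEY, source TEXT)"
    )
    conn.executemany(
        "INSERT OR IGNORE INTO iocs (ioc, source) VALUES (:ioc, :source)",
        events,
    )
    conn.commit()
```

SQLite stands in here for whatever warehouse the platform actually uses; the normalize-then-dedupe shape is the transferable part.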
A tech company in San Francisco is looking for a Software Engineer specializing in Dev Productivity. In this role, you will streamline engineering workflows, optimize CI/CD systems, and enhance developer tooling. The ideal candidate has strong experience with CI/CD tools, build systems, and scripting.
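One classic dev-productivity optimization this role hints at is content-addressed build caching, sketched here with stdlib Python (function names are illustrative): hash the inputs, skip the build on a cache hit.

```python
import hashlib

def cache_key(source_files) -> str:
    """Deterministic key over file contents, so CI can skip rebuilds when
    nothing changed. Sorting by path makes the key order-independent."""
    digest = hashlib.sha256()
    for path in sorted(source_files):
        digest.update(path.encode())
        digest.update(source_files[path])
    return digest.hexdigest()

def build_if_needed(source_files, cache, build):
    key = cache_key(source_files)
    if key not in cache:
        cache[key] = build(source_files)  # cache miss: run the real build
    return cache[key]
```

Real systems (Bazel, sccache, and similar) apply the same idea per compilation unit rather than per repository.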
A leading open-source software provider is seeking a Software Engineer to develop, test, and release improvements to the Ubuntu Pro client. The ideal candidate will be proficient in Python and possess a Bachelor's degree in Computer Science or a related field. This role offers a distributed work environment.
Architect and implement API-driven interfaces and ETL/data pipelines using Java, Spring Boot, and integration frameworks. Design and implement parallel and batch processing of large data sets, applying proven integration patterns and performance-optimization techniques.
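The parallel batch-processing task described above can be sketched as follows (Python's `concurrent.futures` standing in for whatever executor framework the Java stack uses; the batching-plus-ordered-fan-out shape is the point):

```python
from concurrent.futures import ThreadPoolExecutor

def batched(items, size):
    # Split a large dataset into fixed-size batches.
    return [items[i:i + size] for i in range(0, len(items), size)]

def process_in_parallel(items, batch_size, worker, max_workers=4):
    """Fan batches out to a thread pool while preserving input order.
    `worker` receives one batch and returns a list of results."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        out = []
        # Executor.map yields results in submission order, so the output
        # lines up with the input even though batches finish out of order.
        for batch_result in pool.map(worker, batched(items, batch_size)):
            out.extend(batch_result)
    return out
```

Choosing batch size is the usual tuning knob: large enough to amortize per-task overhead, small enough to keep all workers busy.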
Support and implement branding initiatives, including logo updates, digital brand asset review and development, and merchandise rollouts; help ensure cohesive brand use by following CDSS and Berkeley brand guidelines; and facilitate promotional print materials and branded merchandise for CDSS.
A consulting and IT services provider is looking for a Mid-Senior Level Data Engineer to design and architect end-to-end data solutions. The ideal candidate should have over 10 years of experience, strong expertise in Matillion ETL/ELT, and deep knowledge of Snowflake and data governance.
You will design and implement libraries that ensure efficiency and ease of use, deeply engaging with the developer experience. A strong background in Python and experience with SDKs and open-source libraries are essential for success in this dynamic environment.
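As a sketch of the SDK ergonomics this role cares about (the client class and endpoint paths are hypothetical), a context manager guarantees cleanup and keyword-only options keep call sites readable:

```python
class ApiClient:
    """Hypothetical SDK surface: context-managed lifecycle, keyword-only
    configuration, and a small URL-building helper."""
    def __init__(self, base_url: str, *, timeout: float = 10.0):
        self.base_url = base_url.rstrip("/")
        self.timeout = timeout
        self.closed = False

    def url_for(self, resource: str) -> str:
        # Tolerate stray slashes so callers can't build "//v1/items".
        return f"{self.base_url}/{resource.lstrip('/')}"

    def close(self):
        self.closed = True

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc, tb):
        self.close()  # release resources even when the body raised
        return False
```

Making `timeout` keyword-only is a small API-design choice that keeps positional call sites from silently misassigning options as the signature grows.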
Govern standards for data modeling, data quality, metadata, lineage, MDM, and integration patterns. A deep understanding of Azure cloud, data-engineering best practices, warehousing/lakehouse architecture, MDM, and metadata governance is required. Ensure SLAs, incident response, and change management.
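The data-quality governance mentioned above often boils down to automated checks at pipeline boundaries. A tiny illustrative gate (column names are invented; production stacks typically use dedicated tools such as Great Expectations):

```python
def check_quality(rows, required, unique_key):
    """Report missing required fields and duplicate keys in a batch of rows.
    Returns a list of human-readable issue strings; empty means the batch
    passes the gate."""
    issues = []
    seen = set()
    for i, row in enumerate(rows):
        for col in required:
            if row.get(col) in (None, ""):
                issues.append(f"row {i}: missing {col}")
        key = row.get(unique_key)
        if key in seen:
            issues.append(f"row {i}: duplicate {unique_key}={key}")
        seen.add(key)
    return issues
```

Running such a gate before loading, and failing the run on any issue, is one simple way to turn a written data-quality standard into an enforced one.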