We are a cloud-native cybersecurity and network analytics company based in Atlanta, GA, focused on providing cutting-edge security insights through advanced data engineering. With a fully remote, highly collaborative environment, we empower our engineers to innovate and solve complex challenges in cloud-scale cybersecurity analytics. ● Hands-on experience with AWS services like S3, Glue, Athena, EMR, Redshift, Lambda. ● Familiarity with CI/CD, containerization (Docker), and infrastructure-as-code (Terraform, CloudFormation). ● Fully remote with a modern, cloud-native environment.
The Company's three core products are Snapchat, a visual messaging app that enhances your relationships with friends, family, and the world; Lens Studio, an augmented reality platform that powers AR across Snapchat and other services; and its AR glasses, Spectacles. Ability to access, analyze, interpret, and communicate ads performance insights leveraging a wide range of standard data science tooling. Proficiency in advanced analytical tools (e.g. SQL, R, Python). A deep understanding of applied statistics, including sampling approaches, causal modeling, time series analysis, and data-mining techniques. Experience in marketing science, product strategy, ads measurement, or a related field.
Manager, Quality Engineer AI/ML. As a Manager, Quality Engineer specializing in artificial intelligence (AI) and machine learning (ML) technologies, you will actively engage in your AI/ML craft, taking a hands-on approach to multiple high-visibility projects. Foster a collaborative environment that enhances team synergy. Technical Proficiency: Possess deep expertise in modern software engineering practices and principles, including AI/ML/GenAI, Agile methodologies, and DevSecOps, to deliver daily product deployments using full automation from code check-in to production with all quality checks across the SDLC. Understanding of methodologies and tools such as XP, Lean, SAFe, DevSecOps, ADO, GitHub, SonarQube, etc. 6+ years of engineering experience focused on AI/ML. 3+ years of experience in Python, R, TensorFlow, PyTorch, Keras, Julia, ML libraries, NLP, etc. 2+ years of recent experience in GenAI utilizing technologies such as OpenAI, Claude, Gemini, LangChain, agents, and vector databases, and approaches like prompt engineering, fine-tuning, etc.
TR Labs innovates collaboratively across our core segments in Legal, Tax & Accounting, Government, and Reuters News. As a Lead Research Engineer at Thomson Reuters Labs, you will be part of a global interdisciplinary team of experts. About You: You are a fit for the Lead Research Engineer role if your background includes: A Bachelor's Degree in Computer Science or a related field. Familiarity with the Python data science stack through exposure to libraries such as NumPy, SciPy, Pandas, Dask, spaCy, NLTK, and scikit-learn. Experience integrating machine learning solutions into production-grade software, with a sound understanding of ModelOps and MLOps principles and the ability to translate between the language and methodologies used in both research and engineering fields. Previous exposure to Natural Language Processing (NLP) problems and familiarity with key tasks such as Named Entity Recognition (NER), Information Extraction, Information Retrieval, etc. A track record of successfully taking machine learning solutions to production-grade software. Hands-on experience in other programming and scripting languages (Java, TypeScript, JavaScript, etc.).
Good knowledge of ML & AI algorithms and their deployment using Python. Implementation of data analysis and machine learning techniques. Strong coding skills in Python and SQL. Experience with large language models (LLMs) and deep learning. Master's or PhD degree in Computer Science or Electrical Engineering.
The Data Developer Specialist is responsible for the development, testing, and documentation of complex data transformations and services to enable the analysis of information to support strategic initiatives and ongoing business requirements. The role will develop data transformation solutions using Informatica PowerCenter, Data Services, shell scripts, and advanced SQL. This position will also assist in database performance tuning, including analyzing and tuning complex SQL statements. The Data Developer Specialist will lead the data transformation solution lifecycle, transformation standards, best practices, and procedures. This role will also act as a mentor to fellow junior Data Developers in helping to create a sustainable data warehouse ecosystem. Informatica PowerCenter (5+ years).
Additionally, you will collaborate on setting best practices for data integration, security, and governance in Palantir-based manufacturing applications. Collaborate on setting best practices for data management, security, and governance in Palantir environments. Troubleshoot and resolve issues with Palantir data platforms and systems. Support and maintain the full lifecycle of Palantir's data tools and related services. Minimum of 3 years' experience with artificial intelligence, data science, machine learning engineering, or data analytics.
Bachelor's degree in Robotics, Computer Vision, Machine Learning, Artificial Intelligence, Computer Science, or Computer Engineering, or a field that requires a strong mathematical and/or engineering foundation (e.g. physics, aerospace engineering). Master's or PhD in Robotics, Computer Vision, Machine Learning, Artificial Intelligence, Computer Science, or Computer Engineering, or a field that requires a strong mathematical and/or engineering foundation (e.g. physics, aerospace engineering). Experience with Python, machine learning software frameworks, and GPU programming in CUDA or related higher-level languages. Neurodivergence, for example: attention-deficit/hyperactivity disorder (ADHD), autism spectrum disorder, dyslexia, dyspraxia, other learning disabilities. Traumatic brain injury.
One (1) or more years of experience with Microsoft Azure, preferred. Experience with artificial intelligence / machine learning, preferred. Microsoft Azure DevOps or GitHub. Microsoft SQL Reporting Services and/or Power BI. The annual allocation to the ESOP is fully funded by BDO through investments in company stock and grants employees the chance to grow their wealth over time as their shares vest and grow in value with the firm’s success, with no employee contributions.
Role Summary: As a Full Stack AI Engineer, you will architect, build, and deploy next-generation AI applications powered by the latest advancements in LLMs (GPT-4, Gemini, Claude, Llama, Mistral), autonomous agents, and multimodal generative systems. Performance Optimization – Continuously refine model efficiency, latency, and accuracy using quantization, distillation, and cutting-edge inference techniques. A Technical Polymath – Proficient in Python, SQL, PyTorch/TensorFlow, and big data tools (PySpark, Pandas, Polars) with hands-on cloud experience (AWS SageMaker, Azure AI, GCP Vertex). Experience with agile development methodologies and tools such as Azure DevOps or Git. Experience building Delta Live Tables pipelines in Databricks as well as working in serverless environments.
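The performance-optimization bullet above names quantization as one of the techniques. A minimal sketch of the underlying idea, symmetric linear int8 quantization, is below; the values and function names are made up for illustration, and framework APIs (e.g. PyTorch's post-training quantization tooling) wrap the same scale-and-round arithmetic with far more care.

```python
def quantize(values, num_bits=8):
    """Map floats onto signed integer levels; returns (ints, scale)."""
    qmax = 2 ** (num_bits - 1) - 1           # 127 for int8
    max_abs = max(abs(v) for v in values) or 1.0
    scale = max_abs / qmax                   # symmetric quantization, zero-point 0
    return [round(v / scale) for v in values], scale

def dequantize(q, scale):
    """Recover approximate floats; round-trip error is at most scale/2 per value."""
    return [v * scale for v in q]

# Hypothetical weight values standing in for a model tensor.
weights = [0.42, -1.27, 0.08, 0.91]
q, scale = quantize(weights)
approx = dequantize(q, scale)
```

Storing `q` as int8 cuts memory 4x versus float32, which is the latency/efficiency lever the bullet alludes to; the accuracy cost is the bounded rounding error.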
A great product, amazing people, and our stable financial history have made us one of the largest used car finance companies nationally. Recognize upstream/downstream impacts; communicate effectively verbally and in writing. Requires a bachelor's degree in Statistics, Mathematics, or a closely related field and coursework in Database Systems, Statistical Computing, Object-Oriented Programming & Design, Artificial Intelligence, Data Science, and Regression Analysis. Headquarters (HQ) in Southfield, MI; telecommuting permitted. Required degrees must have been earned at institutions of higher education which are accredited by the Council for Higher Education Accreditation or equivalent.
Must have a bachelor's degree in Computer Science, Technology, Computer Information Systems, Computer Applications, Engineering, or a related field, plus 5 years of progressive post-baccalaureate experience in the IT consulting industry. Coding and deploying applications using DevOps, utilizing GitLab, AWS, and Splunk. Developing and maintaining applications and extract, transform, and load (ETL) processes leveraging Scala, Python, PySpark, AWS, and shell. Working with SQL databases, including PostgreSQL and Redshift. Creating prototype solutions, and producing and maintaining data for machine learning and artificial intelligence applications.
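The extract, transform, and load (ETL) work described above follows a standard three-step shape. A minimal self-contained sketch is below; the feed, table, and column names are invented for illustration, and an in-memory SQLite database stands in for the PostgreSQL/Redshift targets the posting names.

```python
import csv
import io
import sqlite3

# Hypothetical source feed standing in for an upstream extract.
raw = """order_id,amount
1,19.99
2,5.00
3,42.50
"""

# Extract: parse the delimited source into records.
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: cast types and apply a business rule (drop small orders).
cleaned = [(int(r["order_id"]), float(r["amount"]))
           for r in rows if float(r["amount"]) >= 10.0]

# Load: write the cleaned records into a SQL table and verify.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")
con.executemany("INSERT INTO orders VALUES (?, ?)", cleaned)
total = con.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
```

At the scale implied by PySpark and Redshift the same extract/transform/load split holds; only the engines change.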
Embracing increased ambiguity, you are comfortable when the path forward isn’t clear; you ask questions, and you use these moments as opportunities to grow. Common LLM development frameworks (e.g., Langchain, Semantic Kernel), relational storage (SQL), non-relational storage (NoSQL). Experience in analytical techniques such as machine learning, deep learning, and optimization. Understanding of or hands-on experience with Azure, AWS, and/or Google Cloud platforms. For only those qualified applicants that are impacted by the Los Angeles County Fair Chance Ordinance for Employers, the Los Angeles Fair Chance Initiative for Hiring Ordinance, the San Francisco Fair Chance Ordinance, the San Diego County Fair Chance Ordinance, and the California Fair Chance Act, where applicable, arrest or conviction records will be considered for employment in accordance with these laws.
Bachelor’s Degree in Mechanical/Mechatronics Engineering, Electrical Engineering, or a Computer Science related field. 1+ year of experience in query languages: SQL, HiveQL, Google BigQuery, or Spark. 2+ years of experience in query languages: SQL, HiveQL, Google BigQuery, or Spark. Ability to ingest large data sets from a variety of sources (SQL, Python, Java, Hadoop, Alteryx, Qlik Sense, etc.). Family building benefits including adoption and surrogacy expense reimbursement, fertility treatments, and more.
We have assignments available to help our insurance carrier clients in Pricing Actuary or Associate Pricing Actuary positions. Develop and refine pricing models to assess risk and profitability for various Property & Casualty insurance products. Ability to extract and manipulate data in relational databases: SAS, SQL, R, Prophet, VBA, Python, or similar preferred. ACAS or FCAS designation preferred but not required. Proficiency in predictive modeling and machine learning techniques.
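The P&C pricing models described above typically start from a pure-premium (frequency times severity) estimate, loaded for expenses and profit. A minimal sketch is below; all figures and the default expense/profit loads are made up for illustration, and real filings involve trending, credibility, and segmentation on top of this.

```python
def pure_premium(claim_count, exposure_years, total_losses):
    """Expected loss cost per unit of exposure: frequency * severity."""
    frequency = claim_count / exposure_years   # claims per exposure year
    severity = total_losses / claim_count      # average cost per claim
    return frequency * severity

def gross_rate(pure_prem, expense_ratio=0.25, profit_load=0.05):
    """Load the pure premium for expenses and profit."""
    permissible_loss_ratio = 1.0 - expense_ratio - profit_load
    return pure_prem / permissible_loss_ratio

# Hypothetical book: 120 claims over 10,000 car-years costing $600,000.
pp = pure_premium(claim_count=120, exposure_years=10_000, total_losses=600_000)
rate = gross_rate(pp)   # indicated rate per exposure year
```

Here frequency is 1.2%, severity is $5,000, so the pure premium is $60 and the loaded rate is about $85.71 per car-year.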
o Orchestrate complex data pipelines using data integration tools like Informatica and Python, ensuring seamless data flow from various sources. o Leverage GCP Dataflow, Cloud Functions, and other cloud technologies to build scalable and resilient data ingestion and processing pipelines. o Document best practices and quality standards to be adhered to during development of data science solutions. o Work with diverse data sources, including relational databases (Oracle, SQL Server, MySQL, Postgres, Snowflake), big data platforms (Hadoop, Parquet files, BigQuery, BigLake managed Iceberg), and streaming data (Kafka, GCP Dataflow/Dataproc). o Explore and implement Vertex AI models to generate quick insights and support business requirements.
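Pipeline orchestration, as described in the bullets above, reduces to executing steps in dependency order over a DAG. A minimal sketch using only the standard library is below; the step names are hypothetical, and production tools (Informatica, Airflow/Cloud Composer) layer scheduling, retries, and monitoring on the same model.

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each step maps to the steps it depends on.
pipeline = {
    "extract_orders": [],
    "extract_customers": [],
    "transform_join": ["extract_orders", "extract_customers"],
    "load_warehouse": ["transform_join"],
}

def run(dag):
    """Execute steps so every dependency finishes before its dependents."""
    executed = []
    for step in TopologicalSorter(dag).static_order():
        executed.append(step)   # a real runner would invoke the step's task here
    return executed

order = run(pipeline)
```

`static_order` guarantees the join runs only after both extracts and the load runs last, which is exactly the "seamless data flow" contract an orchestrator enforces.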
Work on the GCP journey or other ML Engineering / MLOps / Generative AI (LLM) tasks, focusing on AI/ML platform adoption and AI/ML democratization. Work closely with the Tech Anchor, Product Manager, and Product Owner to deliver machine learning use cases using Agile methodology. Experience with Python and with machine learning algorithms and libraries (PyTorch, NLP, etc.). Experience using orchestration tools like Airflow and knowledge of infrastructure as code (Terraform). The Stefanini Group is a global provider of offshore, onshore, and nearshore outsourcing, IT digital consulting, systems integration, application, and strategic staffing services to Fortune 1000 enterprises around the world.
Since 2010, SynergisticIT has helped thousands of candidates find jobs with major tech companies like Apple, Google, PayPal, Western Union, Client, Visa, Walmart Labs, and more. SynergisticIT at the Gartner Data & Analytics Summit. Familiarity with Spring Boot, Microservices, and REST APIs. Knowledge of statistics, Python, and data visualization tools. For Data Science/Machine Learning roles: NLP, text mining, Tableau, and time series analysis.
Minimum 3 years of Big Data and Big Data tools in one or more of the following: batch processing (e.g. Hadoop distributions, Spark), real-time processing (e.g. Kafka, Flink/Spark Streaming). Strong knowledge of Databricks SQL/Scala data engineering pipelines. Strong familiarity with batch processing and workflow tools such as dbt, Airflow, and NiFi. StockX is proud to be a Detroit-based technology leader focused on the large and growing online market for sneakers, apparel, accessories, electronics, collectibles, trading cards, and more. The StockX platform features hundreds of brands across verticals including Jordan Brand, adidas, Nike, Supreme, BAPE, Off-White, Louis Vuitton, and Gucci; collectibles from brands including LEGO, KAWS, Bearbrick, and Pop Mart; and electronics from industry-leading manufacturers Sony, Microsoft, Meta, and Apple.
This role requires a strong blend of technical knowledge, sales acumen, and a visionary approach to artificial intelligence. Bachelor's degree in Computer Science, Data Science, Business, or related field (Master's or MBA preferred). 7+ years of experience in AI consulting, AI strategy, or technology sales, with a focus on AI-driven solutions. Strong technical expertise in artificial intelligence technologies, machine learning, and their applications in business. Excellent communication and public speaking skills with a proven ability to create and deliver engaging thought leadership content.