15+ years' experience with a primary focus in an executive Data & Analytics leadership role such as Chief Data Officer (CDO), Chief Analytics Officer (CAO), Chief Data Scientist, Data Scientist, or VP / Sr. VP of Business Intelligence and/or Analytics / Business Analytics within a Fortune 500 company. CDO Focus: Advising on enterprise-wide data and information strategy, governance, control, policy development, and data optimization. Information Management Focus: Leadership perspective on Business Intelligence, Analytics, and Master Data Management approaches in key industry verticals. Data Science Focus: Artificial Intelligence, Generative AI, Large Language Models, Natural Language Processing (NLP), Machine Learning, Conceptual Modeling, Statistical Analysis, Internet of Things (IoT), AI / BOTS. Experience in one or more of the following application disciplines is also desired: Agile / Bimodal IT, DevOps, Lean Development, Mobile app development, Cloud app deployment, Integration Strategy and API Management, App Maintenance Methods and Outsourcing, App Portfolio Management
Develop machine learning models for data analysis and automation, including building RAG models for AWS Bedrock. Strong programming skills in Python, Scala, and/or UNIX shell scripting. Experience with big data platforms such as Hadoop, Spark, and Kafka. Experience with data visualization tools (e.g., Tableau, Power BI). Pay Range: There are a host of factors that can influence final salary including, but not limited to, geographic location, Federal Government contract labor categories and contract wage rates, relevant prior work experience, specific skills and competencies, education, and certifications.
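The retrieval step at the heart of a RAG pipeline like the one this posting describes can be sketched in pure Python. This is a minimal illustration only: the bag-of-words "embedding", the document strings, and the query are all hypothetical stand-ins, and a production system would use a real embedding model and send the assembled prompt to a Bedrock-hosted LLM rather than stopping at prompt construction.

```python
import math
import re
from collections import Counter

def embed(text):
    # Toy bag-of-words "embedding"; a real RAG system would call an embedding model.
    return Counter(re.findall(r"[a-z]+", text.lower()))

def cosine(a, b):
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, documents, k=2):
    # Rank documents by similarity to the query and return the top k.
    q = embed(query)
    ranked = sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

# Hypothetical knowledge-base snippets.
docs = [
    "Spark jobs read events from Kafka topics.",
    "Tableau dashboards visualize warehouse KPIs.",
    "Kafka streams feed the Spark aggregation layer.",
]

context = retrieve("How does Kafka connect to Spark?", docs)
prompt = "Answer using only this context:\n" + "\n".join(context)
# A real pipeline would now send `prompt` to a Bedrock model for generation.
print(prompt)
```

The design point is that retrieval quality, not the generation call, is usually where a RAG system succeeds or fails; the generation step is deliberately left as a comment here.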
We are seeking an experienced Artificial Intelligence (AI) Tester to join our team. Skilled in AI development frameworks such as TensorFlow, PyTorch, scikit-learn, and/or Keras. Understanding of and experience in machine learning and artificial intelligence testing. Paid Time Off, Paid Holidays, Paid Leave (e.g., Maternity, Paternity, Jury Duty, Bereavement, Military). Corporate Fellowship: Opportunities to participate in company sports teams and employee-led interest groups for personal and professional development.
Bachelor's or Master's degree, or equivalent experience, and a demonstrated foundation in statistics and data science (e.g., Machine Learning, Predictive Analytics, etc.). Skilled with analytic tools ranging from relational databases and SQL to Excel and Python/R. Experience with product analytics tools like Amplitude, Contentsquare, Adobe Analytics, etc. Experience working with on-premise and/or cloud analytics environments like Hadoop, AWS, Snowflake, etc. Experience with data visualization and enablement tools like Tableau, Power BI, etc.
PySpark Developer, Amazon Web Services (AWS), Cloud Computing, Big Data, Apache Spark, Python, SQL, NoSQL, DB2, PostgreSQL, Snowflake. Very strong hands-on experience with PySpark, Apache Spark, and Python. Strong hands-on experience with SQL and NoSQL databases (DB2, PostgreSQL, Snowflake, etc.). DBT, AWS Astronomer. Experience with SQL and NoSQL databases (DB2, PostgreSQL, Snowflake, etc.)
Utilize programming languages like Python, SQL, and NoSQL databases, along with Apache Spark for data processing, dbt for data transformation, container orchestration services such as Docker and Kubernetes, and various Azure tools and services. Advanced programming and big data experience with Python, SQL, dbt, Spark, Kafka, Git, and containerization (Docker and Kubernetes). Advanced experience with data warehouses (Snowflake preferred), dimensional modeling, and analytics. Advanced understanding of DevOps concepts, including the Azure DevOps framework and tools. 2+ years of experience with big data tools like Spark and Databricks
Cultivate Strategic Partnerships: Forge strong relationships with key sponsors and cross-functional teams, managing expectations through proactive communication, influencing crucial decisions with data-driven insights, and leading high-stakes stakeholder meetings (e.g., TRR, IPT, TWG). 10+ years of progressive project/program management experience, with demonstrated success within the DOD, specifically in Navy, Army, or Air Force mission programs. Hands-on experience working with leading cloud platforms (e.g., AWS, Azure, GCP) and containerized environments (e.g., Docker, Kubernetes). A strong understanding of machine learning, deep learning, and AI model development. Understanding of AI/ML frameworks and libraries (e.g., TensorFlow, PyTorch, Keras).
Senior Machine Learning Engineer As a Capital One Machine Learning Engineer (MLE), you'll be part of an Agile team dedicated to productionizing machine learning applications and systems at scale. At least 2 years of on-the-job experience with industry-recognized ML frameworks such as scikit-learn, PyTorch, Dask, Spark, or TensorFlow. 1+ years of experience developing and deploying ML solutions in a public cloud such as AWS, Azure, or Google Cloud Platform. At this time, Capital One will not sponsor a new applicant for employment authorization, or offer any immigration-related support for this position (i.e., H-1B, F-1 OPT, F-1 STEM OPT, F-1 CPT, J-1, TN, or another type of work authorization). McLean, VA: $158,600 - $181,000 for Senior Machine Learning Engineer. New York, NY: $173,000 - $197,400 for Senior Machine Learning Engineer. Candidates hired to work in other locations will be subject to the pay range associated with that location, and the actual annualized salary amount offered to any candidate at the time of hire will be reflected solely in the candidate's offer letter. This role is also eligible to earn performance-based incentive compensation, which may include cash bonus(es) and/or long-term incentives (LTI).
We want someone who is passionate about our mission and who believes in the power of crypto and blockchain technology to update the financial system. As a Machine Learning Engineer on our team, you will play a pivotal role in this transformation, leveraging cutting-edge AI models, frameworks, and tools, as well as diverse and extensive data sources, including blockchain data. Your work will focus on lowering the barriers to cryptocurrency adoption while shaping the future of AI-powered interactions. Proficiency in one or more areas, including data mining, information retrieval, advanced statistics, natural language processing, or computer vision. Proficiency with GenAI frameworks/tools and technologies such as Apache Airflow, Spark, Flink, Kafka/Kinesis, Snowflake, and Databricks.
Fully Remote (onsite in Vinings, GA from Mon-Thurs if converted to FTE). We are seeking a Senior BI Analyst with expertise in Python, SQL, Tableau, and predictive analytics. You will work with large datasets across SAP and Snowflake environments to deliver strategic insights, with a strong focus on forecasting container arrivals, optimizing inventory levels, and identifying cost-saving opportunities, all in support of a major ERP modernization and the nationwide consolidation of distribution centers. System Migration: Lead data analysis and reporting for a major migration from a legacy ERP to SAP Warehouse Management, with reporting transitioning to Snowflake. Python Predictive Analytics: Perform predictive modeling in Python to forecast container arrivals, inventory levels, and other supply chain KPIs.
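The forecasting task this posting describes can be illustrated with a deliberately simple baseline. This is a sketch under stated assumptions: the arrival counts are invented, and a naive recursive moving average stands in for whatever statistical or ML model (e.g., via statsmodels or scikit-learn) the actual role would use.

```python
def moving_average_forecast(history, window=3, horizon=2):
    """Forecast the next `horizon` points as the mean of the last `window`
    observations, appending each forecast so later steps see earlier ones
    (a naive recursive moving-average baseline)."""
    series = list(history)
    for _ in range(horizon):
        series.append(sum(series[-window:]) / window)
    return series[len(history):]

# Hypothetical weekly container-arrival counts.
arrivals = [120, 130, 125, 140, 135]
forecast = moving_average_forecast(arrivals)
print(forecast)  # two forecasted weekly counts
```

A baseline like this is often kept alongside more sophisticated models so stakeholders can see how much lift the predictive work actually adds.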
Hadoop Developer / SME. Primarily responsible for developing batch and real-time ETL to create meaningful insights and analytics using the Hadoop ecosystem, Oracle, SQL Server, MySQL, Business Objects, SAS, MicroStrategy, and ETL tools such as Informatica, IDQ, etc. Extensive knowledge of Big Data and ETL technologies such as Druid, Flink, Kafka, Hadoop, Hive, MongoDB. Leads, influences, and persuades others to take a specific course of action when there is no direct line of control or command. Specializes in functional or technical areas such as, but not limited to, business process reengineering, physical and data security, Earned Value Management (EVM), Quality Assurance, Project Management, training, and Business Intelligence.
As a half-billion-dollar IT company with more than 9,000 professionals across 30+ offices, Collabera offers comprehensive, cost-effective IT staffing & IT services. We provide services to Fortune 500 and mid-size companies to meet their talent needs with high-quality IT resources through Staff Augmentation, Global Talent Management, Value Added Services through CLASS (Competency Leveraged Advanced Staffing & Solutions), Permanent Placement Services, and Vendor Management Programs. The main function of a Hadoop Developer is to design, develop, code, test, and debug new software or provide complex enhancements to existing software, using Hadoop familiarity to develop and support big data analytics. Qualifications: Bachelor's degree in a technical field such as computer science, computer engineering, or a related field required; 8+ years' experience required. Experience as a Software Developer using Hadoop technologies and interest in continuing to code and develop. Strong understanding of data modeling.
Experiment with and implement advanced LLM and Agentic AI technologies to enhance application capabilities. Ensure compliance with Duke Energy’s AI policies. Stay current with advancements in Gen AI, LLMs, Blockchain, and Quantum Computing. Knowledge of Natural Language Processing (NLP) and Computer Vision.
Assistant Professor of Information Systems - Equitable AI & Machine Learning. The person holding this position will be responsible for developing and teaching online and face-to-face undergraduate and graduate-level courses in AI, data analytics, information systems, and/or project management. The candidate will conduct research and write and obtain grants in AI/Data Analytics/Data Science/Information Systems, leading to publishing in refereed journals, presenting the research at local and national conferences, and serving as a member of professional organizations. Teaching assignments will include courses in our Masters program in Data Analytics and Visualization, doctoral and undergraduate courses in AI, Data Visualization, and Data Analytics, and, depending on background and qualifications, may include courses in computer literacy, management information systems, computer networks, cybersecurity, systems analysis and design, project management, supply chain management, and enterprise information systems (SAP). The successful candidate will have an earned doctorate (by August 2025) in Information Systems or related disciplines such as Decision Sciences and/or Management Sciences, with research experience in artificial intelligence.
Senior Research Scientist - Radio Frequency Machine Learning (RFML). Top Secret/SCI w/Poly. Strong digital signal processing skills and RF domain knowledge. Experience with machine learning frameworks (e.g., Keras, TensorFlow, PyTorch). This role requires 100% on-site work in College Park, MD or Laurel, MD
The Johns Hopkins Data Science and AI Institute (DSAI) is a new pan-institutional initiative at Johns Hopkins to advance artificial intelligence and its applications, in part through investments in the software engineering, data science, and machine learning space. Expert-level knowledge of multiple modern AI/ML, vision, NLP, bioinformatics, and/or mathematical or computational libraries. Demonstrated leadership and self-direction. For the AI/ML concentration, experience designing, developing, training, and applying state-of-the-art AI/ML models and/or generative AI to practical applications such as vision, NLP, bioinformatics, chemical discovery, medical diagnostics, robotics, anomaly detection, recommendation systems, and time series analysis. Areas of relevant expertise include design and development of statistical and mathematical models of data, data transformation, ETL and information extraction, data modeling of complex scientific datasets, architectures for computing with large datasets, distributed computational pipelines, and real-time data streaming architectures.
Integral Federal is looking to hire a Data Science Engineer to support a government customer as they continue to look for ways to enhance their layered approach to security through new state-of-the-art technologies, expanded use of existing and proven technologies, improved passenger identification techniques, and other developments that will continue to strengthen anti-terrorism capabilities. Capture data lineage and produce lineage reports to inform end users and customers how the data was sourced, transformed, and where it resides, extending data with third-party sources of information when needed. Implement data governance practices and procedures that serve as the guides for data science projects. Knowledge of programming languages (e.g., Java, Python, SQL) and skills with RHEL shell scripting. Previous experience with configuration management in data science and data analytics environments, as well as Microsoft SharePoint and Agile project delivery and work management tools like JIRA or similar.
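The lineage-capture duty described above can be illustrated with a small sketch: record each source-to-destination transformation as a structured step, then render a human-readable report. All dataset and step names here are hypothetical examples, not anything from the actual customer environment, and a real deployment would likely use a lineage tool or metadata catalog rather than hand-rolled classes.

```python
from dataclasses import dataclass, field

@dataclass
class LineageStep:
    # One hop in a dataset's history: where it came from, what was done, where it went.
    source: str
    transformation: str
    destination: str

@dataclass
class LineageReport:
    dataset: str
    steps: list = field(default_factory=list)

    def record(self, source, transformation, destination):
        self.steps.append(LineageStep(source, transformation, destination))

    def render(self):
        # Produce the end-user-facing lineage report described in the posting.
        lines = [f"Lineage for {self.dataset}:"]
        for s in self.steps:
            lines.append(f"  {s.source} --[{s.transformation}]--> {s.destination}")
        return "\n".join(lines)

# Hypothetical pipeline: raw manifests are cleaned, then enriched with a third-party feed.
report = LineageReport("curated.matches")
report.record("raw.manifests", "deduplicate + normalize names", "staging.manifests")
report.record("staging.manifests", "join with third-party identity feed", "curated.matches")
print(report.render())
```

Keeping each hop as a discrete record is what lets the report answer the two questions the posting emphasizes: how the data was sourced and transformed, and where it now resides.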
Experience in building modern and scalable REST-based microservices using Scala, preferably with Play as the MVC framework. Expertise with functional programming using Scala. Experience in implementing RESTful web services in Scala, Java, or similar languages. Experience/knowledge in big data using Scala Spark, ML, Kafka, and Elasticsearch will be a plus. About Company: TechSophy, founded in 2009, is a global product engineering company with offices in the US, Middle East, and India.
We are seeking an experienced Informatica Developer with deep expertise in big data technologies, cloud environments, and distributed computing. The ideal candidate will be highly proficient in Informatica PowerCenter, Apache Spark, PySpark, and cloud-based data platforms. Utilize workflow schedulers such as Apache Airflow. Strong development skills in Apache Spark, PySpark, and Python. Proficiency in SQL and NoSQL databases (DB2, PostgreSQL, Snowflake)
Big Data, dataflows, Artificial Intelligence / Machine Learning (AI/ML) familiarity, Analytics in GME, Jupyter notebooks, and Spark. Degree in Mathematics, Applied Mathematics, Statistics, Applied Statistics, Machine Learning, Data Science, Operations Research, or Computer Science. Data mining, advanced statistical analysis (e.g., statistical foundations of machine learning, statistical approaches to missing data, time series), advanced mathematical foundations (e.g., numerical methods, graph theory), artificial intelligence, workflow and reproducibility, data management and curation, data modeling and assessment (e.g., model selection, evaluation, and sensitivity). Employ some combination (2 or more) of the following areas: Foundations (Mathematical, Computational, Statistical); Data Processing (Data management and curation, data description and visualization, workflow, and reproducibility); Modeling, Inference, and Prediction (Data modeling and assessment, domain-specific considerations). Unlimited access to Red Hat Enterprise Linux, AWS, and NetApp training and accreditation