Data Scientist Jobs in Springdale, OH

- 111 Jobs
All
Data Scientist
Data Engineer
Senior Data Scientist
Data Science Internship
Pricing Actuary
  • GCP Data Engineer

    Tata Consultancy Services 4.3 company rating

    Data Scientist Job In Blue Ash, OH

    Job Type: Full-time
    Experience: 10+ years

    As an Advanced Data Engineer, you will have the opportunity to lead the development of innovative data solutions, enabling the effective use of data across the organization. You will be responsible for designing, building, and maintaining robust data pipelines and platforms to meet business objectives, treating data as a strategic asset. Your role will involve collaboration with cross-functional teams, leveraging cutting-edge technologies, and ensuring scalable, efficient, and secure data engineering practices. A strong emphasis will be placed on expertise in GCP, Vertex AI, and advanced feature engineering techniques.

    Qualifications:
    * 4+ years of professional data development experience.
    * 4+ years of experience with SQL and NoSQL technologies.
    * 3+ years of experience building and maintaining data pipelines and workflows.
    * 5+ years of experience developing with Java.
    * 2+ years of experience developing with Python.
    * 3+ years of experience developing Kafka solutions.
    * 2+ years of experience in feature engineering for machine learning pipelines.
    * Experience with GCP services such as BigQuery, Vertex AI Platform, Cloud Storage, AutoMLOps, and Dataflow.
    * Experience with CI/CD pipelines and processes.
    * Experience with automated unit, integration, and performance testing.
    * Experience with version control software such as Git.
    * Full understanding of ETL and data warehousing concepts.
    * Strong understanding of Agile principles (Scrum).

    Additional Qualifications:
    * Knowledge of structured streaming (Spark, Kafka, EventHub, or similar technologies).
    * Experience with GitHub SaaS/GitHub Actions.
    * Understanding of Databricks concepts.
    * Experience with PySpark and Spark development.

    Roles & Responsibilities:
    * Provide Technical Leadership: Offer technical leadership to ensure clarity between ongoing projects and facilitate collaboration across teams to solve complex data engineering challenges.
    * Build and Maintain Data Pipelines: Design, build, and maintain scalable, efficient, and reliable data pipelines to support data ingestion, transformation, and integration across diverse sources and destinations, using tools such as Kafka, Databricks, and similar toolsets.
    * Drive Digital Innovation: Leverage innovative technologies and approaches to modernize and extend core data assets, including SQL-based, NoSQL-based, cloud-based, and real-time streaming data platforms.
    * Implement Feature Engineering: Develop and manage feature engineering pipelines for machine learning workflows, utilizing tools like Vertex AI, BigQuery ML, and custom Python libraries.
    * Implement Automated Testing: Design and implement automated unit, integration, and performance testing frameworks to ensure data quality, reliability, and compliance with organizational standards.
    * Optimize Data Workflows: Optimize data workflows for performance, cost efficiency, and scalability across large datasets and complex environments.
    * Mentor Team Members: Mentor team members in data principles, patterns, processes, and practices to promote best practices and improve team capabilities.
    * Draft and Review Documentation: Draft and review architectural diagrams, interface specifications, and other design documents to ensure clear communication of data solutions and technical requirements.
    * Cost/Benefit Analysis: Present opportunities with cost/benefit analysis to leadership, guiding sound architectural decisions for scalable and efficient data solutions.
    * Support data flows for an ML platform in GCP; work with data science teams and understand the ML concepts behind the requirements the data must meet.

    #LI-RJ2
    Salary Range: $93,700-$135,000 a year
    $93.7k-135k yearly 25d ago
  • Data Engineer

    LTIMindtree

    Data Scientist Job In Cincinnati, OH

    About Us: LTIMindtree is a global technology consulting and digital solutions company that enables enterprises across industries to reimagine business models, accelerate innovation, and maximize growth by harnessing digital technologies. As a digital transformation partner to more than 700 clients, LTIMindtree brings extensive domain and technology expertise to help drive superior competitive differentiation, customer experiences, and business outcomes in a converging world. Powered by nearly 90,000 talented and entrepreneurial professionals across more than 30 countries, LTIMindtree - a Larsen & Toubro Group company - combines the industry-acclaimed strengths of erstwhile Larsen and Toubro Infotech and Mindtree in solving the most complex business challenges and delivering transformation at scale. For more information, please visit ********************

    Job Title: Data Engineer
    Location: Cincinnati, OH

    Job Summary:

    Skills Required:
    * Data Modeling
    * Domain-Driven Design
    * Data Profiling and general Analysis
    * Data Platform Modernization
    * Data Migration
    * Emphasis on data management for OLTP systems, but can help the Data Warehouse team as well

    Technical Skills Required:
    * ERD / Logical Data Modeling tool (Erwin or similar)
    * Databases and related technologies: Postgres, Oracle
    * Physical Data Model and architecture
    * Strong SQL skills, including stored procedures
    * Reading and understanding JSON

    Preferable:
    * Python
    * Kafka
    * Enterprise Elastic Search

    Bonus:
    * Data Services / REST API
    * Hazelcast
    * Java CRUD

    Benefits/perks listed below may vary depending on the nature of your employment with LTIMindtree ("LTIM"):

    Benefits and Perks:
    * Comprehensive Medical Plan covering Medical, Dental, Vision
    * Short-Term and Long-Term Disability Coverage
    * 401(k) Plan with Company match
    * Life Insurance
    * Vacation Time, Sick Leave, Paid Holidays
    * Paid Paternity and Maternity Leave

    The range displayed on each job posting reflects the minimum and maximum salary target for the position across all US locations. Within the range, individual pay is determined by work location and job level and additional factors including job-related skills, experience, and relevant education or training. Depending on the position offered, other forms of compensation may be provided as part of overall compensation, such as an annual performance-based bonus, sales incentive pay, and other forms of bonus or variable compensation.

    Disclaimer: The compensation and benefits information provided herein is accurate as of the date of this posting.

    LTIMindtree is an equal opportunity employer that is committed to diversity in the workplace. Our employment decisions are made without regard to race, color, creed, religion, sex (including pregnancy, childbirth or related medical conditions), gender identity or expression, national origin, ancestry, age, family-care status, veteran status, marital status, civil union status, domestic partnership status, military service, handicap or disability or history of handicap or disability, genetic information, atypical hereditary cellular or blood trait, union affiliation, affectional or sexual orientation or preference, or any other characteristic protected by applicable federal, state, or local law, except where such considerations are bona fide occupational qualifications permitted by law.

    Safe return to office: In order to comply with LTIMindtree's company COVID-19 vaccine mandate, candidates must be able to provide proof of full vaccination against COVID-19 before or by the date of hire. Alternatively, one may submit a request for reasonable accommodation from LTIMindtree's COVID-19 vaccination mandate for approval, in accordance with applicable state and federal law, by the date of hire. Any request is subject to review through LTIMindtree's applicable processes.
    $75k-101k yearly est. 26d ago
  • Commercial Pricing Actuary (Hybrid #56823)

    DW Simpson Global Actuarial & Analytics Recruitment 4.1 company rating

    Data Scientist Job In Cincinnati, OH

    Commercial Pricing Actuary

    Join a growing company! A P&C insurer is seeking their next Commercial Pricing Actuary. This position is responsible for providing support to Commercial Operations in California, giving pricing recommendations, and preparing rate filing support. (#56823)

    Requirements:
    * Bachelor's degree.
    * ACAS or FCAS designation.
    * 7+ years of actuarial experience.
    * Knowledge of California commercial lines.
    * Strong commercial pricing experience.
    * Excellent communication, interpersonal, and organizational skills.

    Compensation: A salary range of $150K-$230K.
    Location: Fairfield, OH (Hybrid) or Cincinnati, OH (Hybrid)
    $38k-72k yearly est. 19d ago
  • Data Scientist I

    ConstructConnect 4.3 company rating

    Data Scientist Job In Cincinnati, OH

    This position sits within our Product Development division, which develops, tests, and improves our software solutions in an innovative and collaborative environment.

    The Opportunity
    As a Data Scientist, you will work with your manager and Lead Data Scientist to expand the analytics capabilities of the company by identifying business trends and problems through complex big data analysis. You will interpret results from multiple sources using a variety of techniques, ranging from simple data aggregation through statistical analysis to complex data mining. You will design, develop, and implement best-in-class analytics that create quantifiable value for our organization and our clients; prepare big data; implement data models; and develop databases to support the business solutions.

    Responsibilities - What You'll Be Doing:
    * Use data analysis combined with economic and business knowledge to make recommendations on business direction to various stakeholders within the organization
    * Define and develop analytical, statistical, and machine learning algorithms for advanced analysis and prediction
    * Work with programming languages (such as Python, Ruby, Java) and other technologies to do rapid design, prototyping, analysis, simulation of, and experimentation with, advanced algorithms and applications
    * Build scalable predictive models based on data sets
    * Help to identify exceptions and outliers in ConstructConnect-generated datasets
    * Acquire and ingest data from a variety of sources to extend our products and solutions (innovate)
    * Provide estimates of level of work effort and time for completion
    * Participate in scrum ceremonies

    This job description in no way implies that the duties listed here are the only ones that team members can be required to perform.

    Qualifications - What You Bring to the Team:
    * Bachelor's degree in Statistics, Mathematics, Economics, Econometrics, Computer Science, or a related quantitative discipline, with experience, is required
    * Familiarity with R, Python, SAS (or equivalent) programming languages
    * Familiarity with SQL or another database language
    * Strong algorithmic development skills
    * Understanding of applied mathematics, statistics, and economics
    * Experience with a range of data science techniques including clustering, machine learning, and network analysis
    * Creative problem-solver who is passionate about digging into complex problems and devising new approaches to reach results
    * Enthusiastic about learning new techniques
    * Strong communication skills and experience distilling and presenting complex quantitative analysis in an action-oriented way
    * Thrives in a fast-paced, changing environment

    Physical Demands and Work Environment: The physical activities of this position include frequent sitting, telephone communication, and working on a computer for extended periods of time. Visual acuity is required to perform activities close to the eyes. This position is considered Hybrid. Team members are expected to have an established workspace. Ability to work in the Greater Cincinnati/Northern Kentucky area.

    E-Verify Statement: ConstructConnect utilizes the E-Verify program with every potential new hire. This makes it possible for us to make certain that every employee who works for ConstructConnect is eligible to work in the United States. To learn more about E-Verify you can call ************** or visit their website. E-Verify is a registered trademark of the United States Department of Homeland Security.

    Privacy Notice
    $75k-104k yearly est. 23d ago
  • Data Scientist 3 - 22620

    HII

    Data Scientist Job In Wright-Patterson Air Force Base, OH

    Company: HII's Mission Technologies division
    Required Travel: 0 - 10%
    Employment Type: Full Time/Salaried/Exempt
    Anticipated Salary Range: $99,336.00 - $132,500.00
    Security Clearance: Secret
    Level of Experience: Mid

    This opportunity resides with Warfare Systems (WS), a business group within HII's Mission Technologies division. Warfare Systems comprises cyber and mission IT; electronic warfare; and C5ISR systems. HII works within our nation's intelligence and cyber operations communities to defend our interests in cyberspace and anticipate emerging threats. Our capabilities in cybersecurity, network architecture, reverse engineering, software and hardware development uniquely enable us to support sensitive missions for the U.S. military and federal agency partners.

    Meet HII's Mission Technologies Division
    Our team of more than 7,000 professionals worldwide delivers all-domain expertise and advanced technologies in service of mission partners across the globe. Mission Technologies is leading the next evolution of national defense - the data evolution - by accelerating a breadth of national security solutions for government and commercial customers. Our capabilities range from C5ISR, AI and Big Data, cyber operations and synthetic training environments to fleet sustainment, environmental remediation and the largest family of unmanned underwater vehicles in every class. Find the role that's right for you. Apply today. We look forward to meeting you. To learn more about Mission Technologies, click here for a short video: ***************************

    Job Description
    HII-Mission Technologies is currently seeking a skilled Data Scientist/Engineer to design, develop, and implement a database architecture, tools, and techniques for managing and using extremely large data sets by data analysts in our DoD customer's organization. This position will be performed at the DoD customer site at Wright-Patterson Air Force Base.

    Essential Job Responsibilities
    * Investigate, determine, and implement suitable open-source tools for storing, managing, and using extremely large data sets.
    * Investigate, determine, and implement tools and methodologies for extracting key data points for analysis by data analysts.
    * Conduct data exploration, analysis, and feature extraction techniques to incorporate into the DoD customer's organization data pipelines and workflows.
    * Be part of a team responsible for creating, maintaining, and augmenting an on-premises compute environment.
    * Translate requirements into capabilities for technical teams.
    * Present status reports to key internal stakeholders.
    * Evaluate new tools, technologies, and processes to improve the on-premises environment.
    * Gain familiarity with the customer mission to identify and/or develop data management and workflow solutions.
    * Propose solutions and strategies to unique challenges throughout the DoD customer's organization.
    * Perform Linux system configuration and capacity planning for large-scale data analytics on-premises and in the cloud.
    * Additional duties as assigned or required.

    Minimum Qualifications
    * 5 years of relevant experience with a Bachelor's in a related field; 3 years of relevant experience with a Master's in a related field.
    * Experience defining requirements for using and maintaining open-source data analytics, data workflow, and database tools.
    * Develop, implement, and maintain data analytic protocols, standards, and documentation.
    * Design data models based on requirements/needs for complex analysis by data analysts.
    * Define requirements for vendors to implement into their proposed solutions to meet the DoD customer's organization needs and use cases.
    * Experience working with multiple types of datasets and database technologies.
    * Experience building data pipelines from end to end.
    * Experience with multiple industry-standard cloud and on-premises database technologies.
    * Experience with data exploration, analysis, and visualization tools such as Python and Matlab.
    * Experience and familiarity with the Linux operating environment.
    * Experience with open-source containerization technologies and tools, and how to utilize them appropriately for database technologies and analytics.
    * Define internal process improvements to automate repetitive tasks and minimize downtime between analyses.
    * Clearance: Secret clearance to start. Must be able to obtain and maintain a TS/SCI security clearance with enhanced security checks.

    Physical Requirements
    May require working in an office, industrial, or laboratory environment. Capable of climbing ladders and tolerating confined spaces and extreme temperature variances.

    The listed salary range for this role is intended as a good faith estimate based on the role's location, expectations, and responsibilities. When extending an offer, HII's Mission Technologies division takes a variety of factors into consideration which include, but are not limited to, the role's function and a candidate's education or training, work experience, and key skills.

    Together we are working to ensure a future where everyone can be free and thrive. Today's challenges are bigger than ever, and the nation needs the best of us. It's why we're focused on hiring, developing and nurturing our diversity. We believe that diversity among our workforce strengthens the organization, stimulates creativity, promotes the exchange of ideas and enriches the work lives of all our employees. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, physical or mental disability, age, or veteran status or any other basis protected by federal, state, or local law.

    Do You Need Assistance?
    If you need a reasonable accommodation for any part of the employment process, please send an e-mail to ************************** and let us know the nature of your request and your contact information. Reasonable accommodations are considered on a case-by-case basis. Please note that only those inquiries concerning a request for reasonable accommodation will be responded to from this email address. Additionally, you may also call ************** for assistance. Press #3 for HII Mission Technologies.
    $99.3k-132.5k yearly 15d ago
  • Data Scientist/Business Analyst

    Rackner

    Data Scientist Job In Dayton, OH

    Title: Data Scientist/Business Analyst
    Clearance: DoD Secret Eligibility

    About this Role / Program:
    Rackner is seeking a Data Scientist/Business Analyst to join our team to support the AFWERX Technical Operations Branch. AFWERX is the innovation arm of the Department of the Air Force (DAF) and accelerates agile and affordable capability transitions by teaming innovative technology developers with Airman and Guardian talent. AFWERX supports both internal and external (federal and industry partners) users across multiple CONUS locations through client hardware support (NIPR, DREN) and cloud-based (e.g. IaaS, PaaS, SaaS) applications. Additionally, the AFWERX Technical Operations Branch provides Risk Management Framework (RMF) and cybersecurity support to the different AFWERX divisions (i.e. AFVentures, Spark, Prime), including Flight Test Program Management (FTPM) support to both manned and unmanned flight tests. The ideal candidate will have the knowledge and ability to perform tasks related to the technical/professional discipline they are performing.

    Requirements:
    * Bachelor's degree in a related field
    * 6+ years of relevant experience
    * Flexibility to work in fast-paced environments, adapt to changing priorities, and apply strong time management skills while working under tight deadlines
    * Ability to work autonomously in a highly dispersed remote organization
    * Travel is required and expected based on needed support

    What will make you successful:
    * Gather and analyze requirements by working with various stakeholders to understand their needs and translate them into clear and concise requirements for IT projects and initiatives, conducting interviews, workshops, and surveys to gather information and document processes.
    * Analyze existing workflows and identify and document areas for improvement and optimization by streamlining processes, automating tasks, and suggesting and implementing new technologies to enhance efficiency and effectiveness.
    * Assist project managers in planning, executing, and monitoring IT projects by developing project plans, tracking progress, managing risks, and communicating with stakeholders.
    * Collect and analyze data to identify trends, patterns, and insights that can inform decision-making by creating reports, dashboards, and visualizations to communicate findings to leadership and other stakeholders.
    * Research and evaluate emerging technologies to determine their potential value and applicability to the mission by attending and/or participating in industry events, reading research papers, and conducting pilot projects.

    Who We Are:
    Rackner is a software consultancy that builds cloud-native solutions for startups, enterprises, and the public sector. We are an energetic, growing consultancy with a passion for solving big problems for both startups and enterprises. We enable digital transformation for large organizations through the newest in distributed technologies. We are laser focused on end-to-end application development, DevSecOps, AI/ML, and systems architecture, and our methodology focuses on cloud-first and cost-effective innovation. Our customers hail from a diverse, ever-growing list of industries.

    Benefits / Additional Info:
    Rackner embraces and promotes employee development and training and covers the cost of certifications relevant to a position and the technologies/services provided. Fitness/gym membership eligibility, a weekly pay schedule, and employee swag, snacks & events are offered as well!
    * 401K with 100% matching up to 6%
    * Highly competitive PTO
    * Great health insurance with a large network of providers
    * Medical/Dental/Vision
    * Life insurance, and short & long-term disability
    * Industry-leading weekly pay schedule
    * Home office & equipment plan

    #DataScientist #BusinessAnalyst #Airforce #AFWERX #DAF #dashboards #visualizations
    $69k-95k yearly est. 7d ago
  • Data Scientist Analyst

    Sawdey Solution Services 4.2 company rating

    Data Scientist Job In Dayton, OH

    Pay Rate: The annual base salary range for this position is TBD. Please note that the salary information is a general guideline only. At Sawdey Solution Services, we recognize that attracting the best talent is key to our strategy and success as a company. We will consider several factors when extending an offer to an applicant. These factors include (but are not limited to) the position, associated responsibilities, work experience, education, related training, and related skills.

    Position Location: Dayton, OH (Remote)
    Telework/Work-from-Home Authorized: Yes

    About the Role: We are seeking a Data Scientist Analyst.

    Additional Responsibilities Include, but are not Limited To:
    * Proven experience in data analysis, database management, and data visualization.
    * Strong programming skills in languages such as Python, R, and SQL.
    * Experience with machine learning and statistical modeling techniques.
    * Familiarity with data visualization tools such as Tableau or Power BI.
    * Excellent communication and presentation skills, with the ability to translate complex data insights for non-technical audiences.
    * Ability to work independently and collaborate effectively with cross-functional teams.
    * Knowledge of the defense industry and the AFWERX program.
    * Assist with proposal development, if necessary.
    * Perform other duties, as assigned.

    Experience Requirements: Minimum 5 years of Data Science experience.
    Education Requirements: Bachelor's degree in Data Science, Computer Science, Statistics, or a related field.
    Other Required Skills & Abilities: Must be able to effectively communicate with the customer and fulfill all duties and responsibilities as listed in the contract. Must be proficient in the Microsoft Office suite including, but not limited to: Word, PowerPoint, Excel, and Outlook.
    Security Clearance Requirements: Secret.
    US Citizenship Requirements: This position supports a U.S. Government Contract whose terms require Sawdey Solution Services to staff it only with U.S. Citizens.
    $67k-92k yearly est. 10d ago
  • Staff Data Scientist

    GE Aerospace 4.8 company rating

    Data Scientist Job In Evendale, OH

    The Staff Data Scientist will work in teams addressing statistical, machine learning, and data understanding problems in a commercial technology and consultancy development environment. In this role, you will contribute to the development and deployment of modern machine learning, operational research, semantic analysis, and statistical methods for finding structure in large data sets.

    Roles and Responsibilities:
    As a Staff Data Scientist, you will work in teams addressing statistical, machine learning, and artificial intelligence problems in a commercial technology and consultancy development environment. You will be part of a data science or cross-disciplinary team driving AI business solutions involving large, complex data sets. Potential application areas include time series forecasting, machine learning regression and classification, root cause analysis (RCA), simulation and optimization, large language models, and computer vision. The ideal candidate will be responsible for developing and deploying machine learning models in production environments. This role requires a strong technical background, excellent problem-solving skills, and the ability to work collaboratively with data engineers, analysts, and other stakeholders.

    In this role, you will:
    * Design, develop, and deploy machine learning models and algorithms.
    * Understand business problems and identify opportunities to implement data science solutions.
    * Develop, verify, and validate analytics to address customer needs and opportunities.
    * Work in technical teams in the development, deployment, and application of applied analytics, predictive analytics, and prescriptive analytics.
    * Develop and maintain pipelines for Retrieval-Augmented Generation (RAG) and Large Language Models (LLMs).
    * Ensure efficient data retrieval and augmentation processes to support LLM training and inference.
    * Utilize semantic and ontology technologies to enhance data integration and retrieval; ensure data is semantically enriched to support advanced analytics and machine learning models.
    * Participate in Data Science Workouts to shape data science opportunities and identify opportunities to use data science to create customer value.
    * Perform exploratory and targeted data analyses using descriptive statistics and other methods.
    * Work with data engineers on data quality assessment, data cleansing, data analytics, and model productionization.
    * Generate reports, annotated code, and other project artifacts to document, archive, and communicate your work and outcomes.
    * Communicate methods, findings, and hypotheses with stakeholders.

    Minimum Qualifications:
    * Bachelor's degree from an accredited university or college with a minimum of 3 years of professional experience OR an associate's degree with a minimum of 5 years of professional experience.
    * 3 years of proficiency in Python (mandatory).
    * 2 years' experience with machine learning frameworks and deploying models into production environments.
    * Note: Military experience is equivalent to professional experience.

    Eligibility Requirement:
    * Legal authorization to work in the U.S. is required. We will not sponsor individuals for employment visas, now or in the future, for this job.

    Desired Characteristics:
    * Strong analytical and problem-solving skills.
    * Excellent communication and collaboration abilities.
    * Familiarity with cloud platforms (e.g., AWS, Azure, Google Cloud, Databricks) and their machine learning services.
    * Experience with handling unstructured data, including images, videos, and text.
    * Understanding of computer vision techniques and tools.
    * Ability to work in a fast-paced, dynamic environment.
    * Experience with data preprocessing and augmentation tools.
    * Demonstrated expertise in critical thinking and problem-solving methods.
    * Demonstrated skill in defining and delivering customer value.
    * Demonstrated expertise working in team settings in various roles.
    * Demonstrated expertise in presentation and communication skills.
    * Experience with deep learning and neural networks.
    * Knowledge of data governance and compliance standards.
    * Demonstrated awareness of how to succeed in ambiguous circumstances.

    Note: To comply with US immigration and other legal requirements, it is necessary to specify the minimum number of years' experience required for any role based within the USA. For roles outside of the USA, to ensure compliance with applicable legislation, the JDs should focus on the substantive level of experience required for the role and a minimum number of years should NOT be used. This Job Description is intended to provide a high-level guide to the role. However, it is not intended to amend or otherwise restrict/expand the duties required from each individual employee as set out in their respective employment contract and/or as otherwise agreed between an employee and their manager.

    This role requires access to U.S. export-controlled information. If applicable, final offers will be contingent on ability to obtain authorization for access to U.S. export-controlled information from the U.S. Government.

    Additional Information
    GE Aerospace offers a great work environment, professional development, challenging careers, and competitive compensation. GE Aerospace is an Equal Opportunity Employer. Employment decisions are made without regard to race, color, religion, national or ethnic origin, sex, sexual orientation, gender identity or expression, age, disability, protected veteran status or other characteristics protected by law. GE Aerospace will only employ those who are legally authorized to work in the United States for this opening. Any offer of employment is conditioned upon the successful completion of a drug screen (as applicable).

    Relocation Assistance Provided: No
    $68k-89k yearly est. 5d ago
  • Senior Data Scientist

    Medpace 4.5 company rating

    Data Scientist Job In Cincinnati, OH

    We are currently seeking an experienced data scientist to join our Informatics team who will lead advanced analyses of methodological data to inform study design decisions. The Informatics team utilizes informatics principles and techniques to architect, mine, analyze, and visualize clinical trial data to inform study design choices for pharmaceutical development. The informaticist will create predictive data models to identify and analyze patterns, then program compelling visualizations of the data to support feasibility strategies. The team is seeking an experienced candidate for a senior-level position to contribute new skills to our team, support team growth, and foster fellow analyst development.

    The Informatics Team is a highly collaborative team with members in both the Cincinnati and London offices. This team supports clinical operations, medical, and feasibility teams with advanced data query and analysis. The Informatics Team also works side-by-side with business analytics and software engineering to architect innovative data storage and access solutions for optimal data utilization strategies. If you are an individual with experience in informatics, data science, statistics, or epidemiology, please review the following career opportunity.

    Responsibilities
    * Utilize advanced statistical methods to develop predictive algorithms of methodological and clinical data;
    * Develop dashboards and web applications to enable Medpace teams' access to, and interpretation of, clinical operations data;
    * Translate the results of feasibility research and analysis into compelling data visualizations which illustrate the overall feasibility strategy for proposals and bid defense meetings;
    * Perform comprehensive review of data sources to deliver high-quality informatics data and analysis to teams;
    * Develop and map database architecture of methodological and clinical data systems;
    * Support departmental process improvement initiatives; and
    * Participate in training and development of more junior team members.

    Qualifications
    * Master's degree in informatics, computer science/engineering, health information, statistics, or a related field required; PhD preferred
    * 1-2 years of direct experience applying machine learning to pharmaceutical or clinical data (or translatable artificial intelligence [AI] techniques);
    * Advanced computer programming skills (preferred language: R);
    * Analytical thinker with great attention to detail;
    * Ability to prioritize multiple projects and tasks within tight timelines; and
    * Excellent written and verbal communication skills

    Medpace Overview
    Medpace is a full-service clinical contract research organization (CRO). We provide Phase I-IV clinical development services to the biotechnology, pharmaceutical, and medical device industries. Our mission is to accelerate the global development of safe and effective medical therapeutics through a scientific and disciplined approach. We leverage local regulatory and therapeutic expertise across all major areas including oncology, cardiology, metabolic disease, endocrinology, central nervous system, anti-viral, and anti-infective. Headquartered in Cincinnati, Ohio, Medpace employs more than 5,000 people across 40+ countries.

    Why Medpace?
    People. Purpose. Passion. Make a Difference Tomorrow. Join Us Today. The work we've done over the past 30+ years has positively impacted the lives of countless patients and families who face hundreds of diseases across all key therapeutic areas. The work we do today will improve the lives of people living with illness and disease in the future.

    Cincinnati Perks
    * Cincinnati Campus Overview
    * Flexible work environment
    * Competitive PTO packages, starting at 20+ days
    * Competitive compensation and benefits package
    * Company-sponsored employee appreciation events
    * Employee health and wellness initiatives
    * Community involvement with local nonprofit organizations
    * Discounts on local sports games, fitness gyms, and attractions
    * Modern, eco-friendly campus with an on-site fitness center
    * Structured career paths with opportunities for professional growth
    * Discounted tuition for UC online programs

    Awards
    * Named a Top Workplace in 2024 by The Cincinnati Enquirer
    * Recognized by Forbes as one of America's Most Successful Midsize Companies in 2021, 2022, 2023 and 2024
    * Continually recognized with CRO Leadership Awards from Life Science Leader magazine based on expertise, quality, capabilities, reliability, and compatibility

    What to Expect Next
    A Medpace team member will review your qualifications and, if interested, you will be contacted with details for next steps.

    EO/AA Employer M/F/Disability/Vets
    $83k-113k yearly est. 60d+ ago
  • Senior Data Scientist

    GAIC Great American Insurance Company

    Data Scientist Job In Cincinnati, OH

    Be Here. Be Great. Working for a leader in the insurance industry means opportunity for you. Great American Insurance Group's member companies are subsidiaries of American Financial Group. We combine a "small company" culture where your ideas will be heard with "big company" expertise to help you succeed. With over 30 specialty and property and casualty operations, there are always opportunities here to learn and grow.

    At Great American, we value diversity and recognize the benefits gained when people from different cultures, backgrounds and experiences work collaboratively to achieve business results. We are intentionally focused on fostering an inclusive culture and know valuing diversity is an essential leadership quality. Our goal is to create a workplace where all employees feel included, empowered and enabled to perform at their best.

    P&C IT Services provides professional services to help our business units and corporate functions use technology to create, manage, and optimize information and business processes. IT Services can include a wide range of activities such as software development, data management, Cloud services, IT security, network security, technical support, establishing and overseeing access rights, procuring and maintaining equipment or software, managing the infrastructure, and defining security procedures. The overall goal of IT Services is to provide technology solutions that increase efficiency, reduce costs, and give our company a competitive advantage over our competitors.

    The Emerging Technology team within P&C IT Services is seeking a Senior Data Scientist to work a hybrid schedule out of the Cincinnati office.

    The main job of a Data Scientist is to know how to turn a business problem into a math problem. The lesser job of a Data Scientist is using computers to turn that math problem into a business solution. You must know how to do the second one and, at minimum, want to know how to do the first and be driven to learn.

    You will work on a highly collaborative team of data professionals who are excited to solve the business needs at Great American. Having a strong work ethic, intellectual honesty, and good ideas are what matter, not seniority and job title. Be willing to learn, be willing to teach, champion your ideas, be the first to admit when they are wrong, and even if you aren't wrong, be willing to disagree, but commit to the decisions of management. If you do this, you will succeed.

    Responsibilities:
    * Analyzes internal and external data and performs statistical modeling to develop actionable insights and recommendations to management regarding business and product performance.
    * Develops data mining architectures/models/protocols, statistical reporting, and data analysis methodologies to identify trends and extract knowledge from large, high-dimensional data sets.
    * Identifies areas to increase efficiency and automation of data analysis processes.
    * Translates business problems/opportunities into technical requirements and data science specifications.
    * Creates solutions that enable enhanced business performance and decision making through a range of data preparation, modeling, analysis and/or visualization techniques.
    * Uses artificial intelligence and machine learning to develop process automation.
    * Researches and applies knowledge of existing and emerging data science principles, theories, and techniques to inform business decisions.
    * Develops and maintains strong business knowledge and customer relationships.
    * May estimate the economic impact of operational decisions.
    * Actively participates in scrum planning and documentation in support of producing minimum viable products as well as product long-term roadmaps.
    * Performs other duties as assigned.

    What else a Senior Data Scientist should bring:
    * Understand how the black box comes to its answers and be able to explain that to non-technical users.
    * Ideate and discuss alternate paths to achieving tactical goals while staying focused on the strategic goals.
    * Keep abreast of the latest developments in the AI/ML space.
    * Manage interns and ensure the work products they produce meet team standards.

    Requirements:
    * 6 years of experience.
    * Internal drive to expand your personal domain expertise and an insatiable curiosity about how to use that expertise in collaboration with other team members to mitigate our current and future customers' financial insecurities.
    * B.S. (or above) in Applied Math, Statistics, Engineering, Finance, Science, or Computer Science, or equivalent experience.
    * Coding skills in Python and the core AI/ML libraries (required)
    * PyTorch and the HuggingFace library (bonus)
    * Know your math:
      * Statistics - What is a hypothesis test and what is the central limit theorem? (required)
      * Linear Algebra - Be able to explain Singular Value Decomposition and what an Eigenvector is.
      * Stochastic processes (bonus)
      * Bayesian Statistics (I almost want to say required, but I won't)
    * A solid understanding of finance:
      * Option pricing models - Why is this important to insurance?
      * What is the difference between risk avoidance, mitigation, and mediation?

    Business Unit: Property & Casualty IT Services

    Benefits:
    Compensation varies by role, position level, and location. Individual pay is influenced by skills, education, training, certifications, experience, and the role's scope and complexity, along with business needs.

    We offer a competitive Total Rewards package, including medical, dental, and vision plans starting on day one, PTO, paid holidays, commuter benefits, an employee stock purchase plan, education reimbursement, paid parental leave/adoption assistance, and a 401(k) plan with company match. These benefits are available to eligible full-time and part-time employees.

    Your recruiter can provide more details about our total rewards and specific compensation ranges during the hiring process.
    $74k-102k yearly est. 38d ago
  • Data Science Intern

    KBR Wyle Services

    Data Scientist Job In Beavercreek, OH

    Title: Data Science Intern

    KBR is looking for highly motivated rising Juniors, Seniors, and Graduate students seeking business, science, and software development opportunities that provide challenging and meaningful work experience to be a part of our 2025 Intern class.

    * Learn and apply state-of-the-art data science practices as a member of an interdisciplinary team
    * Use data science and statistical techniques to identify data of interest, extract structured data from unstructured sources, and apply other data wrangling patterns to make our corpus of data more useful
    * Apply techniques from natural language processing (NLP) to understand textual data, including named entity recognition, topic modeling, and sentiment analysis
    * Use neural networks and other machine learning (ML) techniques to find patterns in data and to predict likely outcomes using a variety of input data types and sources

    Anticipated Skill Set:
    * Strong analytical and problem-solving skills with the willingness to learn
    * Strong communication and organizational skills, as well as the ability to work independently and as part of a team
    * Proficient in one or more programming languages for data manipulation and analysis, such as Python, R, SQL, C, C#, C++, etc.
    * Proficient in MS Office products (Word, Excel, PowerPoint)
    * Experience retrieving and analyzing datasets from databases and other systems
    * Experience with various data visualization tools such as Tableau, Qlik Analytics, MS Power BI, Plotly, or Matplotlib
    * Experience with "big data" tools such as Hadoop, Spark, Kafka, Databricks, or Palantir Foundry is preferred

    Ability to obtain a DoD security clearance is required to be considered for this position.

    Belong, Connect and Grow at KBR
    At KBR, we are passionate about our people and our Zero Harm culture. These inform all that we do and are at the heart of our commitment to, and ongoing journey toward being a People First company. That commitment is central to our team of teams philosophy and fosters an environment where everyone can Belong, Connect and Grow. We Deliver - Together.

    KBR is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, disability, sex, sexual orientation, gender identity or expression, age, national origin, veteran status, genetic information, union status and/or beliefs, or any other characteristic protected by federal, state, or local law.
    $28k-47k yearly est. 50d ago
  • Museum Data Collection Intern

    Taft Museum of Art

    Data Scientist Job In Cincinnati, OH

    Position: Museum Data Collection Intern
    Reports to: Manager of Visitor Experience
    Team: Visitor Experience
    Status: Hourly, Part-Time
    Compensation: $14.00 to $21.00 per hour, commensurate with experience and education/certification
    Hours: Sunday 12:00 - 4:00 p.m., and one alternating day (Wednesday, Thursday, or Friday) 12:00 - 4:00 p.m. Scheduling will be determined by the Manager of Visitor Experience as events dictate.
    Benefits: Aflac, Discretionary Time Off (DTO) after 250 hours worked, free downtown parking, Employee Assistance Program (EAP), Museum membership, and other special discounts.

    The Taft Museum of Art, located in downtown Cincinnati, is seeking a part-time Museum Data Collection Intern, reporting directly to the Manager of Visitor Experience. This position is unique and requires a positive, highly organized team player who communicates proactively, is attentive to detail, manages multiple priorities, and has excellent written communication and collaboration skills. Enjoy this rare opportunity to join the staff of one of the finest small art museums in the United States.

    Please ensure that you submit a cover letter, resume, and three references. Please submit only through our online portal at *********************************** Our team will review your information, and we will get back to you with the next steps. If you have questions, please email HR at ********************; no phone calls, please.

    POSITION SUMMARY
    The Museum Data Collection Associate works within the Visitor Experience Department to collect data from guests. Studies include but are not limited to research and evaluation measuring overall guest demographics, guest experience, and guest outcomes related to exhibition or program experiences. These studies support the mission of the museum by collecting critical information measuring the impact of arts experiences at the Taft Museum of Art. The primary methods used by the Data Collector will be surveys and interviews. The weekly schedule for this position is based upon the timelines for research and evaluation studies at the museum and requires work hours during museum operating hours (which include day, night, weekday, and weekend). It is estimated at up to 32 hours per month.

    CORE RESPONSIBILITIES
    * Collects data using methods such as observations, interviews, surveys, and video or audio recording through research and evaluation studies to better understand learning experiences.
    * Supports the work of the department and ensures projects run smoothly by scheduling data collection, assisting with the flow of information, making telephone calls, updating files, and tracking elements of the project.
    * Manages data for research/evaluation studies by filing, entering, and tracking data to ensure studies are well organized and accurate.
    * Contributes to creating data collection instruments and to collecting, analyzing, and reporting on study data under the direction of a study leader to better understand learning experiences.
    * Performs other work-related duties as required by the Manager.

    REQUIRED MINIMUM EDUCATION & EXPERIENCE
    * Post-high-school course work, technical degree, associate's degree, National Career Readiness Certificate, or business or vocational certificate preferred.
    * Excellent verbal and written communication skills.
    * Experience working directly with the public and providing first-rate customer service.
    * Strong organizational skills: ability to prioritize and multitask.
    * Proficient in Microsoft Office.

    DESIRED SKILLS AND CAPABILITIES
    * Project coordination and data management skills, such as tracking and organizing collected data.
    * Demonstrated ability in working with computers and Microsoft Office, and some experience with databases.
    * Experience with data collection methods with human subjects, such as interviews, survey administration, and/or structured observations.
    * Experience working in collaborative teams.
    * Experience with basic qualitative or quantitative analysis, such as coding open-ended data or calculating frequencies and means.
    * Experience using quantitative and/or qualitative coding software.

    MUSEUM SHARED RESPONSIBILITIES
    Must embrace the museum's core values of respect, integrity, excellence, creativity, and collaboration and demonstrate this understanding through words, behaviors, and interactions with our guests, staff members, volunteers, and the public. Is ready to learn and to teach every day. Shares knowledge freely with colleagues and pursues opportunities to gain new skills to enhance our success as a team. Appreciates, understands, and values each staff member's expertise, background, experience, strengths, and unique perspective, sharing time, energy, and knowledge with others to ensure everyone has the highest potential to succeed. Strives to achieve excellence in all tasks and goals. Demonstrates professionalism on and off the job; always represents the Taft Museum of Art positively and professionally. Speaks truthfully and fulfills promises and obligations in all museum dealings. Is comfortable and can communicate with people of diverse backgrounds. Focuses on delivering the museum's mission. Adheres to all current museum policies, procedures, protocols, and processes. Creates a pleasant work environment by being a positive influence and respectful to every person.

    The success of the Taft Museum of Art is driven by our core values of respect, integrity, excellence, creativity, and collaboration, as exemplified by our team members. Our VIEW:
    * Value diversity, equity, access, and inclusion as drivers for attracting and retaining a diverse team of board members, staff, and volunteers who feel empowered to deliver excellence. It also is the key to reaching a diverse community and audience.
    * Include multiple perspectives and believe that differing views strengthen our museum by stretching us to learn, experience, and expand our thinking each day.
    * Embrace our mission and vision to explore the hidden gems connected to our historic house, the people that lived in it for the first 100 years, and the extensive art collection, bringing all of this to life and making it relevant and unique to each guest.
    * Work together as a board and staff to ensure that our members, partners, and key stakeholders reflect and embrace these core values.
    $14-21 hourly Easy Apply 60d+ ago
  • Data Engineer

    DMI 3.5 company rating

    Data Scientist Job In Cincinnati, OH

    DMI is a leading provider of digital services and technology solutions, headquartered in Tysons Corner, VA. With a focus on end-to-end managed IT services, including managed mobility, cloud, cybersecurity, network operations, and application development, DMI supports public sector agencies and commercial enterprises around the globe. Recognized as a Top Workplace, DMI is committed to delivering secure, efficient, and cost-effective solutions that drive measurable results. Learn more at *************

    About the Opportunity
    DMI, LLC is seeking a Data Engineer to help build, optimize, and scale our data pipelines and to streamline the ingestion, transformation, and analysis of complex telecom data. This role will be instrumental in integrating large-scale invoices, usage records, and cost allocations into our AWS-based analytics platform while ensuring data integrity and performance.

    Duties and Responsibilities:
    * Develop and maintain scalable ETL/ELT pipelines to process large volumes of telecom and financial data from carriers, APIs, and external sources.
    * Design and optimize data models for structured and semi-structured data in PostgreSQL, Redshift, and other databases.
    * Automate data processing workflows to improve efficiency and accuracy in processing telecom invoices, CDRs (Call Detail Records), and billing data.
    * Ensure data quality and integrity, implementing error handling, validation, and reconciliation techniques.
    * Optimize query performance and database structures to support analytics and reporting needs.
    * Collaborate with cross-functional teams including Product, Engineering, DevOps, and Expense Management to understand data requirements and ensure alignment with business goals.
    * Implement monitoring and logging solutions for data pipelines, ensuring reliability and auditability.
    * Work with cloud-based infrastructure in AWS, utilizing services like Lambda, S3, Glue, Redshift, RDS, and Step Functions.
    * Integrate machine learning capabilities into the data pipeline to enhance automation and predictive analytics.

    Qualifications

    Education and Years of Experience:
    * Bachelor's degree in Computer Science, Data Engineering, Information Systems, or a related field (or equivalent experience). A Master's degree in Data Science, Analytics, or Cloud Computing is a plus but not required.
    * Relevant certifications in AWS (e.g., AWS Certified Data Analytics, AWS Certified Solutions Architect) or database technologies (e.g., PostgreSQL, Redshift, Snowflake) are a plus.
    * 3+ years of experience as a Data Engineer, working with large-scale data processing.
    * Proficiency in Python, SQL, and data transformation frameworks (e.g., dbt, Apache Airflow, or similar).
    * Experience working with AWS cloud services (Lambda, S3, Glue, Redshift, RDS, Step Functions).
    * Strong understanding of ETL/ELT methodologies, data warehousing, and database optimization.
    * Experience with handling large telecom invoices, parsing carrier billing formats, and reconciling financial data is a plus.
    * Experience integrating data from APIs, flat files, XML, JSON, and databases.
    * Knowledge of data governance, compliance, and security best practices.
    * Familiarity with data lake architecture and multi-tenant SaaS platforms.
    * Strong problem-solving and analytical skills, with the ability to debug and optimize complex data processes.

    Required and Desired Skills/Certifications:
    * Experience with big data technologies such as Spark, Snowflake, or Hadoop.
    * Prior experience working in Telecom Expense Management (TEM) or FinTech.
    * Exposure to DevOps principles, CI/CD pipelines, and infrastructure-as-code tools (Terraform, CloudFormation).
    * Familiarity with BI tools such as Domo, Tableau, or Apache Superset.
    * Knowledge of AI/ML integration for anomaly detection and predictive analytics in telecom data.

    Min Citizenship Status Required: Must be a U.S. Citizen
    Physical Requirements: No physical requirements for this position.
    Location: Cincinnati, OH (potential for remote)
    #LI-EK1

    Working at DMI
    DMI is a diverse, prosperous, and rewarding place to work. Being part of the DMI family means we care about your wellbeing. As such, we offer a variety of perks and benefits that help meet various interests and needs, while still having the opportunity to work directly with a number of our award-winning, Fortune 1000 clients. The following categories make up your DMI wellbeing:
    * Convenience/Concierge - Virtual visits through health insurance, pet insurance, commuter benefits, discount tickets for movies, travel, and many other items to provide convenience.
    * Development - Annual performance management, continuing education, tuition assistance, and internal job opportunities, along with career enrichment and advancement to help each employee with their professional and personal development.
    * Financial - Generous 401k matches for both pre-tax and post-tax (ROTH) contributions, along with financial wellness education, EAP, Life Insurance, and Disability, help provide financial stability for each DMI employee.
    * Recognition - Great achievements do not go unnoticed by DMI, through the Annual Awards ceremony, service anniversaries, peer-to-peer acknowledgment, and employee referral bonuses.
    * Wellness - Healthcare benefits, wellness programs, flu shots, biometric screenings, and several other wellness options.

    Employees are valued for their talents and contributions. We all take pride in helping our customers achieve their goals, which in turn contributes to the overall success of the company.

    ***************** No Agencies Please *****************

    Applicants selected may be subject to a government security investigation and must meet eligibility requirements for access to classified information. US citizenship may be required for some positions.
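    As a loose illustration of the ETL/ELT orchestration this posting describes, the sketch below shows a minimal Apache Airflow DAG (one of the transformation frameworks the posting names as an example). It is only a hypothetical outline: the DAG id, task names, and helper functions are invented for illustration, and the step bodies are left as stubs.

        from datetime import datetime

        from airflow import DAG
        from airflow.operators.python import PythonOperator


        def extract_invoices():
            # Pull raw carrier invoice files from their source systems (stub).
            ...


        def transform_invoices():
            # Parse, validate, and reconcile invoice line items (stub).
            ...


        def load_invoices():
            # Load the conformed records into the analytics warehouse (stub).
            ...


        # Airflow 2.4+ syntax; older versions use schedule_interval instead of schedule.
        with DAG(
            dag_id="telecom_invoice_etl",
            start_date=datetime(2024, 1, 1),
            schedule="@daily",
            catchup=False,
        ) as dag:
            extract = PythonOperator(task_id="extract", python_callable=extract_invoices)
            transform = PythonOperator(task_id="transform", python_callable=transform_invoices)
            load = PythonOperator(task_id="load", python_callable=load_invoices)

            extract >> transform >> load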
    $78k-108k yearly est. 9d ago
  • Data Engineer Level 2

    ACL Digital

    Data Scientist Job In Blue Ash, OH

    Accountable for developing and delivering technological responses to targeted business outcomes. Analyze, design and develop enterprise data and information architecture deliverables, focusing on data as an asset for the enterprise. Understand and follow reusable standards, design patterns, guidelines, and configurations to deliver valuable data and information across the enterprise, including direct collaboration with 84.51, where needed.
    2+ years of experience with automation production systems (Ansible Tower, Jenkins, Puppet, or Selenium)
    Working knowledge of databases and SQL
    Experience with software development methodologies and the SDLC
    Candidates should possess a problem-solving attitude and be able to work independently
    Must be very organized, able to balance multiple priorities, and self-motivated
    Key Responsibilities:
    Experience in administration and configuration of API gateways (e.g., Apigee/Kong)
    Apply cloud computing skills to deploy upgrades and fixes
    Design, develop, and implement integrations based on user feedback.
    Troubleshoot production issues and coordinate with the development team to streamline code deployment.
    Implement automation tools and frameworks (CI/CD pipelines).
    Analyze code and communicate detailed reviews to development teams to ensure a marked improvement in applications and the timely completion of products.
    Collaborate with team members to improve the company's engineering tools, systems and procedures, and data security.
    Deliver quality customer service and resolve end-user issues in a timely manner
    Draft architectural diagrams, interface specifications and other design documents
    Participate in the development and communication of data strategy and roadmaps across the technology organization to support the project portfolio and business strategy
    Innovate, develop, and drive the development and communication of data strategy and roadmaps across the technology organization to support the project portfolio
    Drive the development and communication of enterprise standards for data domains and data solutions, focusing on simplified integration and streamlined operational and analytical uses
    Drive digital innovation by leveraging innovative new technologies and approaches to renovate, extend, and transform the existing core data assets, including SQL-based, NoSQL-based, and Cloud-based data platforms
    Define high-level migration plans to address the gaps between the current and future state, typically in sync with the budgeting or other capital planning processes
    Lead the analysis of the technology environment to detect critical deficiencies and recommend solutions for improvement
    Mentor team members in data principles, patterns, processes and practices
    Promote the reuse of data assets, including the management of the data catalog for reference
    Draft and review architectural diagrams, interface specifications and other design documents
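    Troubleshooting production issues behind an API gateway such as Apigee or Kong often starts with a scripted smoke test of the exposed routes. A minimal, generic sketch is below; the gateway URL and paths are hypothetical and the check uses only the standard requests library rather than any gateway-specific API.

```python
import sys
import requests

# Hypothetical gateway base URL and proxy paths -- substitute your environment's values.
GATEWAY_BASE = "https://api-gateway.example.com"
HEALTH_PATHS = ["/status", "/v1/orders/ping"]

def smoke_test(base_url: str, paths: list[str], timeout: float = 5.0) -> bool:
    """Return True only if every checked route answers with an HTTP 2xx status."""
    ok = True
    for path in paths:
        try:
            resp = requests.get(base_url + path, timeout=timeout)
            healthy = 200 <= resp.status_code < 300
            print(f"{path}: HTTP {resp.status_code}")
        except requests.RequestException as exc:
            print(f"{path}: request failed ({exc})")
            healthy = False
        ok = ok and healthy
    return ok

if __name__ == "__main__":
    # Non-zero exit code lets a CI/CD stage (Jenkins, GitHub Actions, etc.) fail the deployment.
    sys.exit(0 if smoke_test(GATEWAY_BASE, HEALTH_PATHS) else 1)
```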
    $75k-101k yearly est. 9d ago
  • Data Engineer/Data Administrator

    Blue Bridge People

    Data Scientist Job In Erlanger, KY

    Data Engineer/Database Administrator
    As a Data Engineer/Data Administrator, you will play a pivotal role in designing, building, and maintaining data infrastructure to support modern data solutions. Your expertise will drive the optimization of applications, streamline data workflows, and ensure the seamless integration of cutting-edge tools and platforms. A key focus will be leveraging cloud-native solutions, with a strong emphasis on Azure and AWS cloud engineering, and advanced SQL data engineering practices to enable scalable and efficient data management.
    Key Responsibilities:
    Lead the design and implementation of database structures, including star schema models, to optimize performance and support analytics.
    Develop, test, and maintain data pipelines using tools such as Azure Data Factory (ADF), Databricks, and Azure SQL for efficient ETL workflows.
    Collaborate with development and analytics teams to refine database architectures for scalability and reliability.
    Monitor and address database performance, ensuring high availability and efficiency.
    Implement and support medallion architecture to manage and optimize data flows across bronze, silver, and gold layers.
    Apply expertise in backup and recovery processes, database security best practices, and integration of external APIs.
    Preferred Qualifications:
    Strong knowledge of database design, including star schema modeling and performance tuning.
    Experience with SQL Server, Azure Synapse, and AWS cloud services.
    Hands-on expertise with ADF, Databricks, and Apache Spark for ETL and data engineering workflows.
    Familiarity with modern cloud-native tools for data integration and orchestration.
    Knowledge of data visualization tools such as Microsoft Fabric and Power BI SaaS products.
    This position offers an exciting opportunity to shape modern data solutions, leveraging innovative tools and methodologies to drive impactful results.
    Salary: $110-140K/year with 10-15% bonus potential
    This is a 40-hour-per-week onsite opportunity.
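    The medallion architecture referenced above promotes raw "bronze" records into cleansed "silver" tables before they feed "gold" reporting models. A minimal PySpark sketch of the bronze-to-silver hop is shown below, assuming a Databricks/Delta Lake environment; the paths and column names are illustrative only.

```python
from pyspark.sql import SparkSession, functions as F

# Paths and column names are illustrative -- adapt to your lakehouse layout.
BRONZE_PATH = "/mnt/lake/bronze/billing_events"
SILVER_PATH = "/mnt/lake/silver/billing_events"

spark = (
    SparkSession.builder
    .appName("bronze-to-silver")
    .getOrCreate()
)

# Bronze: raw ingested records, kept as-landed.
bronze = spark.read.format("delta").load(BRONZE_PATH)

# Silver: deduplicated, typed, and cleansed rows ready for star-schema modeling downstream.
silver = (
    bronze
    .dropDuplicates(["event_id"])
    .filter(F.col("amount").isNotNull())
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .withColumn("event_date", F.to_date("event_ts"))
)

(silver.write
    .format("delta")
    .mode("overwrite")
    .save(SILVER_PATH))
```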
    $110k-140k yearly 60d+ ago
  • Senior Data Engineer - Hiring Now!! 5+ Roles to be filled IMMEDIATELY!!!

    Revive Staffing Solutions

    Data Scientist Job In Erlanger, KY

    Job Description:
    1. Responsible for designing, building, refactoring, and maintaining data pipelines using Microsoft Azure, SQL, Azure Data Factory, Azure Synapse, Databricks, Python, and PySpark to meet business requirements for reporting, analysis, and data science
    2. Responsible for teaching, adhering to, and contributing to DataOps and MLOps standards and best practices to accelerate and continuously improve data system performance
    3. Responsible for designing and integrating fault tolerance and enhancements into data pipelines to improve quality and performance
    4. Responsible for leading and performing root cause analysis and solving problems using analytical and technical skills to optimize data delivery and reduce costs
    5. Engages business end users and shares responsibility for leading a delivery team.
    6. Responsible for mentoring Data Engineers at all levels of experience
    How you will do it
    • Advanced experience with Microsoft Azure, SQL, Azure Data Factory, Azure Synapse, Databricks, Python, PySpark, SAP Datasphere, Power BI or other cloud-based data systems
    • Advanced experience with Azure DevOps, GitHub, CI/CD
    • Advanced experience with database storage systems such as cloud, relational, mainframe, data lake, and data warehouse
    • Advanced experience building cloud ETL pipelines using code or ETL platforms, utilizing techniques to extract value from large, disconnected datasets
    • Experience presenting conceptual and technical improvements to influence decisions
    • Continuous learning to upskill in data engineering techniques and business acumen
    What we look for
    • Bachelor's or Master's degree in computer science, software engineering, information technology or an equivalent combination of data engineering professional experience and education.
    • 7+ years of proven Data Engineering experience in a complex agile environment, including database connections, APIs, or file-based integrations
    • Advanced experience with data warehousing concepts and agile methodology
    • Advanced experience designing and coding data manipulations and applying processing techniques
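    One common way to add the fault tolerance this role calls for is to wrap individual pipeline steps in a retry policy before escalating a failure. A minimal, framework-agnostic Python sketch follows; the step name and its body are hypothetical placeholders for an ADF- or Databricks-triggered task.

```python
import logging
import time
from functools import wraps

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def with_retries(attempts: int = 3, backoff_seconds: float = 5.0):
    """Retry a flaky pipeline step with linear backoff before giving up."""
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            for attempt in range(1, attempts + 1):
                try:
                    return func(*args, **kwargs)
                except Exception as exc:  # in real use, narrow this to transient error types
                    log.warning("step %s failed (attempt %d/%d): %s",
                                func.__name__, attempt, attempts, exc)
                    if attempt == attempts:
                        raise
                    time.sleep(backoff_seconds * attempt)
        return wrapper
    return decorator

@with_retries(attempts=3)
def load_daily_extract(run_date: str) -> int:
    # Placeholder body for a real extract/load step.
    return 0
```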
    $71k-96k yearly est. 60d+ ago
  • Staff Data Scientist

    GE Aerospace 4.8company rating

    Data Scientist Job In Evendale, OH

    As a Staff Data Scientist, you will work in teams addressing statistical, machine learning, and artificial intelligence problems in a commercial technology and consultancy development environment. You will be part of a data science or cross-disciplinary team driving AI business solutions involving large, complex data sets. Potential application areas include time series forecasting, machine learning regression and classification, root cause analysis (RCA), simulation and optimization, large language models, and computer vision. The ideal candidate will be responsible for developing and deploying machine learning models in production environments. This role requires a strong technical background, excellent problem-solving skills, and the ability to work collaboratively with data engineers, analysts, and other stakeholders.
    Roles and Responsibilities:
    * Design, develop, and deploy machine learning models and algorithms
    * Understand business problems and identify opportunities to implement data science solutions.
    * Develop, verify, and validate analytics to address customer needs and opportunities.
    * Work in technical teams in the development, deployment, and application of applied analytics, predictive analytics, and prescriptive analytics.
    * Develop and maintain pipelines for Retrieval-Augmented Generation (RAG) and Large Language Models (LLMs).
    * Ensure efficient data retrieval and augmentation processes to support LLM training and inference.
    * Utilize semantic and ontology technologies to enhance data integration and retrieval. Ensure data is semantically enriched to support advanced analytics and machine learning models.
    * Participate in Data Science Workouts to shape Data Science opportunities and identify opportunities to use data science to create customer value.
    * Perform exploratory and targeted data analyses using descriptive statistics and other methods.
    * Work with data engineers on data quality assessment, data cleansing, data analytics, and model productionization
    * Generate reports, annotated code, and other project artifacts to document, archive, and communicate your work and outcomes.
    * Communicate methods, findings, and hypotheses with stakeholders.
    Minimum Qualifications:
    * Bachelor's degree from an accredited university or college with a minimum of 3 years of professional experience OR an associate's degree with a minimum of 5 years of professional experience
    * 3 years of proficiency in Python (mandatory).
    * 2 years' experience with machine learning frameworks and deploying models into production environments
    * Note: Military experience is equivalent to professional experience
    Eligibility Requirement:
    * Legal authorization to work in the U.S. is required. We will not sponsor individuals for employment visas, now or in the future, for this job.
    Desired Characteristics:
    * Strong analytical and problem-solving skills.
    * Excellent communication and collaboration abilities.
    * Familiarity with cloud platforms (e.g., AWS, Azure, Google Cloud, Databricks) and their machine learning services.
    * Experience with handling unstructured data, including images, videos, and text.
    * Understanding of computer vision techniques and tools
    * Ability to work in a fast-paced, dynamic environment.
    * Experience with data preprocessing and augmentation tools.
    * Demonstrated expertise in critical thinking and problem-solving methods
    * Demonstrated skill in defining and delivering customer value.
    * Demonstrated expertise working in team settings in various roles
    * Demonstrated expertise in presentation and communication skills.
    * Experience with deep learning and neural networks.
    * Knowledge of data governance and compliance standards.
    Note: To comply with US immigration and other legal requirements, it is necessary to specify the minimum number of years' experience required for any role based within the USA. For roles outside of the USA, to ensure compliance with applicable legislation, the JDs should focus on the substantive level of experience required for the role and a minimum number of years should NOT be used.
    This Job Description is intended to provide a high-level guide to the role. However, it is not intended to amend or otherwise restrict/expand the duties required from each individual employee as set out in their respective employment contract and/or as otherwise agreed between an employee and their manager.
    This role requires access to U.S. export-controlled information. If applicable, final offers will be contingent on ability to obtain authorization for access to U.S. export-controlled information from the U.S. Government.
    Additional Information
    GE Aerospace offers a great work environment, professional development, challenging careers, and competitive compensation. GE Aerospace is an Equal Opportunity Employer. Employment decisions are made without regard to race, color, religion, national or ethnic origin, sex, sexual orientation, gender identity or expression, age, disability, protected veteran status or other characteristics protected by law.
    GE Aerospace will only employ those who are legally authorized to work in the United States for this opening. Any offer of employment is conditioned upon the successful completion of a drug screen (as applicable).
    Relocation Assistance Provided: Yes
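    The RAG pipelines mentioned in this listing hinge on a retrieval step that selects relevant context before prompting an LLM. A toy sketch of that step is below, using TF-IDF cosine similarity from scikit-learn in place of the embedding models and vector stores a production pipeline would use; the document snippets and query are invented for illustration.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Toy document store; a production RAG pipeline would use embeddings and a vector database.
documents = [
    "Engine vibration thresholds are defined in maintenance manual section 7.",
    "Borescope inspections are required every 500 flight hours.",
    "Fuel nozzle replacement procedures reference service bulletin SB-123.",
]

def retrieve(query: str, docs: list[str], top_k: int = 2) -> list[str]:
    """Return the top_k documents most similar to the query by TF-IDF cosine similarity."""
    vectorizer = TfidfVectorizer()
    doc_matrix = vectorizer.fit_transform(docs)
    query_vec = vectorizer.transform([query])
    scores = cosine_similarity(query_vec, doc_matrix).ravel()
    ranked = scores.argsort()[::-1][:top_k]
    return [docs[i] for i in ranked]

if __name__ == "__main__":
    context = retrieve("When is a borescope inspection due?", documents)
    prompt = "Answer using only this context:\n" + "\n".join(context)
    print(prompt)  # this augmented prompt would then be sent to an LLM
```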
    $68k-89k yearly est. 9d ago
  • GCP Data Engineer

    Tata Consulting Services 4.3company rating

    Data Scientist Job In Blue Ash, OH

    As an Advanced Data Engineer, you will have the opportunity to lead the development of innovative data solutions, enabling the effective use of data across the organization. You will be responsible for designing, building, and maintaining robust data pipelines and platforms to meet business objectives, focusing on data as a strategic asset. Your role will involve collaboration with cross-functional teams, leveraging cutting-edge technologies, and ensuring scalable, efficient, and secure data engineering practices. A strong emphasis will be placed on expertise in GCP, Vertex AI, and advanced feature engineering techniques.
    * 4+ years of professional Data Development experience.
    * 4+ years of experience with SQL and NoSQL technologies.
    * 3+ years of experience building and maintaining data pipelines and workflows.
    * 5+ years of experience developing with Java.
    * 2+ years of experience developing with Python.
    * 3+ years of experience developing Kafka solutions.
    * 2+ years of experience in feature engineering for machine learning pipelines.
    * Experience with GCP services such as BigQuery, Vertex AI Platform, Cloud Storage, AutoMLOps, and Dataflow.
    * Experience with CI/CD pipelines and processes.
    * Experience with automated unit, integration, and performance testing.
    * Experience with version control software such as Git.
    * Full understanding of ETL and Data Warehousing concepts.
    * Strong understanding of Agile principles (Scrum).
    Additional Qualifications:
    * Knowledge of Structured Streaming (Spark, Kafka, EventHub, or similar technologies).
    * Experience with GitHub SaaS/GitHub Actions.
    * Understanding of Databricks concepts.
    * Experience with PySpark and Spark development.
    Roles & Responsibilities:
    * Provide Technical Leadership: Offer technical leadership to ensure clarity between ongoing projects and facilitate collaboration across teams to solve complex data engineering challenges.
    * Build and Maintain Data Pipelines: Design, build, and maintain scalable, efficient, and reliable data pipelines to support data ingestion, transformation, and integration across diverse sources and destinations, using tools such as Kafka, Databricks, and similar toolsets.
    * Drive Digital Innovation: Leverage innovative technologies and approaches to modernize and extend core data assets, including SQL-based, NoSQL-based, cloud-based, and real-time streaming data platforms.
    * Implement Feature Engineering: Develop and manage feature engineering pipelines for machine learning workflows, utilizing tools like Vertex AI, BigQuery ML, and custom Python libraries.
    * Implement Automated Testing: Design and implement automated unit, integration, and performance testing frameworks to ensure data quality, reliability, and compliance with organizational standards.
    * Optimize Data Workflows: Optimize data workflows for performance, cost efficiency, and scalability across large datasets and complex environments.
    * Mentor Team Members: Mentor team members in data principles, patterns, processes, and practices to promote best practices and improve team capabilities.
    * Draft and Review Documentation: Draft and review architectural diagrams, interface specifications, and other design documents to ensure clear communication of data solutions and technical requirements.
    * Cost/Benefit Analysis: Present opportunities with cost/benefit analysis to leadership, guiding sound architectural decisions for scalable and efficient data solutions.
    * Support data flows for an ML platform in GCP; work with data science teams and understand the ML concepts behind the requirements the data must meet.
    #LI-RJ2
    Salary Range: $93,700-$135,000 a year
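    Feature engineering for ML pipelines on GCP often starts with an aggregation query in BigQuery whose output feeds Vertex AI training or a feature store. A minimal sketch using the BigQuery Python client is below; the project, dataset, table, and column names are placeholders invented for illustration.

```python
from google.cloud import bigquery

# Project, dataset, and table names below are placeholders for illustration.
client = bigquery.Client(project="my-analytics-project")

FEATURE_SQL = """
SELECT
  customer_id,
  COUNT(*)         AS orders_90d,
  AVG(order_total) AS avg_order_total_90d,
  MAX(order_ts)    AS last_order_ts
FROM `my-analytics-project.sales.orders`
WHERE order_ts >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 90 DAY)
GROUP BY customer_id
"""

# Run the query and materialize the engineered features as a DataFrame that could
# then be written to a feature table or consumed by a Vertex AI training job.
features = client.query(FEATURE_SQL).to_dataframe()
print(features.head())
```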
    $93.7k-135k yearly 46d ago
  • Data Engineer Level 3

    ACL Digital

    Data Scientist Job In Cincinnati, OH

    About 84.51: We are a full stack data science company and a wholly owned subsidiary of The * Company. We own 10 Petabytes of data, and collect 35+ Terabytes of new data each week sourced from 62 Million households. As a member of our software engineering team you will use various cutting-edge technologies to develop applications that turn our data into actionable insights used to personalize the customer experience for shoppers at *.
    What you'll do: As a Senior Data Engineer, you are part of the software development team. We develop strategies and solutions to ingest, store, and distribute our big data. Our developers use Big Data technologies including (but not limited to) Hadoop, PySpark, Hive, JSON, and SQL to develop products, tools and software features.
    Responsibilities:
    Take ownership of features and drive them to completion through all phases of the entire 84.51 SDLC. This includes external facing and internal applications as well as process improvement activities such as:
    Participate in design of Big Data platforms and SQL based solutions
    Perform development of Big Data platforms and SQL based solutions
    Perform unit and integration testing
    Partner with senior resources, gaining insights
    Participate in retrospective reviews
    Participate in the estimation process for new work and releases
    Be driven to improve yourself and the way things are done
    Minimum Skills Required:
    Bachelor's degree (typically in Computer Science, Management Information Systems, Mathematics, Business Analytics or another technically strong program), plus 2 years of experience
    Proven Big Data technology development experience including Hadoop, Spark (PySpark), and Hive
    Understanding of Agile Principles (Scrum)
    Experience developing with Python
    Cloud Development (Azure)
    Exposure to VCS (Git, SVN)
    Position Specific Skill Preferences: Experience in the following:
    Experience developing with SQL (Oracle, SQL Server)
    Exposure to NoSQL (Mongo, Cassandra)
    Apache NiFi
    Airflow
    Docker
    Key Responsibilities:
    Innovate, develop, and drive the development and communication of data strategy and roadmaps across the technology organization to support the project portfolio and business strategy
    Drive the development and communication of enterprise standards for data domains and data solutions, focusing on simplified integration and streamlined operational and analytical uses
    Drive digital innovation by leveraging innovative new technologies and approaches to renovate, extend, and transform the existing core data assets, including SQL-based, NoSQL-based, and Cloud-based data platforms
    Define high-level migration plans to address the gaps between the current and future state, typically in sync with the budgeting or other capital planning processes
    Lead the analysis of the technology environment to detect critical deficiencies and recommend solutions for improvement
    Mentor team members in data principles, patterns, processes and practices
    Promote the reuse of data assets, including the management of the data catalog for reference
    Draft and review architectural diagrams, interface specifications and other design documents
    Proactively and holistically lead activities that create deliverables to guide the direction, development, and delivery of technological responses to targeted business outcomes.
    Provide facilitation, analysis, and design tasks required for the development of an enterprise's data and information architecture, focusing on data as an asset for the enterprise.
    Develop target-state guidance (i.e., reusable standards, design patterns, guidelines, individual parts and configurations) to evolve the technical infrastructure related to data and information across the enterprise, including direct collaboration with 84.51.
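    A typical ingestion task in the Hadoop/PySpark/Hive stack this listing describes is landing semi-structured JSON and exposing it as a Hive table for SQL users. A minimal sketch follows; the source path, table name, and columns are hypothetical, and a configured Hive metastore is assumed.

```python
from pyspark.sql import SparkSession, functions as F

# Source path and target table are illustrative placeholders.
RAW_JSON_PATH = "/data/raw/transactions/2024-06-01/*.json"
TARGET_TABLE = "analytics.daily_transactions"

spark = (
    SparkSession.builder
    .appName("json-ingest")
    .enableHiveSupport()   # requires a configured Hive metastore
    .getOrCreate()
)

# Read semi-structured JSON, normalize a few fields, and persist to a Hive table
# partitioned by day so downstream SQL consumers can query it directly.
raw = spark.read.json(RAW_JSON_PATH)

cleaned = (
    raw
    .withColumn("txn_date", F.to_date("txn_ts"))
    .withColumn("amount", F.col("amount").cast("double"))
    .dropDuplicates(["txn_id"])
)

(cleaned.write
    .mode("append")
    .partitionBy("txn_date")
    .saveAsTable(TARGET_TABLE))
```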
    $75k-101k yearly est. 9d ago
  • Azure Data Engineer

    Tata Consulting Services 4.3company rating

    Data Scientist Job In Blue Ash, OH

    * 4+ years of experience in solutioning data pipelines for large enterprise data warehouse applications using Azure Data Factory, Databricks, data ingestion, data transformation/computation, orchestration, reporting & data analytics
    * Knowledge of Azure services such as ADF, Databricks, Event Hub, Delta Lake, Data Lake
    * Strong command of Python, PySpark, SQL & Power BI
    * Experience with GitHub, GitHub Actions, and Jira across the end-to-end data pipeline
    * Strong knowledge of CI/CD pipelines for automated deployments
    * Experience in Terraform for infrastructure provisioning
    * Strong knowledge of design and integration patterns
    * Proficient in technical artifacts, e.g., Application Architecture, Solution Design Documents, etc.
    * Strong analytical and problem-solving skills
    * Experience working with multi-vendor, multi-culture, distributed offshore and onshore development teams in dynamic and complex environments.
    * Experience in Retail is desired, not mandatory.
    * Must have excellent written and verbal communication skills.
    Roles & Responsibilities:
    * Understand the requirements and current-state architecture of the enterprise and create a roadmap for future enhancements accordingly.
    * Lead and mentor the team and resolve technical escalations.
    * Proactively communicate escalated issues to feature teams and product managers.
    * Create the Software Architecture Document, High-Level and Low-Level Design documents, and nonfunctional requirements for the project.
    * Define integration design and security design for the web services and enterprise components involved.
    * Participate in production of detailed functional design documents to match customer requirements.
    * Participate in production of technical specifications for development and integration requirements.
    * Review design documents for services such as the Service Design Document and Service Physical Document, and define service SLAs in nonfunctional requirements.
    * Collect and provide estimates for the requirements.
    * Engage with the client architecture group.
    * Collaborate with internal technology teams and contribute to internal initiatives.
    #LI-RJ2
    Salary Range: $80,000-$90,000 a year
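    Orchestrating Azure Data Factory pipelines from code, as this role involves, commonly means triggering a pipeline run from Python and capturing the run id for monitoring. A minimal sketch using the azure-identity and azure-mgmt-datafactory packages is below; the subscription, resource group, factory, pipeline, and parameter names are placeholders.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

# All names below are placeholders; supply your own subscription and resources.
SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "rg-data-platform"
FACTORY_NAME = "adf-enterprise-dw"
PIPELINE_NAME = "pl_daily_ingest"

credential = DefaultAzureCredential()
adf_client = DataFactoryManagementClient(credential, SUBSCRIPTION_ID)

# Kick off a pipeline run with a runtime parameter, then report the run id
# so a monitoring step (or CI/CD stage) can poll its status.
run = adf_client.pipelines.create_run(
    RESOURCE_GROUP,
    FACTORY_NAME,
    PIPELINE_NAME,
    parameters={"load_date": "2024-06-01"},
)
print(f"Triggered pipeline run: {run.run_id}")
```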
    $80k-90k yearly 51d ago

Learn More About Data Scientist Jobs

How much does a Data Scientist earn in Springdale, OH?

The average data scientist in Springdale, OH earns between $59,000 and $109,000 annually. This compares to the national average data scientist range of $75,000 to $148,000.

Average Data Scientist Salary In Springdale, OH

$80,000

What are the biggest employers of Data Scientists in Springdale, OH?

The biggest employers of Data Scientists in Springdale, OH are:
  1. General Electric