Data Engineer Jobs in Cincinnati, OH

- 456 Jobs
  • GCP Data Engineer

    Tata Consultancy Services 4.3 company rating

    Data Engineer Job 13 miles from Cincinnati

    Job Type: Full-time | Experience: 10+ years

    As an Advanced Data Engineer, you will lead the development of innovative data solutions, enabling the effective use of data across the organization. You will design, build, and maintain robust data pipelines and platforms to meet business objectives, treating data as a strategic asset. The role involves collaborating with cross-functional teams, leveraging cutting-edge technologies, and ensuring scalable, efficient, and secure data engineering practices. A strong emphasis is placed on expertise in GCP, Vertex AI, and advanced feature engineering techniques.

    Required Qualifications:
    - 4+ years of professional data development experience
    - 4+ years of experience with SQL and NoSQL technologies
    - 3+ years of experience building and maintaining data pipelines and workflows
    - 5+ years of experience developing with Java
    - 2+ years of experience developing with Python
    - 3+ years of experience developing Kafka solutions
    - 2+ years of experience in feature engineering for machine learning pipelines
    - Experience with GCP services such as BigQuery, Vertex AI Platform, Cloud Storage, AutoMLOps, and Dataflow
    - Experience with CI/CD pipelines and processes
    - Experience with automated unit, integration, and performance testing
    - Experience with version control software such as Git
    - Full understanding of ETL and data warehousing concepts
    - Strong understanding of Agile principles (Scrum)

    Additional Qualifications:
    - Knowledge of structured streaming (Spark, Kafka, EventHub, or similar technologies)
    - Experience with GitHub SaaS/GitHub Actions
    - Understanding of Databricks concepts
    - Experience with PySpark and Spark development

    Roles & Responsibilities:
    - Provide Technical Leadership: Offer technical leadership to ensure clarity between ongoing projects and facilitate collaboration across teams to solve complex data engineering challenges.
    - Build and Maintain Data Pipelines: Design, build, and maintain scalable, efficient, and reliable data pipelines to support data ingestion, transformation, and integration across diverse sources and destinations, using tools such as Kafka and Databricks.
    - Drive Digital Innovation: Leverage innovative technologies and approaches to modernize and extend core data assets, including SQL-based, NoSQL-based, cloud-based, and real-time streaming data platforms.
    - Implement Feature Engineering: Develop and manage feature engineering pipelines for machine learning workflows, using tools like Vertex AI, BigQuery ML, and custom Python libraries.
    - Implement Automated Testing: Design and implement automated unit, integration, and performance testing frameworks to ensure data quality, reliability, and compliance with organizational standards.
    - Optimize Data Workflows: Optimize data workflows for performance, cost efficiency, and scalability across large datasets and complex environments.
    - Mentor Team Members: Mentor team members in data principles, patterns, processes, and practices to promote best practices and improve team capabilities.
    - Draft and Review Documentation: Draft and review architectural diagrams, interface specifications, and other design documents to ensure clear communication of data solutions and technical requirements.
    - Cost/Benefit Analysis: Present opportunities with cost/benefit analysis to leadership, guiding sound architectural decisions for scalable and efficient data solutions.
    - Support flows for an ML platform in GCP, working with data science and understanding the ML concepts behind the requirements the data must meet.

    #LI-RJ2
    Salary Range: $93,700-$135,000 a year
    $93.7k-135k yearly 28d ago
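The feature-engineering work described in the posting above (turning raw event data into per-user features for ML pipelines) can be illustrated with a minimal Python sketch. The event schema and feature names here are invented for the example and are not taken from the posting:

```python
from collections import defaultdict

def build_features(events):
    """Aggregate raw events into per-user ML features:
    event count, distinct items seen, and average spend."""
    totals = defaultdict(lambda: {"count": 0, "items": set(), "spend": 0.0})
    for e in events:
        agg = totals[e["user"]]
        agg["count"] += 1
        agg["items"].add(e["item"])
        agg["spend"] += e["amount"]
    return {
        user: {
            "event_count": agg["count"],
            "distinct_items": len(agg["items"]),
            "avg_spend": round(agg["spend"] / agg["count"], 2),
        }
        for user, agg in totals.items()
    }

# Hypothetical raw events from a stream (e.g. a Kafka topic).
events = [
    {"user": "u1", "item": "a", "amount": 10.0},
    {"user": "u1", "item": "b", "amount": 20.0},
    {"user": "u2", "item": "a", "amount": 5.0},
]
features = build_features(events)
print(features["u1"])  # {'event_count': 2, 'distinct_items': 2, 'avg_spend': 15.0}
```

In a production pipeline of the kind the posting describes, the same aggregation would typically run in Dataflow or Spark and land in a feature store such as Vertex AI Feature Store rather than an in-memory dict.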
  • Data Engineer

    LTIMindtree

    Data Engineer Job In Cincinnati, OH

    About Us: LTIMindtree is a global technology consulting and digital solutions company that enables enterprises across industries to reimagine business models, accelerate innovation, and maximize growth by harnessing digital technologies. As a digital transformation partner to more than 700 clients, LTIMindtree brings extensive domain and technology expertise to help drive superior competitive differentiation, customer experiences, and business outcomes in a converging world. Powered by nearly 90,000 talented and entrepreneurial professionals across more than 30 countries, LTIMindtree - a Larsen & Toubro Group company - combines the industry-acclaimed strengths of erstwhile Larsen and Toubro Infotech and Mindtree in solving the most complex business challenges and delivering transformation at scale. For more information, please visit ********************

    Job Title: Data Engineer
    Location: Cincinnati, OH

    Skills Required:
    - Data modeling and domain-driven design
    - Data profiling and general analysis
    - Data platform modernization and data migration
    - Emphasis on data management for OLTP systems, but can help the Data Warehouse team as well

    Technical Requirements:
    - ERD/logical data modeling tools (Erwin or similar)
    - Databases: Postgres, Oracle; physical data modeling and architecture
    - Strong SQL skills, including stored procedures
    - Reading and understanding JSON

    Preferable: Python, Kafka, Enterprise Elasticsearch
    Bonus: Data services, REST APIs, Hazelcast, Java CRUD

    Benefits/perks listed below may vary depending on the nature of your employment with LTIMindtree ("LTIM"):
    Benefits and Perks:
    - Comprehensive medical plan covering medical, dental, and vision
    - Short-term and long-term disability coverage
    - 401(k) plan with company match
    - Life insurance
    - Vacation time, sick leave, paid holidays
    - Paid paternity and maternity leave

    The range displayed on each job posting reflects the minimum and maximum salary target for the position across all US locations. Within the range, individual pay is determined by work location, job level, and additional factors including job-related skills, experience, and relevant education or training. Depending on the position offered, other forms of compensation may be provided as part of overall compensation, such as an annual performance-based bonus, sales incentive pay, and other forms of bonus or variable compensation.

    Disclaimer: The compensation and benefits information provided herein is accurate as of the date of this posting. LTIMindtree is an equal opportunity employer committed to diversity in the workplace. Our employment decisions are made without regard to race, color, creed, religion, sex (including pregnancy, childbirth or related medical conditions), gender identity or expression, national origin, ancestry, age, family-care status, veteran status, marital status, civil union status, domestic partnership status, military service, handicap or disability or history of handicap or disability, genetic information, atypical hereditary cellular or blood trait, union affiliation, affectional or sexual orientation or preference, or any other characteristic protected by applicable federal, state, or local law, except where such considerations are bona fide occupational qualifications permitted by law.

    Safe return to office: In order to comply with LTIMindtree's company COVID-19 vaccine mandate, candidates must be able to provide proof of full vaccination against COVID-19 before or by the date of hire. Alternatively, one may submit a request for reasonable accommodation from LTIMindtree's COVID-19 vaccination mandate for approval, in accordance with applicable state and federal law, by the date of hire. Any request is subject to review through LTIMindtree's applicable processes.
    $75k-101k yearly est. 29d ago
  • Software Engineer

    Brooksource 4.1 company rating

    Data Engineer Job In Cincinnati, OH

    Contract-to-Hire | Hybrid: Cincinnati, Ohio

    We are seeking an experienced Java Developer to design, develop, and maintain backend services supporting internal fraud application systems. The ideal candidate will be self-sufficient in coding, deploying software, and supporting critical systems. They will also collaborate with an agile team, leading development efforts alongside other experienced engineers.

    Responsibilities:
    - Design, develop, and maintain backend services using Java 21 and Spring Boot/Spring Framework.
    - Build and maintain RESTful and SOAP APIs for seamless system integrations.
    - Deploy applications on OpenSystems environments (Windows and/or Linux).
    - Work collaboratively within an agile team to deliver high-quality software solutions.
    - Troubleshoot and resolve issues, ensuring system reliability and performance.
    - Participate in code reviews, architectural discussions, and best practice implementations.

    Must-Have Skills:
    - 5+ years of Java development experience, proficient in Java 21.
    - Strong expertise in Spring Boot and Spring Framework.
    - Excellent knowledge of API development (REST and SOAP).
    - Experience deploying applications on Windows and/or Linux environments.

    Nice-to-Have Skills:
    - Background in banking or financial services.
    - Experience managing vendor systems and platforms.
    - Familiarity with deployment pipelines and OpenShift.

    Preferred Qualifications:
    - Bachelor's degree in Computer Science, Engineering, or a related field.
    - Strong problem-solving and analytical skills.
    - Excellent communication and teamwork abilities.

    About Eight Eleven Group (DBA Brooksource): At Eight Eleven, our business is people. Relationships are at the center of what we do. A successful partnership is only as strong as the relationship built. We're your trusted partner for IT hiring, recruiting and staffing needs. For over 20 years, Eight Eleven has established and maintained relationships that are designed to meet your IT staffing needs. Whether it's contract, contract-to-hire, or permanent placement work, we customize our search based upon your company's unique initiatives, culture and technologies. With our national team of recruiters placed at 28 major hubs around the nation, Eight Eleven finds the people best-suited for your business. When you work with us, we work with you. That's the Eight Eleven promise.

    Eight Eleven Group provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, national origin, age, sex, citizenship, disability, genetic information, gender, sexual orientation, gender identity, marital status, amnesty or status as a covered veteran in accordance with applicable federal, state, and local laws.
    $69k-92k yearly est. 8d ago
  • Software Engineer

    Entegee 4.3 company rating

    Data Engineer Job 50 miles from Cincinnati

    W2 Contract | Onsite in Beavercreek, OH | Software Engineer | $75-80/hr

    *** ACTIVE SECRET CLEARANCE IS A MUST ***

    Requirements:
    - Secret DoD clearance or higher (will hold Top Secret)
    - At least 5 years of C++ experience
    - Embedded platforms experience
    - Must be able to work on-site in Beavercreek, OH

    Equal Opportunity Employer/Veterans/Disabled

    Benefit offerings include medical, dental, vision, term life insurance, short-term disability insurance, additional voluntary benefits, commuter benefits, and a 401K plan. Our program provides employees the flexibility to choose the type of coverage that meets their individual needs. Available paid leave may include Paid Sick Leave, where required by law; any other paid leave required by Federal, State or local law; and Holiday pay upon meeting eligibility criteria.

    Disclaimer: These benefit offerings do not apply to client-recruited jobs and jobs which are direct hire to a client.

    To read our Candidate Privacy Information Statement, which explains how we will use your information, please visit ****************************************************************

    The Company will consider qualified applicants with arrest and conviction records.
    $65k-86k yearly est. 9d ago
  • Medical Electronics and Software Engineering Solutions Lead

    Xcutives Inc.

    Data Engineer Job In Cincinnati, OH

    We are looking for a Medical Electronics and Software Engineering Solutions Lead for Medtech engineering focused on medical devices and software engineering. This leader will be responsible for setting up a Center of Excellence (CoE) for Product Engineering focused on Medtech electronics.

    The scope will include integrated hardware and software elements within medical devices, covering a range of equipment including:
    - Implantable devices
    - Patient monitoring devices
    - Imaging systems
    - Respiratory devices
    - Robotics and other bionic devices

    He/she will contextualize existing offerings and solutions and develop new offerings and solutions for this market segment; build, manage, and grow a team of engineers working on projects, solutions, and the CoE; build a CoE/lab to demonstrate the offerings to customers to enter and build a strong presence in this space; and build a short- and long-term strategy for products and services, driving us to a leadership position.

    Responsibilities:
    - Ensure penetration and build a critical mass in terms of capability and mindshare in the embedded systems space in the Medtech industry, covering hardware and software (application and embedded)
    - Create a short- and long-term strategy
    - Build a medical electronics lab/CoE (US + India)
    - Branding and positioning: identify avenues and execute actions to build a strong Medtech engineering brand
    - Consolidate our product engineering offerings and solutions, contextualize them for the medical device segment, and help develop new offerings and solutions
    - Work closely with the Sales and Presales teams to create offerings that address market opportunities
    - Work with the Presales team to deliver winning solutions and proposals
    - Build and nurture relationships with key customer executives to position our solutions and our brand

    Skill and Experience Requirements:
    - A strong understanding of the Medtech value chain, covering processes including R&D, product development, prototyping, product lifecycle management, and manufacturing
    - Strong knowledge of products for various Medtech sub-segments, for example:
      - Imaging devices: CT scanners, MRIs, and ultrasounds
      - Electronic devices: oxygen concentrators, hemodialysis machines, ICU ventilators, anesthesia machines, and patient monitors
      - Implantable devices: pacemakers and hip prostheses
      - In vitro diagnostic devices: reagents, test kits, and blood glucose meters
      - Software devices: computer-aided diagnostics
      - Other devices: surgical lights, wheelchairs, dermal fillers, and insulin pumps
    - At least 15+ years of experience working in the Medtech industry in product development
    - Hands-on experience in embedded hardware and software development at some point in the career, covering integrated development environments, debug devices and software, emulators, testing software and devices, etc.
    - Experience putting together, nurturing, and growing teams of engineers on these kinds of projects
    - Should have worked for at least one large US-based company as part of the above experience
    - Strong relationship and interpersonal skills; this role has many internal stakeholders, and the successful candidate must be comfortable handling complexity in a diverse environment
    - Demonstrated personal communication skills and executive presence to establish interest, credibility, and trust
    - Excellent written and oral communication skills are a must

    Education Requirements: Master's or Bachelor's degree from a leading college
    $91k-121k yearly est. 11d ago
  • ETL Architect

    Scadea Solutions

    Data Engineer Job In Cincinnati, OH

    Job Title: ETL Architect
    Duration: 18 months
    Years of Experience: 7-10
    Interview Type: Phone screen to hire

    Required Skills:
    • Experience with DataStage and ETL design
    • Requirement gathering: converting business requirements into technical specs and profiles
    • Hands-on work in a minimum of 2 projects with DataStage
    • Understanding of the process of developing an ETL design that supports multiple DataStage developers
    • Ability to create an ETL design framework and related specifications for use by ETL developers
    • Defining standards and best practices of DataStage ETL to be followed by all DataStage developers
    • Understanding of data warehouse and data mart concepts, with implementation experience
    • Ability to review code produced to ensure conformance with the developed ETL framework and design for reuse
    • Preferably experienced user-level competency in IBM's metadata products, DataStage and the InfoSphere product line
    • Ability to design ETL for Oracle, SQL Server, or any database
    • Good analytical skills and process design
    • Ensuring compliance with quality standards and delivery timelines

    Qualifications: Bachelor's degree

    Additional Information - Job Description:
    • Performs highly complex application programming/systems development and support
    • Performs highly complex configuration of business rules and technical parameters of software products
    • Reviews business requirements and develops application design documentation
    • Builds technical components (Maximo objects, TRM rules, Java extensions, etc.) based on detailed design
    • Performs unit testing of components along with completing necessary documentation
    • Supports product test, user acceptance test, etc., as a member of the fix-it team
    • Employs consistent measurement techniques
    • Includes testing in project plans and establishes controls to require adherence to test plans
    • Manages the interrelationships among various projects or work objectives
    $86k-113k yearly est. 26d ago
  • Data Visualization Engineer

    PatientPoint 4.4 company rating

    Data Engineer Job In Cincinnati, OH

    Join PatientPoint to be part of a dynamic team committed to empowering better health. As a leading digital health company, we innovate to positively impact patient behaviors. Our purpose-driven approach offers an inspirational career opportunity where you can contribute to improving health outcomes for millions of patients nationwide.

    Location: Cincinnati

    Job Summary: We are seeking an experienced and visionary Engineer of Data Visualization and Reporting. In this role, you will contribute to developing and delivering data visualization and report objects for internal and external customers. We will also be looking to move the needle on incorporating LLMs into the reporting space. All of this will assist our community in building compelling narratives and actionable insights, enabling stakeholders to extract maximum value from our data assets. This position offers an exciting opportunity to participate in critical, strategic initiatives and shape the future of data visualization and reporting within our organization.

    What You'll Do
    Data Visualization & Report Development:
    - Work with the team on building an effective and consistent UI.
    - Partner with the Data Engineering team during design, build, and testing for certified data sets in dbt and Snowflake.
    - Oversee the design and development of interactive and engaging data visualizations and reports using tools such as Looker, Tableau, Power BI, or custom visualization libraries.
    - Ensure adherence to best practices, including principles of clarity, accuracy, governance, and effectiveness.
    - Work closely with business stakeholders to understand requirements, and develop and deliver insightful, actionable reports that provide valuable insights into key business metrics, trends, and performance indicators.

    Stakeholder Engagement:
    - Contribute to a culture of storytelling through data.
    - Collaborate closely with cross-functional teams, including business leaders, analysts, and data scientists, to understand information needs and deliver impactful insights.
    - Help guide data visualization and reporting best practices.
    - Champion an insight-driven culture within the organization, promoting the value of data visualization and reporting in driving informed decision-making.
    - Drive continuous improvement and governance in the visualization and reporting space.

    What We Need
    - Bachelor's degree in Computer Science, Information Systems, Data Analytics, or a related field
    - 4+ years in a data-focused role
    - Proven experience in developing and delivering within data visualization, reporting, or business intelligence
    - Proficiency in Looker and LookML, in addition to other business intelligence platforms like Tableau and Power BI
    - Experience with dbt and Snowflake
    - Advanced SQL knowledge, including writing complex queries, optimizing performance, and working with large datasets
    - Experience with agile methodologies and project management practices

    Desired Qualifications
    - Knowledge of statistical analysis and data mining techniques is a plus
    - Previous experience working with operations, sales, and customer service teams
    - Healthcare experience is a plus but not required

    What You'll Need to Succeed
    - Strong analytical and problem-solving skills, with the ability to translate complex data into actionable insights
    - Excellent communication, with the ability to effectively convey technical concepts to non-technical audiences
    - Highly organized with excellent attention to detail
    - Ability to work independently and as part of a team
    - Sense of urgency to learn and take on new challenges

    #LI-ED1 #LI-Hybrid

    About PatientPoint: PatientPoint is a leading digital health company that connects patients, healthcare providers and life sciences companies with the right information in the moments care decisions are made. Our solutions are proven to influence patient behavior and improve health outcomes, driving value for all stakeholders. Across the nation's largest network of connected digital devices in 35,000 physician offices, PatientPoint solutions empower better health for more than 750 million patient visits each year.

    Latest News & Innovations: Named 2025 Best Places to Work by Built In! New Orleans Saints partner with PatientPoint to enhance player health and performance. Featured on Built In's "Insights from Top Sales Leaders."

    What We Offer: We know you bring your whole self to work every day, and we are committed to supporting our full-time teammates with a comprehensive range of modernized benefits and cultural perks. We offer competitive compensation, flexible time off to recharge, hybrid work options, mental and emotional wellness resources, a 401K plan, and more. While these benefits are available to full-time team members, we strive to create a positive and supportive environment for all teammates.

    PatientPoint recognizes that privacy is important to you. Please read the PatientPoint privacy policy; we want you to be familiar with how we may collect, use, and disclose your information. Employer is EOE/M/F/D/V
    $79k-111k yearly est. 1d ago
  • ETL Data Engineer

    Insight Global

    Data Engineer Job In Cincinnati, OH

    Insight Global is hiring an ETL Developer for a contract opportunity supporting our largest financial client in the Cincinnati, Ohio area. There is a strong preference to have these individuals hybrid onsite in Cincinnati 4-5x a week at their downtown location. This individual will design, build, test, and maintain databases. They will play an important part on the development team that tests, designs, and debugs the entire database system before it is implemented.

    Primary Responsibilities:
    - Coordinate with the ETL team to implement all ETL procedures for new projects, maintain effective awareness of all production activities according to required standards, and provide support to all existing applications.
    - Perform tests and provide updates to all ETL activities within schedule, and provide support for large data volumes and assist in data processing.
    - Document all technical and system specifications for all ETL processes, perform unit tests on all processes, and prepare required programs and scripts.
    - Analyze and interpret complex data on all target systems, provide resolutions to data issues, and coordinate with data analysts to validate requirements, performing interviews with users and developers.
    - Perform tests and validate all data flows, and prepare all ETL processes according to business requirements, incorporating those requirements into design specifications.
    - Perform root cause analysis on all processes, resolve production issues, validate data, perform routine tests on databases, and provide support to all ETL applications.
    - Develop and perform tests on all ETL code for system data, and design data mapping techniques for all data models in systems.
    - Provide support to all ETL schedules and maintain compliance with them, and develop and maintain standards for ETL code and an effective project life cycle for all ETL processes.

    We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to ******************** . To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy: *************************************************** .

    Skills and Requirements:
    - 5+ years of experience within data engineering and ETL development
    - DataStage
    - SQL: writing complex procedures/views/tables; taking data from a SQL database into a centralized database and adding/writing reporting layers
    - Snowflake
    - Oracle/DB2

    Plusses:
    - Banking industry experience (debits, fraud, or disputes)
    - Java
    - PuTTY
    - Shell scripting
    - Tomcat/Apache
    $75k-101k yearly est. 17d ago
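The core SQL requirement in the posting above (taking data from a source SQL database into a centralized database and adding a reporting layer) can be sketched in a few lines, with SQLite standing in for the client's actual Oracle/DB2 and Snowflake stack; the table names and columns are hypothetical:

```python
import sqlite3

# Source system: raw transactions; target: a centralized reporting layer.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_txn (id INTEGER, account TEXT, amount REAL, status TEXT);
    INSERT INTO raw_txn VALUES
        (1, 'A-100', 250.0, 'posted'),
        (2, 'A-100', -40.0, 'posted'),
        (3, 'B-200', 90.0,  'pending');
    CREATE TABLE rpt_account_balance (
        account TEXT PRIMARY KEY, balance REAL, txn_count INTEGER);
""")

# Transform + load: only posted transactions feed the reporting table.
conn.execute("""
    INSERT INTO rpt_account_balance
    SELECT account, SUM(amount), COUNT(*)
    FROM raw_txn
    WHERE status = 'posted'
    GROUP BY account
""")

row = conn.execute(
    "SELECT balance, txn_count FROM rpt_account_balance WHERE account = 'A-100'"
).fetchone()
print(row)  # (210.0, 2)
```

In the role itself, a tool like DataStage would orchestrate jobs of this shape across many sources, with the same extract-filter-aggregate-load pattern expressed as stages rather than inline SQL.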
  • Data Engineer Level 3

    ACL Digital

    Data Engineer Job In Cincinnati, OH

    This contractor can sit in Chicago OR Cincinnati once the offices reopen.

    About Us: We are a full stack data science company and a wholly owned subsidiary of The * Company. We own 10 petabytes of data and collect 35+ terabytes of new data each week, sourced from 62 million households. As a member of our software engineering team you will use various cutting-edge technologies to develop applications that turn our data into actionable insights used to personalize the customer experience for shoppers at *.

    What you'll do: As a Senior Data Engineer, you are part of the software development team. We develop strategies and solutions to ingest, store, and distribute our big data. Our developers use Big Data technologies including (but not limited to) Hadoop, PySpark, Hive, JSON, and SQL to develop products, tools, and software features.

    Responsibilities: Take ownership of features and drive them to completion through all phases of the entire 84.51 SDLC. This includes external-facing and internal applications as well as process improvement activities such as:
    - Participate in the design of Big Data platforms and SQL-based solutions
    - Perform development of Big Data platforms and SQL-based solutions
    - Perform unit and integration testing
    - Partner with senior resources, gaining insights
    - Participate in retrospective reviews
    - Participate in the estimation process for new work and releases
    - Be driven to improve yourself and the way things are done

    Minimum Skills Required:
    - Bachelor's degree (typically in Computer Science, Management Information Systems, Mathematics, Business Analytics, or another technically strong program), plus 2 years of experience
    - Proven Big Data technology development experience including Hadoop, Spark (PySpark), and Hive
    - Understanding of Agile principles (Scrum)
    - Experience developing with Python
    - Cloud development (Azure)
    - Exposure to version control systems (Git, SVN)

    Position-Specific Skill Preferences:
    - Experience developing with SQL (Oracle, SQL Server)
    - Exposure to NoSQL (Mongo, Cassandra)
    - Apache NiFi
    - Airflow
    - Docker

    Key Responsibilities:
    - Innovate, develop, and drive the development and communication of data strategy and roadmaps across the technology organization to support the project portfolio and business strategy
    - Drive the development and communication of enterprise standards for data domains and data solutions, focusing on simplified integration and streamlined operational and analytical uses
    - Drive digital innovation by leveraging innovative new technologies and approaches to renovate, extend, and transform the existing core data assets, including SQL-based, NoSQL-based, and cloud-based data platforms
    - Define high-level migration plans to address the gaps between the current and future state, typically in sync with budgeting or other capital planning processes
    - Lead the analysis of the technology environment to detect critical deficiencies and recommend solutions for improvement
    - Mentor team members in data principles, patterns, processes, and practices
    - Promote the reuse of data assets, including the management of the data catalog for reference
    - Draft and review architectural diagrams, interface specifications, and other design documents
    - Proactively and holistically lead activities that create deliverables to guide the direction, development, and delivery of technological responses to targeted business outcomes
    - Provide facilitation, analysis, and design tasks required for the development of the enterprise's data and information architecture, focusing on data as an asset for the enterprise
    - Develop target-state guidance (i.e., reusable standards, design patterns, guidelines, individual parts and configurations) to evolve the technical infrastructure related to data and information across the enterprise, including direct collaboration with 84.51
    - Demonstrate the company's core values of respect, honesty, integrity, diversity, inclusion, and safety
    $75k-101k yearly est. 12d ago
  • Senior Data Engineer

    TriHealth HCM Enterprise

    Data Engineer Job In Cincinnati, OH

    Full-time position: 80 hours bi-weekly Day shift The senior data engineer will play a pivotal role in building, managing and optimizing data pipelines and/or datasets and ETL/ELT processes over their life cycles by working closely with a multidisciplinary Agile team. As a Senior Data Engineer you will assemble large, highly complex datasets that meet functional/non-functional business requirements for key stakeholders to ensure the enterprise has seamless access to actionable, meaningful, and well-governed data across all domains 1.Create and maintain optimal data pipeline architecture with a current understanding of emerging technologies aligned to business needs 2. Syncing data between various systems to ensure consistency, integrity, and reliable insights gathering from multiple tools 3. Strong close relationship with data science teams and with business (data) analysts in refining their data requirements for various data and analytics initiatives and their data consumption requirements 4. Provide technical leadership for a team of engineers to create products by removing roadblocks to development through collaboration, communication, and creative solution recommendations 5. Identifying, designing and implementing internal process improvements including re-designing infrastructure for greater scalability, optimizing data delivery, and automating manual processes 6. Working with stakeholders including data, design, product and executive teams and assisting them with data-related technical issues 7. 
Certification Requirements: Epic Clarity data model certification is expected within 1 year of hire date, and Python certification using internal training tools within 2 years of hire date, as well as maintaining your continuing education credits. Job Requirements: Bachelor's Degree in Information Systems, Computer Science, Statistics, Healthcare Information Management, Engineering, or a related field; equivalent relevant experience accepted in lieu of degree. Epic Clarity Data Model Certification. Python Certification. 1. Exposure to relational databases like SQL Server and non-relational databases. 2. Experience with ETL/ELT, data pipelines and/or datasets, and BI tools. 3. Experience with programming languages such as Python, C#, etc. 4. Exposure to Agile methods. 4-5 years of technical experience in Information Systems, Computer Science, Statistics, Healthcare Information Management, Engineering, or a related field. Up to 1 year of clinical healthcare experience. Job Responsibilities: Develop data extracts and analytics solutions for Epic Community Connect projects.
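The data-syncing responsibility in this listing usually comes down to detecting drift between a source and a target system and emitting upserts. A minimal, illustrative Python sketch, with invented field names (`id`, `mrn`), not TriHealth's actual schema:

```python
def diff_for_upsert(source_rows, target_rows, key="id"):
    """Return source rows that are new or have drifted in the target,
    i.e. the rows an upsert step must apply to bring systems in sync."""
    target_by_key = {row[key]: row for row in target_rows}
    return [row for row in source_rows if target_by_key.get(row[key]) != row]

source = [{"id": 1, "mrn": "A100"}, {"id": 2, "mrn": "B200"}]
target = [{"id": 1, "mrn": "A100"}, {"id": 2, "mrn": "B999"}]
changed = diff_for_upsert(source, target)  # only the drifted row (id=2) comes back
```

In practice the comparison would run per-column inside an ETL/ELT tool and write to a staging table, but the consistency check itself is this simple diff.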
    $75k-101k yearly est. 44d ago
  • Sr. Data Engineer

    Tek Ninjas

    Data Engineer Job In Cincinnati, OH

Required Skills: Azure Data Factory, SQL, Hadoop, Shell Scripting. Nice to Have: Snowflake
    $75k-101k yearly est. 60d+ ago
  • Data Engineer III

    Estaffing

    Data Engineer Job In Cincinnati, OH

We are seeking an experienced Data Engineer III. The ideal candidate will be responsible for working with business analysts, data engineers, and upstream teams to understand impacts to data sources; taking those requirements and updating/building ETL data pipelines using DataStage and dbt for ingestion into Financial Crimes applications; and performing testing to ensure the data quality of updated data sources. Job Summary: Handle the design and construction of scalable data management systems, ensure that all data systems meet company requirements, and research new uses for data acquisition. Required to know and understand the ins and outs of the industry, such as data mining practices, algorithms, and how data can be used. Primary Responsibilities: Design, construct, install, test, and maintain data management systems. Build high-performance algorithms, predictive models, and prototypes. Ensure that all systems meet the business/company requirements as well as industry practices. Integrate up-and-coming data management and software engineering technologies into existing data structures. Develop set processes for data mining, data modeling, and data production. Create custom software components and analytics applications. Research new uses for existing data. Employ an array of technological languages and tools to connect systems together. Collaborate with members of your team (e.g., data architects, the IT team, data scientists) on the project's goals. Install/update disaster recovery procedures. Recommend ways to continuously improve data reliability and quality. Qualifications: Local candidates are highly preferred; open to candidates relocating from within the state of Ohio without assistance. Technical degree or related work experience. Experience with non-relational and relational databases (SQL, MySQL, NoSQL, Hadoop, MongoDB, etc.)
Experience programming and/or architecting in a back-end language (Java, J2EE, etc.). Business Intelligence - Data Engineering, ETL, DataStage Developer, SQL. Strong communication skills and the ability to collaborate with members of your team.
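The testing and data-quality duties in this listing often reduce to quarantining bad records before they are loaded downstream. A minimal, hypothetical Python sketch (the required column names are invented for illustration):

```python
def validate_batch(rows, required=("account_id", "amount", "txn_date")):
    """Split an ingested batch into clean rows and rejects, so bad records
    can be quarantined for review instead of silently loaded downstream."""
    clean, rejects = [], []
    for row in rows:
        missing = [col for col in required if row.get(col) in (None, "")]
        (rejects if missing else clean).append(row)
    return clean, rejects

batch = [
    {"account_id": "A1", "amount": 10.0, "txn_date": "2024-01-01"},
    {"account_id": "", "amount": 5.0, "txn_date": "2024-01-02"},  # missing key field
]
clean, rejects = validate_batch(batch)
```

In a DataStage or dbt pipeline the same rule would be expressed as a constraint or a dbt test, but the quarantine pattern is identical.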
    $75k-101k yearly est. 29d ago
  • Data Engineer

    Amend Consulting 4.0company rating

    Data Engineer Job In Cincinnati, OH

at AMEND Consulting About AMEND: AMEND is a management consulting firm based in Cincinnati, OH with areas of focus in operations, analytics, and technology, focused on strengthening the people, processes, and systems in organizations to generate a holistic transformation. Our three-tiered approach provides a distinct competitive edge and allows us to build strong relationships and create customized solutions for every client. We work each day to change lives and transform businesses, and we are constantly striving to make a positive impact on our community! The AMEND team continues to grow at a rapid pace, and our technical team will continue to be an important part of that journey. Overview: The Data Engineer consultant role is an incredibly exciting position in the fastest growing segment of AMEND. You will be working to solve real-world problems by designing cutting-edge analytic solutions while surrounded by a team of world-class talent. You will be entering an environment of explosive growth with ample opportunity for development. We are looking for individuals who can go into a client organization and optimize (or re-design) the company's data architecture; individuals who combine the qualities of a change agent and a technical leader, and who are passionate about transforming companies for the better. We need someone who is a problem solver, a critical thinker, and always wanting to go after new things; you'll never be doing the same thing twice! Job Tasks: Create and maintain optimal data pipeline architecture Assemble large, complex data sets that meet functional / non-functional business requirements Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics Work with stakeholders including the Executive, Product, Data, and Design teams to assist with data-related technical issues and support their data infrastructure needs Define project requirements by identifying project milestones, phases, and deliverables Execute project plan, report progress, identify and resolve problems, and recommend further actions Delegate tasks to appropriate resources as project requirements dictate Design, develop, and deliver audience training and adoption methods and materials Qualifications: Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases. Databricks and DBT experience is a plus Experience building and optimizing data pipelines, architectures, and data sets Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement Strong analytic skills related to working with structured and unstructured datasets Build processes supporting data transformation, data structures, metadata, dependency, and workload management A successful history of manipulating, processing, and extracting value from large, disconnected datasets Ability to interface with multiple other business functions (internally and externally) Desire to build analytical competencies in others within the business Curiosity to ask questions and challenge the status quo Creativity to devise out-of-the-box solutions Ability to travel as needed to meet client requirements What's in it for you? 
Competitive pay and bonus Continued education and individual development plans Unlimited Vacation Full Health, Vision, Dental, and Life Benefits Paid parental leave PTO for your Birthday 3:1 charity match All this to say - we are looking for talented people who are excited to make an impact on our clients. If this job description isn't a perfect match for your skillset, but you are talented, eager to learn, and passionate about our work, please apply! Our recruiting process is centered around you as an individual and finding the best place for you to thrive at AMEND, whether it be with the specific title on this posting or something different. One recruiting conversation with us has the potential to open you up to our entire network of opportunities, so why not give it a shot? We're looking forward to connecting with you. *Applicants must be authorized to work for any employer in the U.S. We are unable to sponsor or take over sponsorship of employment Visa at this time.*
    $70k-91k yearly est. 26d ago
  • Data Engineer

    DMI 3.5company rating

    Data Engineer Job In Cincinnati, OH

DMI is a leading provider of digital services and technology solutions, headquartered in Tysons Corner, VA. With a focus on end-to-end managed IT services, including managed mobility, cloud, cybersecurity, network operations, and application development, DMI supports public sector agencies and commercial enterprises around the globe. Recognized as a Top Workplace, DMI is committed to delivering secure, efficient, and cost-effective solutions that drive measurable results. Learn more at ************* About the Opportunity DMI, LLC is seeking a Data Engineer to help build, optimize, and scale our data pipelines, covering the ingestion, transformation, and analysis of complex telecom data. This role will be instrumental in integrating large-scale invoices, usage records, and cost allocations into our AWS-based analytics platform while ensuring data integrity and performance. Duties and Responsibilities: Develop and maintain scalable ETL/ELT pipelines to process large volumes of telecom and financial data from carriers, APIs, and external sources. Design and optimize data models for structured and semi-structured data in PostgreSQL, Redshift, and other databases. Automate data processing workflows to improve efficiency and accuracy in processing telecom invoices, CDRs (Call Detail Records), and billing data. Ensure data quality and integrity, implementing error handling, validation, and reconciliation techniques. Optimize query performance and database structures to support analytics and reporting needs. Collaborate with cross-functional teams including Product, Engineering, DevOps, and Expense Management to understand data requirements and ensure alignment with business goals. Implement monitoring and logging solutions for data pipelines, ensuring reliability and auditability. Work with cloud-based infrastructure in AWS, utilizing services like Lambda, S3, Glue, Redshift, RDS, and Step Functions.
Integrate machine learning capabilities into the data pipeline to enhance automation and predictive analytics. Qualifications Education and Years of Experience: Bachelor's degree in Computer Science, Data Engineering, Information Systems, or a related field (or equivalent experience). A Master's degree in Data Science, Analytics, or Cloud Computing is a plus but not required. Relevant certifications in AWS (e.g., AWS Certified Data Analytics, AWS Certified Solutions Architect) or database technologies (e.g., PostgreSQL, Redshift, Snowflake) are a plus. 3+ years of experience as a Data Engineer, working with large-scale data processing. Proficiency in Python, SQL, and data transformation frameworks (e.g., dbt, Apache Airflow, or similar). Experience working with AWS cloud services (Lambda, S3, Glue, Redshift, RDS, Step Functions). Strong understanding of ETL/ELT methodologies, data warehousing, and database optimization. Experience with handling large telecom invoices, parsing carrier billing formats, and reconciling financial data is a plus. Experience integrating data from APIs, flat files, XML, JSON, and databases. Knowledge of data governance, compliance, and security best practices. Familiarity with data lake architecture and multi-tenant SaaS platforms. Strong problem-solving and analytical skills, with the ability to debug and optimize complex data processes. Required and Desired Skills/Certifications: Experience with big data technologies such as Spark, Snowflake, or Hadoop. Prior experience working in Telecom Expense Management (TEM) or FinTech. Exposure to DevOps principles, CI/CD pipelines, and infrastructure-as-code tools (Terraform, CloudFormation). Familiarity with BI tools such as Domo, Tableau, or Apache Superset. Knowledge of AI/ML integration for anomaly detection and predictive analytics in telecom data. Min Citizenship Status Required: Must be a U.S. Citizen Physical Requirements: No Physical requirement needed for this position. 
Location: Cincinnati, OH (potential for remote) #LI-EK1 Working at DMI DMI is a diverse, prosperous, and rewarding place to work. Being part of the DMI family means we care about your wellbeing. As such, we offer a variety of perks and benefits that help meet various interests and needs, while still having the opportunity to work directly with a number of our award-winning, Fortune 1000 clients. The following categories make up your DMI wellbeing: Convenience/Concierge - Virtual visits through health insurance, pet insurance, commuter benefits, discount tickets for movies, travel, and many other items to provide convenience. Development - Annual performance management, continuing education, and tuition assistance, internal job opportunities along with career enrichment and advancement to help each employee with their professional and personal development. Financial - Generous 401k matches both pre-tax and post-tax (ROTH) contributions along with financial wellness education, EAP, Life Insurance and Disability help provide financial stability for each DMI employee. Recognition - Great achievements do not go unnoticed by DMI through Annual Awards ceremony, service anniversaries, peer-to-peer acknowledgment, employee referral bonuses. Wellness - Healthcare benefits, Wellness programs, Flu Shots, Biometric screenings, and several other wellness options. Employees are valued for their talents and contributions. We all take pride in helping our customers achieve their goals, which in turn contributes to the overall success of the company. ***************** No Agencies Please ***************** Applicants selected may be subject to a government security investigation and must meet eligibility requirements for access to classified information. US citizenship may be required for some positions.
    $78k-108k yearly est. 12d ago
  • MLOps Engineer

    Tata Consultancy Services 4.3company rating

    Data Engineer Job 13 miles from Cincinnati

Job Type: Full-time Experience: 8+ years A dynamic Senior Software Engineer with an ML focus to lead the integration and operationalization of machine learning models in our Search area. This role requires collaboration with data scientists and leadership teams, and a strong foundation in MLOps methodologies. Experience in diverse ML platforms, including Google Vertex AI and other cloud and open-source technologies, is essential. The candidate will bridge MLOps, data science, and leadership to ensure the smooth functioning of our ML infrastructure. Diverse ML Platform Expertise: ML platforms, including Google Vertex AI and other cloud and open-source frameworks such as TensorFlow, PyTorch, and scikit-learn. Maintain expertise in a range of ML technologies and platforms, with a preference for Google Vertex AI, but open to other systems as needed. Leverage support for open-source frameworks like TensorFlow, PyTorch, and scikit-learn, and integrate them with ML frameworks via custom containers. Stay updated with the latest trends in MLOps and ML technologies. Recommender System Design and Development: Hands-on experience working on recommender systems, drawing from ML techniques such as embedding-based retrieval, reinforcement learning, transformers, and LLMs. Software engineering skills to work with teams integrating the recommender systems into customer-facing products. Experience in A/B testing and iterative optimization using data-driven approaches. Understanding of the infrastructure needed to deploy ML systems (CPU/GPU, networking infrastructure). Feature Store Management: Efficiently manage, share, and reuse machine learning features at scale using Vertex AI Feature Store. Implement feature stores as a central repository for maintaining transparency in ML operations across the organization. Enable feature delivery with endpoint exposure while maintaining authority and security features.
Data Management and Collaboration: Assist as needed with data labeling and management, ensuring high-quality data for ML models. Collaborate with data engineers and data scientists to ensure the integrity and efficiency of data used in ML models. Ensure end-to-end integration for data to AI, including the use of BigTable / BigQuery for executing machine learning models on business intelligence tools. #LI-RJ2 Salary Range - $93,700-$135,000 a year
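The embedding-based retrieval mentioned in this listing can be illustrated with a toy example. All vectors and item IDs below are invented, and a production recommender would use an approximate-nearest-neighbor index rather than this linear scan:

```python
def top_k_by_embedding(query_vec, item_vecs, k=2):
    """Rank catalog items by dot-product similarity to a query embedding:
    the retrieval step of an embedding-based recommender."""
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    scored = sorted(item_vecs.items(), key=lambda kv: dot(query_vec, kv[1]), reverse=True)
    return [item_id for item_id, _ in scored[:k]]

items = {"a": [0.9, 0.1], "b": [0.1, 0.9], "c": [0.5, 0.5]}
best = top_k_by_embedding([1.0, 0.0], items, k=2)
```

On Vertex AI the same step would be served from a vector index, with the embeddings themselves produced by a trained model rather than hand-written lists.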
    $93.7k-135k yearly 28d ago
  • ETL Architect

    Scadea Solutions

    Data Engineer Job In Cincinnati, OH

Job title: ETL Architect DURATION: 18 months YEARS OF EXPERIENCE: 7-10 INTERVIEW TYPE: Phone Screen to Hire REQUIRED SKILLS • Experience with DataStage and ETL design (technical) • Requirement gathering, converting business requirements to technical specs • Hands-on work on a minimum of 2 projects with DataStage • Understand the process of developing an ETL design that supports multiple DataStage developers • Be able to create an ETL design framework and related specifications for use by ETL developers • Define standards and best practices of DataStage ETL to be followed by all DataStage developers • Understanding of data warehouse and data mart concepts and implementation experience • Be able to review code produced to ensure conformance with the developed ETL framework and design for reuse • Preferably experienced user-level competency in IBM's metadata product, DataStage, and the InfoSphere product line • Be able to design ETL for Oracle, SQL Server, or any database • Good analytical skills and process design • Ensuring compliance with quality standards and delivery timelines. Qualifications: Bachelors Additional Information Job Description: Performs highly complex application programming/systems development and support. Performs highly complex configuration of business rules and technical parameters of software products. Reviews business requirements and develops application design documentation. Builds technical components (Maximo objects, TRM Rules, Java extensions, etc.) based on detailed design. Performs unit testing of components along with completing necessary documentation. Supports product test, user acceptance test, etc., as a member of the fix-it team. Employs consistent measurement techniques. Includes testing in project plans and establishes controls to require adherence to test plans. Manages the interrelationships among various projects or work objectives.
    $86k-113k yearly est. 60d+ ago
  • Data Visualization Engineer

    Patientpoint 4.4company rating

    Data Engineer Job In Cincinnati, OH

Join PatientPoint to be part of a dynamic team committed to empowering better health. As a leading digital health company, we innovate to positively impact patient behaviors. Our purpose-driven approach offers an inspirational career opportunity where you can contribute to improving health outcomes for millions of patients nationwide. Location: Cincinnati Job Summary We are seeking an experienced and visionary Engineer of Data Visualization and Reporting. In this role, you will contribute to developing and delivering data visualization and report objects for internal and external customers. We will also be looking to move the needle on incorporating LLMs into the reporting space. All of this will assist our community in building compelling narratives and actionable insights, enabling stakeholders to extract maximum value from our data assets. This position offers an exciting opportunity to participate in critical, strategic initiatives and shape the future of data visualization and reporting within our organization. What You'll Do Data Visualization & Report Development Work with the team on building an effective and consistent UI. Partner with the Data Engineering team during design, build, and testing for certified data sets in dbt and Snowflake. Oversee the design and development of interactive and engaging data visualizations and reports using tools such as Looker, Tableau, Power BI, or custom visualization libraries. Ensure adherence to best practices, including principles of clarity, accuracy, governance, and effectiveness. Work closely with business stakeholders to understand requirements, and develop and deliver insightful, actionable reports that provide valuable insights into key business metrics, trends, and performance indicators. Stakeholder Engagement Contribute to a culture of storytelling through data.
Collaborate closely with cross-functional teams, including business leaders, analysts, and data scientists, to understand information needs and deliver impactful insights. Help guide data visualization and reporting best practices. Champion an insight-driven culture within the organization, promoting the value of data visualization and reporting in driving informed decision-making. Drive continuous improvement and governance in the visualization and reporting space. What We Need Bachelor's degree in Computer Science, Information Systems, Data Analytics, or a related field 4+ years in a data-focused role Proven experience in developing and delivering within data visualization, reporting, or business intelligence. Proficiency in Looker and LookML, in addition to other business intelligence platforms like Tableau and Power BI. Experience with dbt and Snowflake. Advanced SQL knowledge, including writing complex queries, optimizing performance, and working with large datasets. Experience with agile methodologies and project management practices. Desired Qualifications Knowledge of statistical analysis and data mining techniques is a plus. Previous experience working with operations, sales, and customer service teams. Healthcare experience is a plus but not required. What You'll Need to Succeed Strong analytical and problem-solving skills, with the ability to translate complex data into actionable insights. Excellent communication skills, with the ability to effectively convey technical concepts to non-technical audiences. Highly organized with excellent attention to detail. Ability to work independently and as part of a team. Sense of urgency to learn and take on new challenges. #LI-ED1 #LI-Hybrid About PatientPoint: PatientPoint is a leading digital health company that connects patients, healthcare providers and life sciences companies with the right information in the moments care decisions are made.
Our solutions are proven to influence patient behavior and improve health outcomes, driving value for all stakeholders. Across the nation's largest network of connected digital devices in 35,000 physician offices, PatientPoint solutions empower better health for more than 750 million patient visits each year. Latest News & Innovations: Named 2025 Best Places to Work by Built In! New Orleans Saints Partner with PatientPoint to Enhance Player Health & Performance. Featured on Built In's "Insights from Top Sales Leaders." What We Offer: We know you bring your whole self to work every day, and we are committed to supporting our full-time teammates with a comprehensive range of modernized benefits and cultural perks. We offer competitive compensation, flexible time off to recharge, hybrid work options, mental and emotional wellness resources, a 401K plan, and more. While these benefits are available to full-time team members, we strive to create a positive and supportive environment for all teammates. PatientPoint recognizes that privacy is important to you. Please read the PatientPoint privacy policy to become familiar with how we may collect, use, and disclose your information. Employer is EOE/M/F/D/V
    $79k-111k yearly est. 4d ago
  • Sr Data Engineer

    ACL Digital

    Data Engineer Job In Cincinnati, OH

Title: Senior Data Engineer Duration: 12 Months with Extension Accountable for developing and delivering technological responses to targeted business outcomes. Analyze, design, and develop enterprise data and information architecture deliverables, focusing on data as an asset for the enterprise. Understand and follow reusable standards, design patterns, guidelines, and configurations to deliver valuable data and information across the enterprise, including direct collaboration with 84.51, where needed. Demonstrate the company's core values of respect, honesty, integrity, diversity, inclusion and safety. Minimum Position Qualifications * 4+ years of experience in data development and principles, including end-to-end design patterns * 4+ years with a proven track record of delivering large-scale, high-quality operational or analytical data systems * 4+ years of successful and applicable experience building complex data solutions that have been successfully delivered to customers * Experience in a minimum of two of the following technical disciplines: data warehousing, big data management, analytics development, data science, application programming interfaces (APIs), data integration, cloud, servers and storage, and database management * Excellent oral/written communication skills Essential Job Functions: * Utilize enterprise standards for data domains and data solutions, focusing on simplified integration and streamlined operational and analytical uses * Ensure there is clarity between ongoing projects, escalating when necessary, including direct collaboration with 84.51 * Leverage innovative new technologies and approaches to renovate, extend, and transform the existing core data assets, including SQL-based, NoSQL-based, and Cloud-based data platforms * Define high-level migration plans to address the gaps between the current and future state * Contribute to the development of cost/benefit analyses for leadership to shape sound architectural decisions * Analyze technology
environments to detect critical deficiencies and recommend solutions for improvement * Promote the reuse of data assets, including the management of the data catalog for reference * Draft architectural diagrams, interface specifications and other design documents
    $75k-101k yearly est. 12d ago
  • Sr. Data Engineer

    Tek Ninjas

    Data Engineer Job In Cincinnati, OH

Role overview/Day to Day (in your words, not pasted JD): • PySpark and cloud experience; scripting Snowflake is a bonus, and migrating from Netezza to Snowflake is a bigger bonus. At least 5 years of experience. They have an insights platform called On-Demand, which came from a company called Market 6 that they acquired; it is a supply chain inventory data platform. It sits on top of a Netezza backend database (very similar to Oracle), and they are trying to move off of that legacy hardware. They are migrating it to a hybrid data pipeline solution, on-prem and in the cloud. The data mapping and ETL logic currently live in the Netezza world and are being rewritten in on-prem PySpark. Curate the data and push it up to Azure, streamed or synced before Snowflake; they will use Snowflake to replace Netezza. In short: ETL from Netezza to on-prem with PySpark, then pushing up to Snowflake. Fact data and dimension data: any type of supply chain data, such as how much is in warehouses vs. in stores, so end users can make better decisions about how much product to order. Multiple data pipelines come into Snowflake, with Snowflake working out of Columbus, then going from Cincy to Columbus or Cleveland to Columbus, integrating science from other teams. Experience with Databricks and pipelines (Databricks and ADF). Nice to have: an understanding of data warehouse concepts. Required Skills: Azure, Snowflake, SQL
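The curation step sketched in these notes (aggregate raw inventory data into fact rows before pushing to Snowflake) might look like the following pure-Python sketch. In the actual role this would be a PySpark job, and every field name here is hypothetical:

```python
def curate_inventory_fact(rows):
    """Aggregate raw inventory rows into a fact-table shape: total quantity
    on hand per (product, location type), e.g. warehouse vs. store."""
    totals = {}
    for r in rows:
        key = (r["product_id"], r["location_type"])
        totals[key] = totals.get(key, 0) + r["qty_on_hand"]
    return [
        {"product_id": p, "location_type": lt, "qty_on_hand": q}
        for (p, lt), q in sorted(totals.items())
    ]

raw = [
    {"product_id": "P1", "location_type": "store", "qty_on_hand": 5},
    {"product_id": "P1", "location_type": "store", "qty_on_hand": 3},
    {"product_id": "P1", "location_type": "warehouse", "qty_on_hand": 20},
]
facts = curate_inventory_fact(raw)
```

In PySpark the same logic is a `groupBy("product_id", "location_type").sum("qty_on_hand")` over the Netezza extract, with the result written to a Snowflake stage.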
    $75k-101k yearly est. 60d+ ago
  • Data Engineer Level 3

    ACL Digital

    Data Engineer Job In Cincinnati, OH

    Data Engineer Duration: Long Term About Us: As a member of our software engineering team you will use various cutting-edge technologies to develop applications that turn our data into actionable insights used to personalize the customer experience. As a Senior Data Engineer, you are part of the software development team. We develop strategies and solutions to ingest, store, and distribute our big data. Our developers use Big Data technologies including (but not limited to) Hadoop, PySpark, Hive, JSON, and SQL to develop products, tools and software features. Responsibilities: Take ownership of features and drive them to completion through all phases of SDLC. This includes external facing and internal applications as well as process improvement activities such as: * Participate in design of Big Data platforms and SQL based solutions * Perform development of Big Data platforms and SQL based solutions * Perform unit and integration testing * Partner with senior resources, gaining insights * Participate in retrospective reviews * Participate in the estimation process for new work and releases * Be driven to improve yourself and the way things are done Minimum Skills Required: * Bachelor's degree (typically in Computer Science, Management Information Systems, Mathematics, Business Analytics or another technically strong program), plus 2 years of experience * Proven Big Data technology development experience including Hadoop, Spark (PySpark), and Hive * Understanding of Agile Principles (Scrum) * Experience developing with Python * Cloud Development (Azure) * Exposure to VCS (Git, SVN) Position Specific Skill Preferences: * Experience developing with SQL (Oracle, SQL Server) * Exposure to NoSQL (Mongo, Cassandra) * Apache NiFi * Airflow * Docker
    $75k-101k yearly est. 12d ago

Learn More About Data Engineer Jobs

How much does a Data Engineer earn in Cincinnati, OH?

The average data engineer in Cincinnati, OH earns between $66,000 and $115,000 annually. This compares to the national average data engineer range of $80,000 to $149,000.

Average Data Engineer Salary In Cincinnati, OH

$87,000

What are the biggest employers of Data Engineers in Cincinnati, OH?

The biggest employers of Data Engineers in Cincinnati, OH are:
  1. Concentrix
  2. PwC
  3. ACL Digital
  4. ETEK International
  5. PatientPoint
  6. Everest Holdings LLC
  7. Molina Healthcare
  8. ADM
  9. Insight Global
  10. Tek Ninjas