Data Pipeline Trends in Healthcare

The CDC, teaming with other institutes, has developed a prototype cloud-based data pipeline, which automatically processes datasets, such as lab results or case studies.

Written By
David Curry
Aug 19, 2023
State health departments are under significant pressure to ensure that individual records and other health data are up to date and consumable for a wide variety of use cases. A modern approach to building data pipelines can help.

Why? In today’s world, where health records are used for all types of analysis and research, the number of entities accessing these records is at an all-time high. Yet many health departments are still stuck in the past when it comes to collection and storage, with some still taking health information via pen and paper and then manually uploading the data to a database.

Not only does this consume a large amount of resources, but data added this way is often incomplete or lacks standardization. Without coordination across the wide variety of systems involved, duplicated data and delays are also common.

See also: Automated Data Pipelines Make it Easier to Use More Data Sources

The CDC partnered with the United States Digital Service to develop a pilot project for the Virginia Department of Health, with the aim of improving the data pipelines for public health agencies. It is part of a broader effort by the CDC to reduce the amount of manual effort needed to access public health data on a state or local level. 

The team developed a prototype cloud-based data pipeline that can be customized with a set of building blocks that automatically process datasets, such as lab results or case studies. Data is standardized and geocoded, and the pipeline serves as a single source of truth for all incoming data.
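To make the building-blocks idea concrete, here is a minimal, hypothetical sketch of such a pipeline: each block is a function that transforms a record, and blocks are chained in order. The function names, fields, and the stubbed ZIP-code lookup are illustrative assumptions, not details of the CDC prototype.

```python
# Hypothetical "building blocks" pipeline: each block takes a record
# (a dict) and returns a transformed record. Chaining blocks in order
# yields a standardized, geocoded record from raw input.

def standardize(record):
    """Normalize field names and values: trim whitespace, lowercase keys."""
    return {k.strip().lower(): (v.strip() if isinstance(v, str) else v)
            for k, v in record.items()}

def geocode(record):
    """Attach coordinates for a known ZIP code (stubbed lookup table)."""
    zip_to_coords = {"23219": (37.5407, -77.4360)}  # Richmond, VA (example)
    record["coords"] = zip_to_coords.get(record.get("zip"))
    return record

def run_pipeline(record, blocks):
    """Apply each building block in sequence to produce a clean record."""
    for block in blocks:
        record = block(record)
    return record

raw = {" ZIP ": "23219", "Result ": " positive "}
clean = run_pipeline(raw, [standardize, geocode])
```

Because each block has the same record-in, record-out shape, a health department could reorder, swap, or extend blocks (e.g., deduplication, validation) without rewriting the pipeline itself.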

While the system was built with the Virginia health department, it was created for adoption and reuse by other health authorities. It also signals that the CDC is willing to cooperate with and support health agencies' efforts to improve data processing speeds.

The team aims to apply lessons from the Virginia prototype to scale Building Blocks for a wide range of state and local public health services.

David is a technology writer with several years' experience covering all aspects of IoT, from technology to networks to security.

