AI developer innovation strategies

Finding the Freedom to Innovate

Whether you prefer betting on the jockey or the horse, the track and training facility matter.

By Michael Schmanske

TAKEAWAY: If you are an AI developer, you need patient data, feedback and real-world use validation. The Liver Research Network can help with that.

The Liver Research Network, founded in Tbilisi, Georgia, is attempting to create a new type of partnership development platform for AI diagnostics startups. It combines the operational resources and patient access of multiple clinical sites across the region, an untapped but increasingly technologically savvy developing market. Most importantly, the platform offers more than patient training data: it provides freedom to operate, and a way to generate both feedback and use-case validation within the development cycle.

The concept of packaging and reselling patient data is not entirely new. In the past decade, a few companies (IQVIA, Datavant) and institutions (PCORI, Mayo Clinic Platform_Connect) have begun to aggregate, manage and monetize patient medical data, both for classical population health studies and, more recently, for training AI diagnostic agents. These organizations have changed the way startups access test data and have stress-tested regulators' appetite for data sharing. This can cause difficulties, because the bulk of the data these organizations manage is sourced from the United States healthcare system and comes wrapped in U.S. regulatory constraints.

For life science startups, the United States represents the largest market and the gold standard for market validation and access. However, the current regulatory and operational framework in the U.S. is optimized for traditional products such as medical devices and pharmaceuticals, and it creates three distinct and relatively new problems for digital health startups, particularly those that rely on patient data for training or implementation:

  • Data ownership and management — Data must stay on the host's servers; only results can be exported
  • Beta testing & short feedback loops — IRB, device rules and hospital risk posture
  • Proving monetary value & business case validation — Reimbursement, liability and ops load

Data ownership and management
Startups face severe limits on processing power and data manipulation because they must keep the data on the host's servers and can export only the results.

  • Rule Example:
    PhysioNet/MIMIC restrictions (2023–2025): The credentialed DUA explicitly prohibits sending data to third-party LLM APIs (e.g., OpenAI/Google), which effectively forces on-premises or zero-retention setups for model work. (Responsible use of data details: PhysioNet)
  • Why it’s a problem:
    Most U.S. patient data lives inside HIPAA-regulated systems. If a startup wants to process PHI, it either has to work inside a provider’s walled garden (secure enclave/VPC behind the hospital’s firewall) or get a HIPAA Business Associate Agreement (BAA) from every covered entity whose data it will touch. Even then, data often must stay in an enclave and only aggregated, de-identified outputs can leave (after human review). That is great for privacy—but slows iteration, adds cost (review queues, compute inside an enclave) and complicates modern ML workflows (e.g., fine-tuning via external cloud AI APIs). A minimal sketch of the on-premises pattern this pushes teams toward follows below.
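
To make the constraint concrete, here is a minimal sketch of the on-premises/zero-retention pattern a PhysioNet-style DUA pushes model work toward: the model runs locally, so restricted text never crosses to a third-party API. The library, model choice and note text are illustrative assumptions, not anything PhysioNet prescribes.

```python
# Minimal sketch, assuming the Hugging Face `transformers` library and a locally
# cached model. The point is architectural: inference runs on hardware you
# control, so restricted text never reaches a third-party LLM API.
from transformers import pipeline

# Weights are fetched once and cached locally; no per-request external calls.
summarizer = pipeline("summarization", model="t5-small")

def summarize_locally(note_text: str) -> str:
    # All computation happens on the local CPU/GPU inside the approved environment.
    result = summarizer(note_text, max_length=60, min_length=10)
    return result[0]["summary_text"]

if __name__ == "__main__":
    # Synthetic, non-PHI example text (illustrative only).
    print(summarize_locally(
        "Patient with chronic hepatitis C presents with fatigue and elevated ALT. "
        "Ultrasound shows coarse echotexture consistent with early cirrhosis. "
        "Plan: start direct-acting antiviral therapy and schedule surveillance imaging."
    ))
```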

Key Laws/Rules That Shape Behavior

HIPAA Privacy/Security Rule BAAs

Covered entities must have BAAs with vendors that “create, receive, maintain, or transmit” PHI; the BAA spells out allowed uses and safeguards. (Code of Federal Regulations: eCFR.gov)

Secure Enclave/Output Review Controls (examples)

CMS VRDC (Medicare/Medicaid data): Analysis must happen inside CMS’s Virtual Research Data Center; only aggregated outputs can leave, subject to cell suppression (no cells with counts of 1–10) and file-size/output reviews. (Research data request: CMS)
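
To see what the cell-suppression requirement means in practice, here is a minimal sketch assuming the 1–10 count threshold quoted above; the cohort categories and counts are invented for illustration, and this is not CMS's actual review tooling.

```python
# Minimal sketch of suppressing small cells before aggregate results leave an
# enclave, assuming a "no cells with counts of 1-10" rule. Data are invented.
from collections import Counter

cohort_diagnoses = (
    ["cirrhosis"] * 8 + ["hepatocellular_carcinoma"] * 12 +
    ["hepatitis_c"] * 4 + ["rare_metabolic_disorder"] * 1
)

def suppress_small_cells(counts: dict, low: int = 1, high: int = 10) -> dict:
    # Mask any cell whose count falls in the suppression band [low, high].
    return {k: ("suppressed" if low <= v <= high else v) for k, v in counts.items()}

raw_counts = Counter(cohort_diagnoses)           # computed inside the enclave
export_ready = suppress_small_cells(raw_counts)  # what output review would release
print(export_ready)
# {'cirrhosis': 'suppressed', 'hepatocellular_carcinoma': 12,
#  'hepatitis_c': 'suppressed', 'rare_metabolic_disorder': 'suppressed'}
```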

NIH/NCATS N3C (national clinical cohort): analysis happens only in the enclave; data cannot be downloaded or removed; access is tiered and governed by a DUA & review committee. (N3C Enclave FAQs: NIH)

Consumer-health Privacy Outside HIPAA

State laws (e.g., Washington My Health My Data Act) and California CPRA tightly regulate “health” data held by apps/brokers even when HIPAA doesn’t apply—affecting marketing, sharing and sale.

Beta testing & short feedback loops
Pilots that could influence care trigger IRB review, device rules and hospital risk controls, so quick beta tests turn into long, regulated studies.

Failure Example:
MD Anderson & IBM Watson for Oncology (2017): The high-profile project was terminated after spending tens of millions of dollars; audits and press highlighted integration and performance gaps in real-world workflows—an emblematic example of how clinical complexity and governance stall rapid iteration. (Via The Cancer Letter: MD Anderson spent $62M on an IBM Watson AI project; an audit says it sidestepped UT System rules before being scrapped [behind paywall]. See also: $62M AI breakthrough stalls as MD Anderson drops IBM Watson project: Forbes)

Why it’s a problem:
U.S. clinical testing feels like product beta…but it’s actually a regulated clinical investigation when it may influence care. That means IRB review, documented consent or waivers, data monitoring plans and (if your tool is a device) compliance with FDA device study requirements. Hospitals layer malpractice and privacy risk controls on top of that—so “just run a quick pilot next month” often turns into quarters of prep, approvals, and contracting (DUAs, BAAs, indemnities), dulling the rapid build-measure-learn cycles typical in tech.

Key Laws/Rules That Shape Behavior

IRB & Consent

21 CFR Part 50 (informed consent) and Part 56 (IRBs) govern clinical investigations; you’ll need IRB approval or a justified waiver.

Investigational Medical Devices

21 CFR Part 812 (IDEs) applies if your clinical investigation involves a device (including many SaMD/AI tools) and could affect diagnosis/treatment.

ONC HTI-1 (Decision Support Interventions)

Certified health IT with predictive models must expose algorithm transparency artifacts (who built it, data characteristics, performance), which soaks up product and documentation time. (HHS Updates Health IT Certification Standards: Federal Register)
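
For a sense of what those transparency artifacts look like as an engineering deliverable, here is a minimal sketch of a machine-readable, model-card-style record; the field names and values are illustrative assumptions, not the attribute list the ONC rule actually enumerates.

```python
# Minimal sketch of a transparency record a vendor might maintain alongside a
# predictive model. Field names and values are illustrative, not the ONC HTI-1
# source-attribute list; consult the final rule for the required elements.
from dataclasses import dataclass, asdict
import json

@dataclass
class ModelTransparencyCard:
    developer: str               # who built it
    intended_use: str            # the decision the model is meant to support
    training_data_summary: str   # data characteristics: sources, dates, demographics
    performance_summary: str     # headline metrics and the population measured
    known_limitations: str       # where the model should not be relied on

card = ModelTransparencyCard(
    developer="ExampleAI, Inc. (hypothetical)",
    intended_use="Flag suspected liver lesions on abdominal CT for radiologist review",
    training_data_summary="Illustrative: 120k de-identified studies, 2015-2022, three centers",
    performance_summary="Illustrative: AUROC 0.91 on a held-out multi-site test set",
    known_limitations="Not validated on pediatric or non-contrast studies",
)

print(json.dumps(asdict(card), indent=2))  # the artifact a certified EHR could surface
```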

Proving monetary value & business case validation
Reimbursement, liability and ops load

  • Why it’s a problem:
    Even when an AI works, U.S. clinicians and hospitals need a clear economic rationale—revenue, throughput or cost avoidance—and predictable billing. Many imaging AI tools lack direct CPT payment; absent that, adoption hinges on soft ROI (e.g., fewer callbacks, faster reads) that’s hard to budget, and requires change-management effort in clinics already stretched thin.
  • Positive example:
    Intermountain Health (Utah) has institutionalized a “prove-it” investment culture tied to value-based economics. Intermountain’s Healthcare Delivery Institute (HDI) runs clinical best-practice integration and implementation science across the system—i.e., new tools must show measurable outcomes and efficiency gains to justify upscaling.

Intermountain Health: Successful Investments and a Grand Strategy

The Intermountain Health hospital system has made a number of successful clinical investments via its Intermountain Ventures division. But it has also been at the forefront of investing in technology that supports the operational processes of adoption, integration and management.

For example, it stood up Castell, a value-based care services company, to hard-wire economic accountability (quality + total cost) into adoption decisions. On the digital side, Intermountain helped launch Graphite Health, a provider-led marketplace designed to make apps plug-and-play and evaluable across systems, and has publicly framed pilots (e.g., Layer Health AI chart abstraction) around a validation step before deployment to ensure clinical performance for registry reporting. This is use-case validation in service of ROI under value-based care.

For better or worse, in the U.S. there is an increasing regulatory push for additional efficacy monitoring: ACR’s Assess-AI launched as a national registry to track real-world AI performance, signaling buyer expectations for ongoing QA (and giving payers and regulators data to judge value). As a result, startups seeking product approval will increasingly be expected to show real-world results and use-case validation, which suggests a major niche that can be filled by other sources of clinical engagement.

Your success in life isn’t based on your ability to simply change. It is based on your ability to change faster than your competition, customers and business.

The Liver Research Network

Levan Gogichaishvilli, MD, PhD (Dr. G), is a preeminent liver transplant surgeon, head surgical professor at The Tbilisi Medical Academy and the director of surgery at “New Hospitals” in Tbilisi, Georgia. For any number of reasons, there has been a significant rise in hepatitis C infections in Eastern Europe and other countries in the region, and associated conditions such as cirrhosis, diabetes and cancerous liver tumors often follow. To address the growing challenge, Dr. Gogichaishvilli and his team looked at the available solutions and instead decided to seek entrepreneurial partners and develop their own AI diagnostics.

Instead of developing code himself, he leveraged his competitive advantage. While the South Caucasus may not be a region of the world that we currently associate with medical technology, for certain types of testing and trials it is actually an ideal location. Dr. G’s clinicians work with the sort of regulatory and operational freedom rarely seen in Western markets, and they would like to share it with as many outside partners as they have the bandwidth to manage.

The Liver Research Network is currently seeking entrepreneurs and interested digital and artificial intelligence developers who want access to the unique assets that can be brought to bear in less tightly regulated markets. Namely:

  • A large library of portable and protected patient data.
  • Access to real-time clinician and patient feedback during beta testing.
  • Early use-case implementation plus clinical and economic validation in real-world care settings.

The clinical centers within the network are interested in partnering with cutting-edge digital solutions entrepreneurs. The relationship goes beyond training data to include a first-use validation platform for real-world economic and patient-care impact. The centers will also provide valuable real-world data and short turnaround times on clinical usage, user experience and implementation beta testing.

If you are a doctor with an interest in AI Diagnostic development and would like to join the network as a partner—or if you are an entrepreneur seeking turn-key partnership for rapid real-world technology deployment—contact Dr. G at levangogich@gmail.com to see if it’s a fit.

Stay tuned: In the near future Prognosis:Innovation will be publishing a profile on Dr. Gogichaishvilli and his progress.


Michael Schmanske is a 24-year Wall Street veteran with experience on trading desks and at asset managers. He is the co-founder of Prognosis:Innovation as well as the founder of MD.Capital.

#ArtificialIntelligence  |  #DigitalHealth  |  #HealthTech  |  #MedTech  |  #MedicalImaging  |  #Radiology