By Thomas Wolters · CTO, DUX Healthcare · As of 17 April 2026
The BfArM Data Protection Criteria are the instrument with which the Federal Institute for Drugs and Medical Devices operationalises the GDPR for Digital Health Applications. Since 1 August 2024, DiGA manufacturers have been obliged under DiGAV § 4 Abs. 8 to implement the catalogue determined by the BfArM under § 139e Abs. 11 SGB V. The current version 1.0 was published on 24 April 2024 and contains around 150 testable individual requirements across twelve test areas. Unlike the GDPR’s general clauses and unlike the purely cybersecurity-oriented BSI TR-03161, the BfArM criteria are explicitly tailored to the DiGA-specific situation: pseudonymous use, purpose-bound processing under § 4(2) DiGAV, strict third-country rules, separate consents, electronic withdrawal routes, grace period after the end of the prescription. This article walks through the catalogue in detail – with verbatim-cited requirement IDs, from an engineering perspective, for teams looking not for the checkbox but for the architectural decision.
What the BfArM Data Protection Criteria are – and how they differ from the GDPR, DiGAV Annex 1, and BSI TR-03161
The most important precondition for working with the criteria is not to confuse the three instruments standing in parallel.
The GDPR (Regulation (EU) 2016/679) is the European foundation. It defines the protection goals – lawfulness, fairness (in good faith), transparency, purpose limitation, data minimisation, accuracy, storage limitation, integrity, confidentiality, accountability – and it applies to any processing of personal data, including that in a DiGA. Many of its provisions, however, are formulated abstractly; it says that an adequate level of protection must be maintained, but rarely how in the concrete DiGA context.
The BfArM Data Protection Criteria fill precisely that gap. They operationalise the GDPR – plus the relevant DiGAV requirements, plus the BDSG, plus positions from the Datenschutzkonferenz (DSK) – into around 150 testable individual requirements. The legal basis is § 139e Abs. 11 SGB V in conjunction with DiGAV § 4 Abs. 8. The instrument is a criteria catalogue for certification: requirements are formulated in RFC-2119 grammar (MUSS / DARF NICHT / SOLL / KANN), and an independent certification body checks implementation against them. Unlike the GDPR’s general clauses they are concrete enough to be testable in audit.
Annex 1 of the DiGAV is the formal bridge between the criteria and the BfArM application. The manufacturer submits the declaration under DiGAV Anlage 1 with their application – a questionnaire referencing the criteria and forcing the manufacturer to support each requirement with evidence. The questionnaire does not replace the catalogue; it only shows that and how it has been implemented.
BSI TR-03161 is an entirely different instrument. The Technical Guideline of the Federal Office for Information Security certifies the cybersecurity of a DiGA – architecture, cryptography, authentication, network protection, third-party components, backend security. The legal basis is § 139e Abs. 10 SGB V, mandatory since 01.01.2025 and application-relevant since 01.07.2025. The two instruments overlap at some points – encryption, authentication, data storage – but the subject of testing is different in each case: the BfArM criteria test data protection, BSI TR-03161 tests data security. Both certificates are prerequisites for a complete application; a separate deep-dive on BSI TR-03161 covers its own systematics.
The twelve test areas at a glance
The catalogue splits into three parts and twelve chapters. Chapters 2 through 13 contain the criteria; chapter 1 is the glossary (GLSR_1 and GLSR_2), chapter 14 the annexes. The twelve test areas and their requirement-ID prefixes:
| # | Area | Prefix | Thematic core |
|---|---|---|---|
| 2 | Lawfulness | CNST | Consents for the purposes permitted under § 4(2) DiGAV |
| 3 | Processing in good faith | TuG | Fairness, expectability, consistency with T&Cs |
| 4 | Transparency | TPZ | Privacy notice, information duties, children/adolescents |
| 5 | Unlinkability | NVK | Purpose limitation, separated storage per purpose |
| 6 | Data minimisation and storage limitation | DMN | Data frugality, deletion concept, user account, grace period |
| 7 | Intervenability | ITV | Access, erasure, rectification, data portability |
| 8 | Integrity, accuracy, confidentiality | IRV | Quality assurance, log protection, eIDAS level of assurance |
| 9 | Accountability | ACC | Administrative log, audit trail, log effectiveness |
| 10 | Exercise of responsibility | CTRL | Data Protection Officer, privacy by design, breach notification |
| 11 | Processing on behalf, data transmission | AV | Third-country rule, DPA requirements, processor management |
| 12 | DPIA and record of processing activities | DSFA | Threshold analysis, risk assessment, ROPA |
| 13 | Technical and organisational measures | TOM | Encryption, push, privacy by default, backup |
Within each chapter the structure is identical: regulatory bases, scope, criteria, general and specific explanations. The criteria themselves are numbered with a prefix ID (e.g. DMN_4.2), often have several letter sub-requirements (a, b, c, …), and cross-reference one another throughout the catalogue – DMN_4 refers to TOM_2.1, ACC_2 refers to DMN_2, DSFA_1.8 refers to TOM_1.3 and DMN_1.2. Reading the catalogue linearly hides the connections; reading it as a graph reveals the architecture.
Basic language rules – RFC-2119 and the difference between “documented” and “established”
Before the substantive entry two definitions from the glossary are worth highlighting, because they carry the rest of the catalogue.
GLSR_1 sets the keywords. GLSR_1.1: “The keyword ‘MUSS’ corresponds to the keyword ‘MUST’ per [RFC 2119] and marks a requirement as mandatory to implement. The obligation is normative, i.e. non-fulfilment amounts to non-fulfilment of the criterion.” GLSR_1.2 assigns “DARF NICHT / DARF KEIN” analogously to MUST NOT, GLSR_1.3 covers “SOLL / SHOULD” (deviation only with a comprehensible justification), GLSR_1.4 “KANN / MAY”. The result: a single MUSS requirement that is not fulfilled makes the criterion fail. The certification does not know partial grades.
GLSR_2.4 is the sentence teams most often underestimate: “The establishment of a technical-organisational measure or a process comprises all steps of anchoring in lived practice. These include, for example, conception, documentation, introduction, application, and continuous improvement.” Many criteria end with the formulation “MUSS … etabliert haben” – not “dokumentiert haben”. The certifier accordingly checks not only whether a document exists, but whether the process is lived. For ACC_3.4 (log quality review), CTRL_2.2 (release process with a data protection gate), or DSFA_1.8 (at least annual re-assessment), that means: evidence from operations – ticket history, release protocols, review calendars – must be produced.
The third load-bearing definition is GLSR_2.3 (manufacturer in the medical device law sense) versus GLSR_2.10 (controller under Art. 4 Nr. 7 DSGVO). The catalogue uses the two terms strictly separately: “manufacturer” where the requirement is derived from DiGAV/DiPAV (GLSR_2.3), “controller” where it stems from the GDPR. In the regular case both are the same legal person – but the sub-requirements address different roles and must be assigned cleanly.
Organisational criteria – what teams frequently underestimate
The catalogue is roughly a third organisational. The sharpest pitfalls here are not the obviously technical requirements, but the procedural ones.
CTRL_1 – Internal organisation
CTRL_1.1 – confidentiality obligation. The controller MUSS obligate all persons who, through their activity, have access to personal data to confidentiality. That covers developers, ops, support, QA, management – anyone with access to production data. A one-off confidentiality declaration at onboarding suffices, but it must be documented and verifiable per person. For DiGAs this requirement is, per the specific explanation, derived directly from § 4(5) DiGAV.
CTRL_1.2 – Data Protection Officer. The controller MUSS appoint a DPO. With four sub-requirements:
- CTRL_1.2 a: demonstrate professional suitability, provide training opportunities, monitor their use;
- CTRL_1.2 b: the DPO may not perform development or operations tasks that lead to a conflict of interest. A lead developer acting as DPO is, as a rule, inadmissible;
- CTRL_1.2 c: access to all processing operations and documentation; proactive right to propose and remind; any rejection must be justified and documented;
- CTRL_1.2 d: appropriate resources – for internal DPOs a corresponding release of working time.
CTRL_1.4 is where international teams stumble: “To the extent that the controller is not established in the European Union, the requirements of Art. 27 DSGVO MÜSSEN be observed. The representative appointed in writing by the controller MUSS be established in Germany.” The EU representative in Germany – not somewhere in the EU – is a DiGA-specific tightening over Art. 27(3) GDPR (which permits EU-wide establishment) that is almost never considered outside Germany.
CTRL_2 – Privacy by design in the release process
CTRL_2.1 requires an established risk-capture and monitoring process into which residual risks from the DPIA flow. CTRL_2.2 defines the data protection gate in the release process:
- CTRL_2.2 a: features are only assigned to a release after risk assessment.
- CTRL_2.2 b: all TOMs required for a new feature must become active before or with the release.
- CTRL_2.2 c: no release may go live if, after DPIA and risk mitigation, a high risk to the rights and freedoms of natural persons remains.
This is the requirement on which many DiGA release processes founder. In agile reality features are bundled and the data protection check runs in parallel to the release – but CTRL_2.2 requires the check to sit before the assignment to the release. The specific explanation to CTRL_2.2 and CTRL_2.3 goes even further: “Every release of the digital application must be reported to the certification body within the framework of monitoring.” In agile development the certification body is thus drawn into product backlog and sprint planning processes. Working with a certification body that knows agile cadence is a massive advantage here; one that expects a classical waterfall audit will have you rebuilding the process three times.
CTRL_2.3 extends the requirement to hotfixes: even short-notice changes must not weaken existing TOMs.
CTRL_3 – Data breaches under pseudonymity
The catalogue recognises the special case of a DiGA running its users pseudonymously – and therefore, in a data breach, having no communication channel outside the app. CTRL_3.1 requires the standard 72-hour notification process under Art. 33 DSGVO, but CTRL_3.2 addresses the DiGA particularity:
- The controller MUSS, even when processing pseudonymous data only, exhaust the given possibilities of the digital application to inform the data subject.
- CTRL_3.2 c: technical-organisational measures enabling a forced change of login data.
In practice this means: the DiGA needs an in-app notification system plus a forced re-auth flow. That is an architectural decision falling at the start of a project – it cannot be retrofitted three weeks before submission.
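What such a forced re-authentication path can look like is compact enough to sketch. A minimal sketch, assuming a per-account security flag checked on every request – the names (`AccountSecurityState`, `guardSession`, the route string) are illustrative assumptions, not catalogue terms:

```typescript
// Hypothetical account flags for the CTRL_3.2 scenario; field and
// function names are illustrative, not from the catalogue.
interface AccountSecurityState {
  credentialResetRequired: boolean; // set for all affected accounts on a breach
  pendingInAppNotices: string[];    // breach information delivered in-app
}

// Session guard: an account whose login data were force-invalidated can
// reach nothing except the reset flow (CTRL_3.2 c).
function guardSession(state: AccountSecurityState, route: string): "allow" | "force-reset" {
  if (state.credentialResetRequired && route !== "/auth/reset-credentials") {
    return "force-reset"; // redirect into the forced change of login data
  }
  return "allow";
}

// On a breach, operations flags every affected account and queues the
// in-app notice – the only channel to pseudonymous users.
function markBreached(state: AccountSecurityState, noticeText: string): void {
  state.credentialResetRequired = true;
  state.pendingInAppNotices.push(noticeText);
}
```

The design point: the flag lives in the account model from day one, so the breach response is a data change, not an emergency deployment.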
Technical criteria – zero-trust default instead of checkbox
The TOM block (chapter 13) and the related technical criteria are where the data protection criteria tip into infrastructure.
TOM_2 – Integrity and confidentiality
TOM_2.1 codifies pseudonymity as the architectural default:
- “Within the framework of prescription or approval of a digital application, the manufacturer and the controller receive no identifying data of the data subject. This initially given pseudonymous use DARF grundsätzlich NICHT be broken.”
- TOM_2.1 a: if processing of identifying data is required for intended use, TOMs must exclude non-legitimised identification.
- TOM_2.1 b: no storage of identifying data introduced via platform/distribution for purposes other than security.
- TOM_2.1 c: data export for portability/interoperability in principle pseudonymously – except where regulatorily mandatory (e.g. ePA upload with KVNR).
TOM_2.2 requires the cryptography concept. All communication between components (a), storage on the end device (b), storage on background systems (c) MUSS be encrypted. Authentication tokens MÜSSEN be digitally signed, with the application checking their integrity and authenticity (d). Backend services authenticate against one another (e). Certificates and keys MÜSSEN be state of the art (f) – the specific explanation concretely references BSI TR-02102-1.
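TOM_2.2 d in particular has a direct code shape. A minimal sketch of the integrity and authenticity check on a signed token, assuming an HMAC-SHA-256 signature over a JWS-style `header.payload` compact serialisation; key management per BSI TR-02102-1 is out of scope here, and asymmetric signatures work analogously:

```typescript
import { createHmac, timingSafeEqual } from "node:crypto";

// Verifies integrity and authenticity of a signed token before any claim
// inside it is trusted (TOM_2.2 d). Assumes header.payload.signature with
// an HMAC-SHA-256 signature encoded as base64url.
function verifyTokenSignature(token: string, key: Buffer): boolean {
  const [header, payload, signature] = token.split(".");
  if (!header || !payload || !signature) return false;
  const expected = createHmac("sha256", key)
    .update(`${header}.${payload}`)
    .digest("base64url");
  const a = Buffer.from(signature);
  const b = Buffer.from(expected);
  // Constant-time comparison avoids a timing side channel.
  return a.length === b.length && timingSafeEqual(a, b);
}
```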
TOM_2.3 requires a documented authorisation concept plus an anti-brute-force strategy for authentication attempts.
TOM_3 – Interaction and integration
The TOM_3 block is the most interesting from an engineering perspective, because it breaks several product-side habits.
TOM_3.1 – push notifications:
- TOM_3.1 a: “Push notifications DÜRFEN KEINE health data enthalten.”
- TOM_3.1 b: in principle no push for non-intended-use purposes.
- TOM_3.1 c: education about the associated data processing.
- TOM_3.1 d: “Push notifications MÜSSEN be initially deactivated. They MÜSSEN only be activatable via informed consent … confirmed by an explicit, active action.”
The iOS-/Android-typical “We would like to send notifications” at first app start is not sufficient; the education must substantively cover what is transmitted for which purposes. The cloud push service (APNs/FCM) here usually becomes a processing-on-behalf topic – see AV_1.4.
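Sketched as a data model, under the assumption of a simple preferences object (the field names are ours, not the catalogue’s), the TOM_3.1 d default looks like this:

```typescript
// Hypothetical notification-preferences model honouring TOM_3.1 d:
// push is off until an informed, explicit opt-in is recorded.
interface PushPreferences {
  enabled: boolean;              // MUST start false (initially deactivated)
  consentVersion: string | null; // which version of the push education was confirmed
  consentTimestamp: string | null;
}

// Privacy-by-default factory (TOM_5.1): the most privacy-friendly
// setting is the pre-set.
function defaultPushPreferences(): PushPreferences {
  return { enabled: false, consentVersion: null, consentTimestamp: null };
}

// Opt-in only via an explicit, active action after substantive education;
// the consent version then belongs in the administrative log (ACC_1.1 a).
function recordPushOptIn(prefs: PushPreferences, educationVersion: string): PushPreferences {
  return {
    enabled: true,
    consentVersion: educationVersion,
    consentTimestamp: new Date().toISOString(),
  };
}
```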
TOM_3.3 – camera, microphone, location services: The digital application DARF camera, microphone, and/or location services NICHT use, unless this is required for purposes of intended use. TOM_3.3 a requires evidence of necessity, TOM_3.3 b “initially deactivated, active consent” analogously to push.
TOM_3.4 – trust domain: users should move, after authentication, exclusively within the trust domain of the DiGA. Redirects to third-party content must be regularly reviewed beforehand (TOM_3.4 a), leaving the domain must be recognisable (TOM_3.4 b), and before the redirect there MUSS be information (TOM_3.4 c). For teams integrating third-party offerings (knowledge bases, video providers, chatbot platforms) that is an explicit, recurring review process – not a one-off check at release.
TOM_5 – Privacy by default
TOM_5.1 is the central default prescription: “Where system settings of the digital application that can be influenced by the data subject affect the implementation of data protection principles, the scope of personal data processed, or the exercise of data subject rights, the digital application MUSS have the most privacy-friendly system setting pre-set.” And very explicitly: TOM_5.1 c: no “dark patterns” suggesting that a more privacy-friendly setting is not the default.
TOM_5.2: “The digital application MUSS follow the principle of ‘fail-safe defaults’ consistently, i.e. every non-explicitly authorised access attempt MUSS be denied.” That is the zero-trust premise in one sentence.
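A deny-by-default check is short to sketch; the rule set, role names, and resource names below are illustrative assumptions, not catalogue content:

```typescript
// A minimal deny-by-default authorisation check in the spirit of TOM_5.2:
// access is granted only if an explicit allow rule matches; everything
// else – unknown roles, unknown resources, missing rules – is denied.
type Rule = { role: string; resource: string; action: "read" | "write" };

const allowRules: Rule[] = [
  { role: "patient", resource: "own-diary", action: "read" },
  { role: "patient", resource: "own-diary", action: "write" },
  // support staff illustratively get read-only access to audit metadata
  { role: "support", resource: "audit-metadata", action: "read" },
];

function isAllowed(role: string, resource: string, action: "read" | "write"): boolean {
  // Fail-safe default: no matching rule means denial, never a fallback grant.
  return allowRules.some(
    (r) => r.role === role && r.resource === resource && r.action === action
  );
}
```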
TOM_6 – Exceptional situations
TOM_6.1 requires a backup concept explicitly addressing the ransomware scenario (TOM_6.1 a). The specific explanation references the BSI working paper “Maßnahmenkatalog Ransomware”. TOM_6.2 requires a self-service blocking mechanism: in case of loss/theft of the end device or suspected compromise of the access data, the data subject must be able to trigger the blocking of their data against external access themselves.
Data minimisation in practice – DMN_1, DMN_3, and what they mean technically
The DMN block is the area where product decisions most frequently clash with the regulatory framework.
DMN_1 – Necessity and proportionality
DMN_1.1 is the guiding sentence: “The personal data processed via the digital application MÜSSEN be appropriate to the purpose, relevant for achieving the purpose, and limited to what is necessary for the processing purposes.” The manufacturer MUSS be able to justify the contribution of every processed data category and explain that these purposes could not be pursued without these data.
Three sub-requirements with technical sharpness:
- DMN_1.1 a: data minimisation over the lifecycle – data no longer relevant are anonymised or deleted. One-off imports bringing ten data categories where three are used need an automatic cleaning step.
- DMN_1.1 b: “The controller of the digital application DARF über die Anwendung grundsätzlich KEINE Daten erheben that allow, on their own or in interaction, identification of a natural person.” KVNR, real name, address, email addresses with real-name components – all prohibited unless strictly necessary for intended use. The specific explanation makes clear: even “natural” identifiers such as the KVNR should be avoided, as they are not usually separately protected.
- DMN_1.1 c: queries for names must be designed such that the entry only requests a first name or pseudonym.
DMN_1.4 – access permissions to platform resources (camera, GPS, address book, calendar) are only permissible when required for lawful purposes, with special protection for resources that could reveal third parties (DMN_1.4 a).
DMN_3 – Data frugality in the storage format
DMN_3.1 – process data in the most data-sparing format: “age group instead of the exact date of birth”. DMN_3.1 a: IP addresses and device numbers should not be stored; if they are, truncated or masked.
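If an IP address must be retained at all, a truncation step can run before persistence. A sketch – the exact cut (last IPv4 octet, IPv6 interface identifier) is a design decision to document in the minimisation concept, not a catalogue prescription:

```typescript
// Hypothetical masking helper for DMN_3.1 a: store IP addresses only
// truncated. Assumes uncompressed IPv6 notation; normalise the address
// before truncating in real code.
function truncateIp(ip: string): string {
  if (ip.includes(":")) {
    // IPv6: keep the network prefix, drop the interface identifier.
    return ip.split(":").slice(0, 4).join(":") + "::";
  }
  // IPv4: zero the host octet, e.g. 203.0.113.42 -> 203.0.113.0
  const parts = ip.split(".");
  parts[3] = "0";
  return parts.join(".");
}
```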
DMN_3.2: “The controller of the digital application MUSS define criteria for determining the non-identifiability of data subjects and ensure compliance with these criteria.” That is the anchor at which anonymisation vs. pseudonymisation decisions are taken and documented – often underestimated, because “pseudonymised” and “anonymised” are confused colloquially but are legally fundamentally different.
DMN_4 – User account and grace period
DMN_4 is one of the most DiGA-specific blocks. DMN_4.1 defines the pseudonymously usable user account:
- DMN_4.1 a: “User accounts MÜSSEN be usable pseudonymously; identity information of the data subject (health insurance number, real name, address, directly identifying email address etc.) DÜRFEN KEINE Voraussetzung für die Nutzung sein.”
- DMN_4.1 b: access via secure authentication against the factors captured at creation.
- DMN_4.1 c: The activation code MUSS be stored separately from the user account and MUSS be deleted after activation. A hash value of the activation code KANN be part of the user account. Computing the hash value MUSS be done per BSI TR-02102-1.
DMN_4.2 – the grace period:
- “This grace period DARF NICHT last longer than one third of the prescription duration or approval duration – but a maximum of three months.”
- DMN_4.2 a: before expiry, inform the data subject plus obtain separate consent for continued retention of the data; no consent → account and data deleted/blocked per the deletion concept.
- DMN_4.2 b: during the grace period, data are blocked; permissible only: authentication against the account for withdrawing consents, for data export, for contacting the controller, for unlocking on new activation.
- DMN_4.2 c: on a follow-up prescription, the person decides between continued use of the old account and a new one; with a new account the old one is immediately deleted.
Teams managing user accounts independently of the prescription logic rebuild the account-lifecycle logic here from the ground up. In a platform architecture that is a module built once and configured per DiGA; in a custom DiGA it is often weeks of work in the last quarter before submission.
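The arithmetic and the state logic of DMN_4.2 are compact enough to sketch. Two assumptions are ours: the three-month cap is read as 90 days, and the operation names are illustrative:

```typescript
// Sketch of the DMN_4.2 grace-period rule and account states.
type AccountState = "active" | "grace" | "deleted";

function gracePeriodDays(prescriptionDays: number): number {
  const oneThird = Math.floor(prescriptionDays / 3); // at most one third of the prescription duration
  return Math.min(oneThird, 90); // cap of three months, read here as 90 days
}

// During the grace period only the four permitted operations remain
// (DMN_4.2 b); everything else is blocked.
const graceAllowedOps = new Set([
  "withdraw-consent",
  "export-data",
  "contact-controller",
  "reactivate-with-new-code",
]);

function isOperationAllowed(state: AccountState, op: string): boolean {
  if (state === "active") return true;
  if (state === "grace") return graceAllowedOps.has(op);
  return false; // deleted accounts allow nothing
}
```

The point the sketch makes explicit: “grace” is a distinct state with its own allow-list, not an active account with a reminder banner.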
Audit trail and logging – ACC_1, ACC_2, ACC_3 in practice
The ACC block operationalises Art. 5 Abs. 2 DSGVO (accountability) in two separate logs – analogous to the distinction the ePA specification draws between data protection control by the data subject and data protection purposes of the controller.
ACC_1 – Administrative log for the data subject
ACC_1.1 requires a log whose purpose is data protection control by the data subject:
- a) when which consent was given/withdrawn in which version – each version of a consent declaration must be versioned and historised;
- b) all permission grants, changes, and revocations;
- c) all changes to authentication data;
- d) context (devices used) SOLL be captured;
- e) entries MÜSSEN be authentic, timely, protected against manipulation;
- f) the data subject MUSS be able to view the log.
ACC_1.2: before ending application use, download from within the application MUSS be possible – including during the grace period. On termination, the administrative log is deleted.
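The content list in ACC_1.1 maps naturally onto a typed log entry. An illustrative shape – the catalogue prescribes the content, not this data model:

```typescript
// Illustrative administrative-log entry per ACC_1.1.
interface AdminLogEntry {
  timestamp: string; // timely capture (e)
  event:
    | { kind: "consent"; action: "given" | "withdrawn"; purpose: string; version: string } // a) versioned consents
    | { kind: "permission"; action: "granted" | "changed" | "revoked"; name: string }      // b)
    | { kind: "auth-data-change"; field: "password" | "second-factor" };                   // c)
  device?: string;      // context SOLL be captured (d)
  integrityTag: string; // manipulation protection, e.g. hash chain or MAC (e)
}

// Example entry; the consent version ties back to the versioned
// consent declarations required by ACC_1.1 a.
const example: AdminLogEntry = {
  timestamp: new Date().toISOString(),
  event: { kind: "consent", action: "given", purpose: "further-development", version: "v3" },
  device: "iPhone 15, app 2.4.1",
  integrityTag: "hmac:placeholder",
};
```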
ACC_2 – Audit trail for the controller
ACC_2.1 is the internal audit trail, with eight sub-requirements. The practically most relevant:
- a) documentation of accesses, data transfers, data changes, source evidence, blockings, and deletions.
- b) “The audit trail MUSS capture all accesses by administration, operations, and support personnel to personal data, including the audit trail itself, to enable detection of internal data breaches.” – the audit trail itself is subject to its own monitoring.
- d) transmission and storage of log data encrypted, protected against falsification and loss.
- e) access limited to few persons, secured by TOMs, anchored in the role and rights concept.
- g) “The audit trail MUSS be written such that it does not allow behaviour or performance monitoring of employees of the manufacturer or the controller of the digital application.” – a BetrVG-driven criterion that influences logging formats per operation.
- h) for multi-step processing, the audit trail covers only start or completion. Data frugality applies in the log too.
- i) log data MÜSSEN be securely deleted after three months unless required for ongoing investigations.
ACC_2.3: the controller MUSS be able, based on the logs, to verify the proper implementation of their deletion concept and present log excerpts to supervisory authorities on request.
ACC_3 – Effectiveness of logging
The block that often fails.
ACC_3.3: “The controller of the digital application MUSS … implement technical measures in the digital application and in internal IT systems that ensure processing operations subject to logging are only executed when it is ensured that the required log entries can be written. Write accesses that have taken place but cannot be logged MÜSSEN be rolled back. Read accesses that have taken place but cannot be logged MÜSSEN be aborted before the read datum is handed to the requesting system or user.”
That is a transactional rollback requirement on the logging layer – if the logging backend is unreachable, business logic must not continue. Cloud-native architectures with asynchronous logging pipelines must cover this either via circuit breakers + transactional rollback or via synchronous log confirmation before commit. The specific explanation relaxes it only slightly: the check of loggability need not happen before every step, as long as the system checks regularly – but the rollback option in the error case is mandatory.
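Under the assumption of a transactional store, the write path can be sketched in a few lines; `Tx` is a hypothetical handle with begin/commit/rollback semantics, not a concrete driver API:

```typescript
// Sketch of the ACC_3.3 guarantee: the audit entry is written inside the
// same transaction as the data change, so a failed log write rolls the
// change back; a read is aborted before the datum leaves the data layer.
interface Tx {
  write(table: string, row: object): Promise<void>;
  read(table: string, id: string): Promise<object>;
  commit(): Promise<void>;
  rollback(): Promise<void>;
}

async function auditedWrite(tx: Tx, table: string, row: object, logEntry: object): Promise<void> {
  try {
    await tx.write(table, row);
    await tx.write("audit_trail", logEntry); // log write is part of the transaction
    await tx.commit();
  } catch (err) {
    await tx.rollback(); // a write that cannot be logged MUST be rolled back
    throw err;
  }
}

async function auditedRead(tx: Tx, table: string, id: string, logEntry: object): Promise<object> {
  const row = await tx.read(table, id);
  try {
    await tx.write("audit_trail", logEntry); // log before the datum is handed out
    await tx.commit();
  } catch (err) {
    await tx.rollback();
    throw err; // a read access that cannot be logged is aborted (ACC_3.3)
  }
  return row; // only reached if the access could be logged
}
```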
ACC_3.4: “The controller … MUSS establish processes for regular review of the quality of the automated evaluation of logs …” – including “manual sample checks” (SOLL). Automated security monitoring alone is not sufficient; the manufacturer must show that the automated evaluation is cross-checked via manual samples.
Pseudonymisation, transparency, third country – NVK, TPZ, and AV in interplay
Three areas in practice often connected, whose criteria condition one another.
NVK – Unlinkability and purpose limitation
NVK_1.1: “A breach or extension of the lawful processing purposes DARF NICHT occur.” – the hard stop against purpose extension without a new legal basis.
NVK_2.1 through NVK_2.3 contain the rule that hits teams with a single database hardest: storage and processing of data for different purposes MUSS happen technically separately.
- NVK_2.1 [DiGA only]: data for evidence of positive care effects separately from other purposes; SOLL anonymised; strict separation at the processor; after permanent listing (or rejection decision) an obligation to delete.
- NVK_2.2 [DiGA only]: data for determining performance-dependent price components (AbEM context) separate; SOLL anonymised; no transmission of personally identifiable data to GKV-SV or arbitration body.
- NVK_2.3: data for functional capability/usability/further development separate; SOLL anonymised.
“Technically separate” means in audit practice: separate database schemas or separate instances, separate access rights, separate deletion periods. A single table with a purpose flag is typically insufficient.
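What that can look like at configuration level, sketched with illustrative names – the catalogue fixes the separation, not this layout:

```typescript
// Illustrative expression of "technically separate" per NVK_2: one
// schema, database role, and retention policy per processing purpose.
const purposeStores = {
  careEffectEvidence: {     // NVK_2.1, DiGA only
    schema: "evidence",
    dbRole: "evidence_rw",  // own access rights
    retention: "delete-after-permanent-listing",
    anonymise: true,        // SOLL anonymised
  },
  furtherDevelopment: {     // NVK_2.3
    schema: "product_analytics",
    dbRole: "analytics_rw",
    retention: "per-deletion-concept",
    anonymise: true,
  },
} as const;
```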
TPZ – Transparency and information duties
TPZ_1 structures the privacy notice into nine sub-requirements. The frequently overlooked details:
- TPZ_1.4: contact for data protection enquiries in German and English; sub-requirement a demands a binding assurance of a response time (the specific explanation names 2 working days as appropriate); sub-requirement b also requires telephone reachability (8 hours on working days as a guideline).
- TPZ_1.7: an established process for updating the privacy notice – including clear responsibilities and versioning – is part of product development, not downstream.
- TPZ_1.8 and TPZ_1.9: T&Cs and privacy notice can be unilaterally amended by the manufacturer, but cannot take effect without informed, explicitly confirmed acknowledgement of the change – inform 14 days before entry into force; until confirmation the old version applies.
TPZ_4.2: the DiGA MUSS, 14 days before the end of the prescription/approval duration, point out possible data losses and the right to data portability under Art. 20 DSGVO – an automatic in-app notice that must be laid down in the product logic.
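The trigger itself is a one-liner class of logic; a sketch, assuming the prescription end date is known to the app:

```typescript
// Sketch of the TPZ_4.2 trigger: 14 days before the end of the
// prescription/approval duration, surface the in-app notice about
// possible data loss and Art. 20 DSGVO portability.
function shouldShowEndOfPrescriptionNotice(endDate: Date, now: Date = new Date()): boolean {
  const fourteenDaysMs = 14 * 24 * 60 * 60 * 1000;
  const remaining = endDate.getTime() - now.getTime();
  return remaining > 0 && remaining <= fourteenDaysMs;
}
```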
AV – Processing on behalf and third country
AV_1.1 is the sharp DiGA third-country rule: “Any processing of personally identifiable data … MUSS take place exclusively within Germany, in another Member State of the European Union, in a state equivalent under § 35(7) SGB I, or on the basis of an adequacy decision under Artikel 45 DSGVO. This includes personally identifiable inventory, usage, and traffic data.”
AV_1.3 addresses subsidiaries of US-American corporations explicitly (even under EU-US-DPF):
- a) encrypted storage and transmission, key management by the manufacturer in the EU (or an EU trustee).
- b) the service provider must confirm that, in the case of official disclosure demands, it will initially transmit no data – including not to the parent corporation.
- c) pursue and exhaust legal remedies in the case of a disclosure demand.
The specific explanation reasons: “Subsidiaries of US-American companies are de facto not readily able to keep the given assurances of excluding processing of data in a non-permitted third country (see reasoning on the Schrems II judgment).” The BfArM thus draws a line that does not see the EU-US-DPF as sufficient per se.
AV_1.4 requires that for all components of third-party providers used in the digital application – SDKs, analytics libraries, chat widgets, push providers – current documentation is available showing all occasions of data transmission and the places of data processing. Third-party SDK scans therefore belong in the release gate, not in annual audit preparation.
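A register that a release gate can check mechanically might look like this – the shape and the staleness threshold are our assumptions; the content list (transmission occasions, processing places) comes from AV_1.4:

```typescript
// Hypothetical third-party-components register per AV_1.4.
interface ThirdPartyComponent {
  name: string;
  version: string;                 // must match the submitted software version
  transmissionOccasions: string[]; // all occasions of data transmission
  processingLocations: string[];   // places of data processing
  documentationDate: string;       // staleness check lives on this field
}

// Release-gate check (see CTRL_2.2): fail the gate if any component lacks
// documentation newer than a chosen staleness threshold – the threshold
// itself is an assumption, not a catalogue value.
function registerIsCurrent(components: ThirdPartyComponent[], maxAgeDays: number): boolean {
  const cutoff = Date.now() - maxAgeDays * 24 * 60 * 60 * 1000;
  return components.every(
    (c) => new Date(c.documentationDate).getTime() >= cutoff
  );
}
```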
DPIA and record of processing activities – DSFA_1, DSFA_2
The catalogue structures the DPIA in five steps: threshold analysis, risk assessment, TOM definition, residual-risk assessment, and, where applicable, coordination with the BfDI/LDA under Art. 36 DSGVO.
DSFA_1.5 a sets the procedural requirements for the risk assessment: the manufacturer SOLL use a documented procedure recognised by data protection authorities. For a proprietary procedure, (a) German documentation, (b) at least three-level scales for probability of occurrence and severity, (c) at least three-level risk classification must be in place. The specific explanation references the Standard Data Protection Model (SDM) of the DSK as an example.
DSFA_1.5 e SOLL classify risks on a four-level scale (negligible, manageable, substantial, large). DSFA_1.5 f requires risk-matrix documentation with explicit justification for events on the borderline between two risk classes.
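The scale requirements translate into a small classification function. The concrete mapping below is purely illustrative – the manufacturer defines and justifies the actual matrix, including the borderline cases DSFA_1.5 f calls out:

```typescript
// Sketch of a DSFA_1.5-style classification: three-level (minimum) scales
// for probability and severity, mapped onto the four-level risk scale the
// catalogue names.
type Level = 1 | 2 | 3; // e.g. low / medium / high
type RiskClass = "negligible" | "manageable" | "substantial" | "large";

function classifyRisk(probability: Level, severity: Level): RiskClass {
  const score = probability * severity; // 1..9
  if (score <= 2) return "negligible";
  if (score <= 4) return "manageable";
  if (score <= 6) return "substantial";
  return "large"; // borderline events need explicit justification (DSFA_1.5 f)
}
```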
DSFA_1.8 is the process part: “The controller … MUSS have established a process for tracking all identified risks or for regular reassessment of the need for a Data Protection Impact Assessment.” At least annual reassessment – the process the certification body, from the second year on, checks as lived practice and no longer merely as paper behind the certificate.
DSFA_2 is the record of processing activities (ROPA). DSFA_2.1 a: for processing for intended use, it MUSS be set out which features are assigned to which processing activity. For neighbouring processing activities a clear delimitation in the sense of purpose separation MUSS be carried out. DSFA_2.2 carries through the details from Art. 30 Abs. 1 DSGVO – with evidence sharpness: for every processing activity (a) controller/representative/DPO named, (b) purpose, (c) categories of data subjects and data (health data flagged as such), (d) categories of recipients, (e) third-country transfer (yes/no, recipient, data categories), (f) deletion periods, (g) TOMs plus effectiveness check.
Typical BfArM requests for further information – from observation
These are the patterns we see from feedback loops with the certification body and BfArM communication. It is not an official BfArM statistic, but the repetitions are significant enough to serve as a warning catalogue.
“Documented instead of established” (GLSR_2.4). A checklist only built in the weeks before submission shows up in the audit conversation. The DPO cannot produce a review history, the release gate under CTRL_2.2 exists as policy but not as an audit trail in the ticket system.
AV_1.4 incomplete. An SDK or library with telemetry does not appear in the third-party-components register – or the documentation submitted is from a version two years old. The request for further information is rarely “remove the component” but “substantiate or replace”.
TOM_3.1 a – health data in push. The wording of the push notification (“You have three new exercises for your migraine therapy”) contains health information. The correction is conceptual, not cosmetic: push content design must structurally reflect the requirement.
NVK_2 – missing technical separation. A single database with a purpose flag, but no separate access or deletion paths. The request for further information targets architecture, not documentation.
DMN_4.2 – grace period not secured. The app allows further input during the grace period, although DMN_4.2 b demands blocking. The architecture confuses “account exists” with “account active”.
CTRL_1.4 – EU representative not in Germany. A classic for international teams: the Art. 27 DSGVO representative sits in Ireland or the Netherlands, not in Germany.
ACC_3.3 – log backend without rollback. Asynchronous logging pipelines without transactional guarantee fall here; the request for further information is a real infrastructure change, not a policy update.
Interplay with BSI TR-03161 – short delimitation
Because the two instruments are often confused: the BfArM Data Protection Criteria test whether the DiGA operationalises the GDPR and DiGAV data protection requirements. BSI TR-03161 tests the technical cybersecurity of the DiGA (architecture, cryptography, authentication, backend, penetration tests). Both are mandatory, both are delivered per DiGA, neither replaces the other.
Some requirements overlap – both catalogues demand encryption in transit and at rest, both require key management per BSI TR-02102-1, both require authorisation concepts. But the perspective is different: TOM_2.2 asks “are personal data protected?”, the BSI TR-03161 criteria O.Cryp_* ask “does the cryptographic implementation match the state of the art, are there hard-coded keys, is the key exchange secured against MITM?”. In a platform architecture the answers often coincide – but the check steps are separate, and the evidence artefacts point in different directions.
In the DUX portfolio we work through both catalogues in parallel on a shared platform architecture: the mHealth Suite delivers the infrastructure (key management, identities, logging, encryption, vault integration), which is mirrored per DiGA against TOM_2, AV_1, and the BSI O.Cryp_* family. The platform work happens once, the certification per DiGA. For the BSI side we have worked this out in detail in BSI TR-03161 certification.
How DUX handles the catalogue from a practical perspective
DUX Healthcare builds DiGAs on the mHealth Suite. Our DiGAs have gone through the BfArM Data Protection Criteria process per DiGA, because the catalogue is tested per product, not organisation-wide. The platform logic is: the individual requirement IDs – DMN_4, ACC_2, AV_1.3, TOM_2.2, TOM_3.1 – are solved once at platform level in an architecturally appropriate way, and every new DiGA project inherits that solution. What remains per project is the product-specific answers: which purposes under § 4(2) DiGAV are pursued? Which data are appropriate to the purpose (DMN_1.1)? Which third-party components are used (AV_1.4)? Which features belong to which processing activity in the ROPA (DSFA_2.1 a)?
That is the same logic we apply to BSI TR-03161 and the MDR conformity assessment: platform work upstream, DiGA work on top. The difference from a custom build is not the quality of the audit, but the time and cost share to be borne per project. The platform mechanics are available at /build/; for further reading in this knowledge area, see DiGA for adjacent topics – the practical application logic is the most direct neighbour article.
Answers to frequent questions
How often do I have to run through the criteria?
Certification under the BfArM Data Protection Criteria is not a one-off event. At first listing of a DiGA the check is carried out by an independent certification body against the catalogue; the declaration under DiGAV Anlage 1 referencing the criteria is part of the application.
In the lifecycle of the DiGA two mechanisms apply. First: every release of the digital application must, per the specific explanation to CTRL_2.2, be reported to the certification body, including a representation of the change parameters. The certification body decides whether further checks are required – in agile development the certification body is drawn into sprint planning and sprint review. Second: DiGAV § 18 requires notification of substantial modifications to the BfArM – changes with impact on data protection or data security are typically substantial. In practice well-organised teams work with a quarterly rhythm: quarterly re-assessment of risks (DSFA_1.8 requires at least annually) and a re-certification when the scope materially changes. A DiGA without architectural or functional changes gets by with the annual review cadence; an active DiGA sees one to two substantial re-assessments per year.
What is the most common reason for data protection requests for further information?
From our observation: the structural distinction between “documented” and “established” (GLSR_2.4). The catalogue requires, for many criteria, a lived process practice, not merely a policy document. The most common specific points:
First, CTRL_2.2 – the data protection gate in the release process. If the certification body sees release approvals issued without a data protection review, an existing release template is not enough. Second, AV_1.4 – third-party components. The most frequently missing artefact is current third-party documentation matching the submitted software version with substantiated data flows. Third, ACC_3.3 – the transactional logging guarantee: systems with asynchronous logging pipelines that do not roll back on error fall here structurally. Fourth, DMN_4.2 – the grace period: accounts still allowing data input instead of being blocked. Fifth, TOM_3.1 a – health data in push text.
Substantive rejections in the sense of “GDPR violation” are rare; what is actually requested are operational rectifications in the patterns above. The good news: all five points are architectural or procedural decisions that can be made at project start – but no longer three weeks before submission.
DP criteria or BSI TR-03161 – which first?
Neither comes first. Both must run in parallel, and they are critical in different ways at different phases.
In the early architecture phase BSI TR-03161 dominates, because its requirements – cryptography concept, authentication architecture, backend segmentation, key management, vulnerability disclosure – structurally reach into the infrastructure. A DiGA project that seriously looks at TR-03161 in week 20 rebuilds half of it. The BfArM Data Protection Criteria join in the same phase, but their sharpest structural requirements – DMN_4 (pseudonymous account, grace period), TOM_5.1 (privacy by default), NVK_2 (technical purpose separation), TOM_2.1 (pseudonymity as default) – are also architecture topics that must be set at the start.
In the late project phase the BfArM criteria dominate, because many of their requirements – CTRL_1 (DPO, EU representative), CTRL_2 (release gate), TPZ_1 (privacy notice in multiple languages, telephone hotline), DSFA_1.8 (review process), AV_2 (contractual protection of processors) – demand procedural anchoring in the organisation. BSI TR-03161 is by this time largely completed through pentests, code reviews, and the test report.
In practice this means: architecture must know both catalogues from day one – not sequentially, but as a joint constraint. Certification itself can run in parallel in both dimensions in the last 3–4 months before application submission, if the upstream work is in order. Set up sequentially, the times add; set up in parallel on a viable platform, they run in the same phase.
Are the BfArM criteria a GDPR-lite version?
No – and that misconception explains most underestimated reviews. The BfArM Data Protection Criteria are in several places sharper than the GDPR minimum requirements, not more lenient.
Three examples. First, third country: the GDPR permits processing in third countries under an adequacy decision (Art. 45 DSGVO), standard contractual clauses, or other suitable guarantees (Art. 46 DSGVO), or binding corporate rules (Art. 47 DSGVO). AV_1.1 permits for DiGAs only EU/EEA or an adequacy decision – SCCs and BCRs do not suffice. For subsidiaries of US-American corporations AV_1.3 requires additional technical-organisational measures beyond the EU-US-DPF. Second, pseudonymity: the GDPR allows pseudonymisation as a means of data minimisation – but does not require it (Art. 5(1)(c), Art. 25 GDPR only name pseudonymisation as an example of appropriate measures, not as a default). The BfArM catalogue makes pseudonymity the default (DMN_1.1 b, DMN_4.1 a, TOM_2.1); identification is a justifiable exception. Third, consents: DiGAV § 4(2) plus CNST_1.3 require a separate, active consent for further-development purposes – no “coupled” opt-in with other purposes. Art. 7 Abs. 2 DSGVO only requires “clearly distinguishable from other matters”; separate consent per purpose is a DiGA-specific tightening.
Teams going into a DiGA application with “GDPR conformity” systematically miss the sharp points. The catalogue is operationalised, not watered down – it is the combined operationalisation of GDPR, DiGAV, BDSG, and DSK positions, narrower than any one of them taken alone, not a relaxed subset of the GDPR.
Further reading
For teams wanting to follow the tracks laid down here, the knowledge area DiGA has the adjacent deep-dives: the practical guide to BfArM application with a focus on the 26 data blocks under § 2 DiGAV, the BSI TR-03161 certification as a separate cybersecurity instrument, and – once published – DiGAs and US cloud services for the AV_1.3 question in detail, Pseudonymous Use and Account Management for the DMN_4 logic, and Audit Trail and Logging for the ACC criteria at infrastructure level.
For those who want to read the catalogue directly: BfArM Datenschutzprüfkriterien v1.0 (24.04.2024) – 79 pages, free of charge, and a mandatory basis for the application. For those who then also want to operationalise how these criteria carry in a concrete architecture, the DUX platform mechanics are at /build/.
Practical assessment