LEM Platform Requirements for xRED Integrations
| Field | Value |
|---|---|
| Type | Research |
| Status | Active |
| Author | xRED Dev Team |
| Created | 2026-04-07 |
| Source | ORCA Offerings (feature/initial-updates-for-collaboration-with-xRED-vertical branch) |
| Related ADR | ADR-005 |
1. Overview
The LEM (Lab Execution & Management) team governs all integrations with the Revvity Signals Platform at Roche. Any xRED integration that touches Signals — whether reading data via External Lists/Tables, writing back via the API, or launching External Actions — must comply with LEM’s requirements.
This page consolidates the mandatory requirements from the LEM team’s official offerings documentation. It should be treated as a checklist for every new xRED integration.
2. Mandatory Delivery Checklist
Every integration, regardless of type, must deliver all of the following before going to production:
| # | Deliverable | Description |
|---|---|---|
| 1 | Documented Requirements | Clearly defined use cases with detailed Acceptance Criteria |
| 2 | Test Scenarios | Comprehensive test scenarios covering functional and non-functional requirements |
| 3 | Test Report | Formal test report with test matrix mapping requirements → test cases → results |
| 4 | Verification Procedure | Documented procedure verifying the delivered software performs its intended functions |
| 5 | Operations and Support Plan | Monitoring, maintenance, error handling routes, escalation procedures, delegations, comms plan for LEM Support |
| 6 | Deployment Configuration | Documentation of all deployment configs, parameters, environmental dependencies |
| 7 | Technical Operating Procedure (TOP) | End-to-end workflow from requirements through design, development, testing, deployment, and operation |
| 8 | Information Security | Adherence to Roche information security standards |
| 9 | Solution Risk Assessment (SRA) | Valid and approved SRA per Roche guidelines |
| 10 | Credential Management | Secure management of all tokens, API keys, certificates |
| 11 | Change Management | Processes for modifications to the integration |
| 12 | Data Privacy & Compliance | Data privacy, integrity, and regulatory compliance |
3. API Access Rules
All API traffic must go through DataRiver
The LEM team’s strong expectation is that all programmatic access to Signals should be routed through the DataRiver Signals Facade on MuleSoft. Integrations that bypass this will face significant governance friction during review. See ADR-005 for the two access routes and when to use each.
Authentication
| Integration Type | Auth Model | Token Type |
|---|---|---|
| External Lists / Tables / Materials | Basic Auth + client certificate | Single header, no user context |
| External Actions (write-back) | OAuth2 User Bearer Token | Signals-issued, requires user consent |
| Background jobs / ETL | OAuth2 Bearer Token or API Key | API keys are scarce (very limited per tenant) |
Key constraints:
- Client ID registration on the Signals tenant is required for each integration application to obtain Signals-issued User Bearer Tokens
- Users must provide consent in the Signals UI before a token is issued (one-time per integration per user)
- API keys do not support use-case differentiation — OAuth2 Bearer Tokens are strongly preferred
Rate Limits
- 1,000 API calls per minute per tenant — shared across ALL consumers
- Design for this limit: cache aggressively, batch where possible, use facade operations that optimise call counts
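Because the 1,000 calls/minute budget is shared across all consumers on the tenant, a client-side throttle is a sensible defensive measure. The sketch below is a minimal token-bucket limiter; the per-integration rate is an assumption — the actual share of the tenant budget should be agreed with the LEM team.

```python
import threading
import time


class TokenBucket:
    """Client-side throttle for a share of the tenant-wide 1,000 calls/min.

    `rate_per_min` is an assumption: each integration should negotiate its
    slice of the shared budget with the LEM team.
    """

    def __init__(self, rate_per_min: float, burst: int = 10):
        self.rate = rate_per_min / 60.0  # tokens replenished per second
        self.capacity = burst
        self.tokens = float(burst)
        self.last = time.monotonic()
        self.lock = threading.Lock()

    def acquire(self) -> None:
        """Block until a call slot is available, then consume one token."""
        while True:
            with self.lock:
                now = time.monotonic()
                self.tokens = min(self.capacity,
                                  self.tokens + (now - self.last) * self.rate)
                self.last = now
                if self.tokens >= 1:
                    self.tokens -= 1
                    return
                wait = (1 - self.tokens) / self.rate
            time.sleep(wait)


# e.g. reserve 100 of the tenant's 1,000 calls/min for this integration
bucket = TokenBucket(rate_per_min=100)
bucket.acquire()  # call before every Signals API request
```

Calling `acquire()` before each request smooths traffic and keeps the integration a good citizen on the shared tenant.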
Audit
- API audit logs capture modifications only, not read operations
- Design integrations with this in mind for compliance tracing
4. Per-Offering Requirements
External Lists
| Constraint | Value |
|---|---|
| Max items per list | 20,000 |
| Max lists per tenant | 200 |
| Response timeout | 30 seconds |
| Refresh model | Scheduled (hourly/daily), snapshot-based |
| Pagination | Offset/limit via request headers (limit, offset, page[limit], page[offset]) |
| User context | Not available (backend-to-backend, no user identity passed) |
| Endpoint hosting | Must be on MuleSoft API Exchange |
Data format:
```json
{
  "data": [
    {"field1": "abc", "field2": 101}
  ],
  "paging": {
    "total": 100
  }
}
```
The paging object can be omitted if all data fits in a single response.
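A minimal sketch of how an External List endpoint might assemble one page in this format, assuming the `limit`/`offset` values have already been read from the request headers (`external_list_page` is a hypothetical helper, not part of any Signals SDK):

```python
def external_list_page(items: list, offset: int = 0, limit: int = 100) -> dict:
    """Build one page of an External List response.

    Signals passes limit/offset (or page[limit]/page[offset]) via request
    headers; the hosting endpoint maps them to these parameters.
    """
    # Clamp limit defensively; a list may never exceed 20,000 items anyway.
    limit = max(1, min(limit, 20_000))
    page = items[offset:offset + limit]
    body = {"data": page}
    # paging.total may be omitted when everything fits in a single response.
    if offset > 0 or len(page) < len(items):
        body["paging"] = {"total": len(items)}
    return body
```

Keeping the page-building logic in one function makes it easy to unit-test the pagination contract against the 30-second response timeout budget.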
External Tables (Data Sources)
| Constraint | Value |
|---|---|
| Max columns | 20 |
| Max rows | 2,000 |
| Response timeout | 20 seconds |
| Operations | Single-row CRUD (GET by ID, POST, PATCH, DELETE) |
| User context | User ID available as query parameter |
| Endpoint hosting | Must be on MuleSoft API Exchange |
Key limitation: Signals sends only data available directly within the table. Attached files or chemical drawings are passed as Signals IDs only — the external system must resolve them via the Signals API if needed.
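One way to handle this limitation is to separate the plain table values from the Signals IDs that need a follow-up lookup through the DataRiver facade. The field names and helper below are purely illustrative assumptions, not part of the Signals payload contract:

```python
# Hypothetical: which columns of this particular table carry Signals IDs
# (attached files, chemical drawings) rather than literal values.
ATTACHMENT_FIELDS = {"attachedFile", "chemicalDrawing"}


def split_row(row: dict) -> tuple:
    """Separate literal cell values from Signals IDs needing resolution.

    The IDs in the second dict must be resolved via the Signals API
    (through the DataRiver facade) if the external system needs the
    underlying file or drawing content.
    """
    plain = {k: v for k, v in row.items() if k not in ATTACHMENT_FIELDS}
    to_resolve = {k: v for k, v in row.items() if k in ATTACHMENT_FIELDS}
    return plain, to_resolve
```

Deferring the resolution step also helps the endpoint stay within the 20-second response timeout.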
External Materials (Chemistry Data Sources)
| Constraint | Value |
|---|---|
| Response timeout | 20 seconds |
| Search model | Exact match on a single attribute |
| User context | User ID available as query parameter |
| Endpoint hosting | Must be on MuleSoft API Exchange |
LEM recommendation: For anything beyond simple single-ID lookup, use an External Action instead — it provides a richer UI and more flexible search.
External Actions
| Constraint | Value |
|---|---|
| Hosting | Your team develops, hosts, secures, and maintains the web app |
| Auth | Dual OAuth — Roche SSO (PingFederate) + Signals OAuth2 Bearer Token |
| Write-back | Must go through DataRiver Signals Facade |
| Rendering | iFrame or new browser tab |
| iFrame messages | SIGCLOSE, SIGCONTINUE, SIGABORT via window.parent.postMessage |
iFrame caveat: iFrames prevent reuse of parent Signals window cookies (Same-Origin Policy, third-party cookie restrictions). This breaks SSO flows that rely on PingFederate cookies. New browser tab is recommended if the app needs cookie-based SSO.
POST payload: Signals sends JSON context about the triggering entity:
- Entity: id, type, name, description, timestamps
- relationships.createdBy, relationships.editedBy
- relationships.ancestors (parent entities in the hierarchy)

Supported entity types: experiment, worksheet, materialsTable, plateContainer, sample, viewonly.
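A sketch of how an External Action backend might parse this context. The envelope (a `data` wrapper) and exact attribute placement are assumptions inferred from the field list above — verify against a real payload from your tenant before relying on them:

```python
import json

SUPPORTED_TYPES = {"experiment", "worksheet", "materialsTable",
                   "plateContainer", "sample", "viewonly"}


def parse_action_context(raw: str) -> dict:
    """Extract the triggering entity from an External Action POST body.

    Assumes the entity object sits under a top-level "data" key; adjust
    to match the actual payload shape observed in your tenant.
    """
    entity = json.loads(raw)["data"]
    if entity["type"] not in SUPPORTED_TYPES:
        raise ValueError(f"unsupported entity type: {entity['type']}")
    rel = entity.get("relationships", {})
    return {
        "id": entity["id"],
        "type": entity["type"],
        "name": entity.get("name"),
        "created_by": rel.get("createdBy"),
        "ancestors": rel.get("ancestors", []),
    }
```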
External Notifications
| Constraint | Value |
|---|---|
| Webhooks per tenant | 1 (single webhook URL) |
| Payload | Metadata only (entity ID, event type) — no full data |
| Events | Entity creation, signing, data export |
| Status | Not in active widespread use by LEM; requires LEM approval |
If multiple systems need notifications, a fan-out/broadcast service is required.
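A minimal fan-out sketch: the single registered webhook receives the metadata-only event and rebroadcasts it to each downstream consumer. The subscriber list and the injected `post` callable are illustrative assumptions — in practice the registry would live in configuration and `post` would be your HTTP client with a timeout:

```python
import json

# Hypothetical downstream consumers of Signals notifications.
SUBSCRIBERS = [
    "https://xred-etl.example/hook",
    "https://xred-audit.example/hook",
]


def fan_out(event: dict, post) -> list:
    """Rebroadcast a notification (entity ID + event type only) to every
    registered consumer; collect failures rather than aborting the loop.

    `post(url, body)` should raise on delivery failure. Failed URLs are
    returned so the caller can retry or queue them.
    """
    failed = []
    body = json.dumps(event).encode()
    for url in SUBSCRIBERS:
        try:
            post(url, body)
        except Exception:
            failed.append(url)
    return failed
```

Because the webhook carries no payload data, each consumer still needs to fetch the full entity via the DataRiver facade, which counts against the shared rate limit.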
5. Integration Design Questionnaire
The LEM team requires answers to the following questions during design review. This questionnaire should be completed before implementation begins.
Users
- What are the relevant personas?
- What is their user journey?
- How many users per persona? Daily active users?
- How does each persona interact with Signals?
- How will user feedback be collected post-launch?
- How do you measure success?
Data
- What data types are involved?
- Where is the source of truth for each?
- Who governs each data type?
- What are the expected data flows (source → target)?
- How frequently does the data model change?
- What are the specific information objects exchanged (attribute level)?
- Data security/sharing limitations?
- Expected frequency of data exchange?
- How will data quality and validation be ensured?
- What is the data classification for each type?
Systems
- What systems are involved and their roles?
- What is each system’s lifecycle state (Plan, Phase In, Active, Phase Out, End of Life)?
- What integration ports exist (APIs, message streams)?
- What protocols and data exchange standards (HTTP/JSON, gRPC, etc.)?
- How are integration ports secured?
- Where are ports accessible from (RCN only, external)?
- What are the key limits of each port?
- What is the process for getting access?
- How will sensitive data be protected in transit and at rest?
- Is user context relevant? How is it provided (OAuth token, query param)?
- Escalation process for system failures?
- How will system changes be communicated across teams?
- Who monitors and maintains the integration post go-live?
- What are the key test scenarios?
Use Case Classification
Check all that apply:
- External terminologies (e.g. list of cost centres)
- Imports — simple lookup table (e.g. barcode scanning)
- Imports — complex (e.g. importing compounds into chemical reactions)
- External requests (e.g. making analytical requests)
- External registration (e.g. registering chemical samples)
- Pushing from external systems (e.g. instruments push data into experiments)
- Pulling from external systems (e.g. downstream system pulls from ELN)
- ETL — background exchange from Signals (e.g. exporting reaction data)
- ETL — background exchange to Signals (e.g. syncing material libraries)
- Automate metadata defaulting (e.g. auto-populate fields based on experiment context)
- External workflows
- Other
6. LEM Team Ownership Boundaries
The LEM team owns:
- Platform configurations (setting up External Actions, Lists, Tables within Signals)
- Read data ports / data products (standardised APIs via DataRiver)
- Common platform extensions (e.g. PBAM, cross-RED autopopulation)
- Integrations requiring admin access to the platform
xRED (and other vertical teams) should own integrations when:
- There is a need for UI to a target system (keeps data model, business rules, and UX within one team’s boundary)
- There is a source of truth where data should be entered and automatically documented in the ELN (rather than duplicated)
For ambiguous cases, joint assessment between LEM and the vertical team determines ownership.