
LEM Platform Requirements for xRED Integrations

| Field | Value |
| --- | --- |
| Type | Research |
| Status | Active |
| Author | xRED Dev Team |
| Created | 2026-04-07 |
| Source | ORCA Offerings (`feature/initial-updates-for-collaboration-with-xRED-vertical` branch) |
| Related ADR | ADR-005 |

1. Overview

The LEM (Lab Execution & Management) team governs all integrations with the Revvity Signals Platform at Roche. Any xRED integration that touches Signals — whether reading data via External Lists/Tables, writing back via the API, or launching External Actions — must comply with LEM’s requirements.

This page consolidates the mandatory requirements from the LEM team’s official offerings documentation. It should be treated as a checklist for every new xRED integration.

2. Mandatory Delivery Checklist

Every integration, regardless of type, must deliver all of the following before going to production:

| # | Deliverable | Description |
| --- | --- | --- |
| 1 | Documented Requirements | Clearly defined use cases with detailed Acceptance Criteria |
| 2 | Test Scenarios | Comprehensive test scenarios covering functional and non-functional requirements |
| 3 | Test Report | Formal test report with a test matrix mapping requirements → test cases → results |
| 4 | Verification Procedure | Documented procedure verifying the delivered software performs its intended functions |
| 5 | Operations and Support Plan | Monitoring, maintenance, error-handling routes, escalation procedures, delegations, and a communications plan for LEM Support |
| 6 | Deployment Configuration | Documentation of all deployment configurations, parameters, and environmental dependencies |
| 7 | Technical Operating Procedure (TOP) | End-to-end workflow from requirements through design, development, testing, deployment, and operation |
| 8 | Information Security | Adherence to Roche information security standards |
| 9 | Solution Risk Assessment (SRA) | Valid and approved SRA per Roche guidelines |
| 10 | Credential Management | Secure management of all tokens, API keys, and certificates |
| 11 | Change Management | Processes for modifications to the integration |
| 12 | Data Privacy & Compliance | Data privacy, integrity, and regulatory compliance |

3. API Access Rules

All API traffic must go through DataRiver

The LEM team’s strong expectation is that all programmatic access to Signals should be routed through the DataRiver Signals Facade on MuleSoft. Integrations that bypass this will face significant governance friction during review. See ADR-005 for the two access routes and when to use each.

Authentication

| Integration Type | Auth Model | Token Type |
| --- | --- | --- |
| External Lists / Tables / Materials | Basic Auth + client certificate | Single header, no user context |
| External Actions (write-back) | OAuth2 User Bearer Token | Signals-issued, requires user consent |
| Background jobs / ETL | OAuth2 Bearer Token or API key | API keys are scarce (very limited per tenant) |

Key constraints:

  • Each integration application must register a Client ID on the Signals tenant to obtain Signals-issued User Bearer Tokens
  • Users must provide consent in the Signals UI before a token is issued (one-time per integration per user)
  • API keys do not support use-case differentiation — OAuth2 Bearer Tokens are strongly preferred
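The two auth models above translate into different request headers. As a minimal sketch (function names are illustrative, and the client certificate itself is presented at the TLS layer rather than in a header):

```typescript
// Basic Auth for Lists/Tables/Materials: one Authorization header, no user
// context. OAuth2 flows instead carry a Signals-issued user Bearer token.
function basicAuthHeader(clientId: string, clientSecret: string): Record<string, string> {
  const encoded = Buffer.from(`${clientId}:${clientSecret}`).toString("base64");
  return { Authorization: `Basic ${encoded}` };
}

function bearerAuthHeader(userToken: string): Record<string, string> {
  return { Authorization: `Bearer ${userToken}` };
}
```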

Rate Limits

  • 1,000 API calls per minute per tenant — shared across ALL consumers
  • Design for this limit: cache aggressively, batch where possible, use facade operations that optimise call counts
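One way to stay inside the shared budget is a client-side sliding-window throttle. The sketch below is illustrative; the per-integration cap of 100 calls/min is an assumed share of the tenant limit, not a LEM-mandated number:

```typescript
// Client-side sliding-window throttle. Because the 1,000 calls/min budget is
// shared across ALL tenant consumers, each integration should cap itself well
// below the tenant limit.
class SlidingWindowThrottle {
  private timestamps: number[] = [];

  constructor(
    private readonly maxCalls: number = 100,   // assumed per-integration share
    private readonly windowMs: number = 60_000,
  ) {}

  /** Returns true if a call may proceed now, and records it if so. */
  tryAcquire(now: number = Date.now()): boolean {
    const cutoff = now - this.windowMs;
    this.timestamps = this.timestamps.filter((t) => t > cutoff);
    if (this.timestamps.length >= this.maxCalls) return false;
    this.timestamps.push(now);
    return true;
  }
}
```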

Audit

  • API audit logs capture modifications only, not read operations
  • Design integrations with this in mind for compliance tracing

4. Per-Offering Requirements

External Lists

| Constraint | Value |
| --- | --- |
| Max items per list | 20,000 |
| Max lists per tenant | 200 |
| Response timeout | 30 seconds |
| Refresh model | Scheduled (hourly/daily), snapshot-based |
| Pagination | Offset/limit via request headers (`limit`, `offset`, `page[limit]`, `page[offset]`) |
| User context | Not available (backend-to-backend; no user identity passed) |
| Endpoint hosting | Must be on MuleSoft API Exchange |

Data format:

```json
{
  "data": [
    { "field1": "abc", "field2": 101 }
  ],
  "paging": { "total": 100 }
}
```

The paging object can be omitted if all data fits in a single response.
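A sketch of how an External List endpoint might build the response shape above from an in-memory item array. The 20,000-item cap comes from the constraints table; the function name and default limit are illustrative:

```typescript
const MAX_LIST_ITEMS = 20_000; // per the External Lists constraints table

interface ListResponse<T> {
  data: T[];
  paging?: { total: number };
}

function buildListResponse<T>(items: T[], offset = 0, limit = 100): ListResponse<T> {
  if (items.length > MAX_LIST_ITEMS) {
    throw new Error(`External Lists allow at most ${MAX_LIST_ITEMS} items`);
  }
  const page = items.slice(offset, offset + limit);
  // paging may be omitted when everything fits in a single response
  return page.length === items.length
    ? { data: page }
    : { data: page, paging: { total: items.length } };
}
```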

External Tables (Data Sources)

| Constraint | Value |
| --- | --- |
| Max columns | 20 |
| Max rows | 2,000 |
| Response timeout | 20 seconds |
| Operations | Single-row CRUD (GET by ID, POST, PATCH, DELETE) |
| User context | User ID available as a query parameter |
| Endpoint hosting | Must be on MuleSoft API Exchange |

Key limitation: Signals sends only data available directly within the table. Attached files or chemical drawings are passed as Signals IDs only — the external system must resolve them via the Signals API if needed.
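The single-row CRUD semantics and size caps above can be sketched as an in-memory store (illustrative only; in the real backend, field values that are attachments or chemical drawings arrive as Signals IDs, and resolving them via the Signals API is the external system's responsibility):

```typescript
const MAX_COLUMNS = 20;   // per the External Tables constraints table
const MAX_ROWS = 2_000;

type Row = Record<string, string | number | null>;

// Minimal in-memory External Table backend: single-row CRUD only.
class ExternalTableStore {
  private rows = new Map<string, Row>();

  create(id: string, row: Row): void {
    if (Object.keys(row).length > MAX_COLUMNS) throw new Error("too many columns");
    if (this.rows.size >= MAX_ROWS) throw new Error("table full");
    this.rows.set(id, row);
  }

  get(id: string): Row | undefined {
    return this.rows.get(id);
  }

  patch(id: string, changes: Partial<Row>): void {
    const existing = this.rows.get(id);
    if (!existing) throw new Error("not found");
    this.rows.set(id, { ...existing, ...changes });
  }

  delete(id: string): void {
    this.rows.delete(id);
  }
}
```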

External Materials (Chemistry Data Sources)

| Constraint | Value |
| --- | --- |
| Response timeout | 20 seconds |
| Search model | Exact match on a single attribute |
| User context | User ID available as a query parameter |
| Endpoint hosting | Must be on MuleSoft API Exchange |

LEM recommendation: For anything beyond simple single-ID lookup, use an External Action instead — it provides a richer UI and more flexible search.

External Actions

| Constraint | Value |
| --- | --- |
| Hosting | Your team develops, hosts, secures, and maintains the web app |
| Auth | Dual OAuth — Roche SSO (PingFederate) + Signals OAuth2 Bearer Token |
| Write-back | Must go through the DataRiver Signals Facade |
| Rendering | iFrame or new browser tab |
| iFrame messages | `SIGCLOSE`, `SIGCONTINUE`, `SIGABORT` via `window.parent.postMessage` |

iFrame caveat: iFrames prevent reuse of parent Signals window cookies (Same-Origin Policy, third-party cookie restrictions). This breaks SSO flows that rely on PingFederate cookies. New browser tab is recommended if the app needs cookie-based SSO.
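When rendered in an iFrame, the External Action signals completion back to Signals with one of the three lifecycle messages from the table above. A minimal sketch (the poster function is injected so the helper can be exercised outside a browser; the `targetOrigin` default is a placeholder that should be pinned to your Signals tenant origin):

```typescript
// The three lifecycle messages Signals accepts from an embedded External Action.
type SignalsIframeMessage = "SIGCLOSE" | "SIGCONTINUE" | "SIGABORT";

// In the page itself, pass: (msg, origin) => window.parent.postMessage(msg, origin)
function notifySignals(
  message: SignalsIframeMessage,
  post: (msg: string, targetOrigin: string) => void,
  targetOrigin = "*", // placeholder: pin to your Signals tenant origin
): SignalsIframeMessage {
  post(message, targetOrigin);
  return message;
}
```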

POST payload: Signals sends JSON context about the triggering entity:

  • Entity id, type, name, description, timestamps
  • relationships.createdBy, editedBy
  • relationships.ancestors (parent entities in the hierarchy)

Supported entity types: experiment, worksheet, materialsTable, plateContainer, sample, viewonly.

External Notifications

| Constraint | Value |
| --- | --- |
| Webhooks per tenant | 1 (single webhook URL) |
| Payload | Metadata only (entity ID, event type); no full data |
| Events | Entity creation, signing, data export |
| Status | Not in active widespread use by LEM; requires LEM approval |

If multiple systems need notifications, a fan-out/broadcast service is required.
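A fan-out service of this kind can be sketched as follows (names and the event shape are illustrative; the only source-mandated constraints are the single webhook URL and the metadata-only payload):

```typescript
// One endpoint receives the single Signals webhook (metadata only) and
// rebroadcasts it to every subscribed downstream system.
interface SignalsEvent {
  entityId: string;
  eventType: "created" | "signed" | "exported"; // illustrative names
}

type Subscriber = (event: SignalsEvent) => void;

class WebhookFanOut {
  private subscribers: Subscriber[] = [];

  subscribe(fn: Subscriber): void {
    this.subscribers.push(fn);
  }

  // Called by the tenant's single webhook handler. A failure in one
  // subscriber must not block delivery to the others.
  broadcast(event: SignalsEvent): number {
    let delivered = 0;
    for (const fn of this.subscribers) {
      try {
        fn(event);
        delivered++;
      } catch {
        // log and continue in a real service
      }
    }
    return delivered;
  }
}
```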

5. Integration Design Questionnaire

The LEM team requires answers to the following questions during design review. This questionnaire should be completed before implementation begins.

Users

  • What are the relevant personas?
  • What is their user journey?
  • How many users per persona? Daily active users?
  • How does each persona interact with Signals?
  • How will user feedback be collected post-launch?
  • How do you measure success?

Data

  • What data types are involved?
  • Where is the source of truth for each?
  • Who governs each data type?
  • What are the expected data flows (source → target)?
  • How frequently does the data model change?
  • What are the specific information objects exchanged (attribute level)?
  • Data security/sharing limitations?
  • Expected frequency of data exchange?
  • How will data quality and validation be ensured?
  • What is the data classification for each type?

Systems

  • What systems are involved and their roles?
  • What is each system’s lifecycle state (Plan, Phase In, Active, Phase Out, End of Life)?
  • What integration ports exist (APIs, message streams)?
  • What protocols and data exchange standards (HTTP/JSON, gRPC, etc.)?
  • How are integration ports secured?
  • Where are ports accessible from (RCN only, external)?
  • What are the key limits of each port?
  • What is the process for getting access?
  • How will sensitive data be protected in transit and at rest?
  • Is user context relevant? How is it provided (OAuth token, query param)?
  • Escalation process for system failures?
  • How will system changes be communicated across teams?
  • Who monitors and maintains the integration post go-live?
  • What are the key test scenarios?

Use Case Classification

Check all that apply:

  • External terminologies (e.g. list of cost centres)
  • Imports — simple lookup table (e.g. barcode scanning)
  • Imports — complex (e.g. importing compounds into chemical reactions)
  • External requests (e.g. making analytical requests)
  • External registration (e.g. registering chemical samples)
  • Pushing from external systems (e.g. instruments push data into experiments)
  • Pulling from external systems (e.g. downstream system pulls from ELN)
  • ETL — background exchange from Signals (e.g. exporting reaction data)
  • ETL — background exchange to Signals (e.g. syncing material libraries)
  • Automate metadata defaulting (e.g. auto-populate fields based on experiment context)
  • External workflows
  • Other

6. LEM Team Ownership Boundaries

The LEM team owns:

  • Platform configurations (setting up External Actions, Lists, Tables within Signals)
  • Read data ports / data products (standardised APIs via DataRiver)
  • Common platform extensions (e.g. PBAM, cross-RED autopopulation)
  • Integrations requiring admin access to the platform

xRED (and other vertical teams) should own integrations when:

  • There is a need for UI to a target system (keeps data model, business rules, and UX within one team’s boundary)
  • There is a source of truth where data should be entered and automatically documented in the ELN (rather than duplicated)

For ambiguous cases, joint assessment between LEM and the vertical team determines ownership.
