Vendor Due Diligence: The Hidden Risk in Your Legal Tech Stack

Most Irish firms now understand why their systems matter.

They’ve updated engagement letters for GDPR, refreshed AML procedures, tightened conflicts checks, learned about cyber security and sat through more than one webinar on the EU AI Act.

But often when we ask a simple question – “List every external system holding client data, where it’s stored and what the vendor is allowed to do with it” – the room goes quiet.

That silence is the real risk.

Because going forward your greatest exposure may not be your own processes at all. It may be the vendors you signed up on the strength of a polished demo and a persuasive account manager.

A lot of legal tech buying follows a similar pattern. Someone hears about a system another firm uses, there’s a demo, people like the UI, a short trial is run and a licence is signed. The DPA is filed somewhere.

For years, this felt harmless. Practice management systems lived on a server in the back room. Your “vendor risk” was a local IT company who did backups and fixed printers.

Cloud changed that. AI has multiplied it.

Now it’s routine for a small Irish firm to have client data sitting in:

Practice and case management platforms

Document management systems

E-signature tools

AML / KYC platforms

Smart PDF tools, scanning apps and note taking tools

AI enabled features built into Microsoft 365, Google Workspace, Adobe, cloud storage and more

Every one of those relationships is a risk vector for confidentiality, for GDPR, for AML evidence and for business continuity. Under GDPR and the Data Protection Act 2018 you remain responsible for choosing processors who provide “sufficient guarantees” and for being able to demonstrate that you have appropriate technical and organisational measures in place. Regulators, likewise, are now explicit that systems, controls and record keeping arrangements are part of a firm’s risk profile rather than a footnote.

The EU AI Act adds another layer where your tools use AI. Under the Act businesses deploying certain high risk AI systems must monitor their use, keep logs and respond quickly if risks materialise.

You need to be clear, internally, about the regulatory framework behind the questions you’re asking.

GDPR and the Data Protection Act 2018

At the risk of oversimplifying:

You are almost always the controller for client and matter data.

Your tech vendors are usually processors (or sub processors of a larger platform).

Under Articles 5, 24 and 28 GDPR and the Irish 2018 Act, you must:

Choose processors who offer “sufficient guarantees” of compliance.

Put appropriate processing terms in place – the DPA is not optional.

Be able to show what’s happening to personal data – where it is, who sees it, how long it’s kept, how it’s secured.

If a vendor misuses client data or can’t support a data subject request or breach investigation properly, the DPC will be looking at you first.

AML and professional regulation

The Law Society’s AML guidance repeatedly links compliance to systems, controls and record keeping. If your client identification data, risk assessments or source of funds documents live inside a vendor platform, inspectors will expect that platform to:

Preserve the records for the required period

Produce them promptly in a usable format

Show who accessed or changed what, and when

If your vendor can’t do that, it becomes your problem.

AI governance

For AI enabled tools, you’re now in a world where:

The EU AI Act sets expectations for both providers and deployers of certain AI systems, especially in high risk domains.

The Law Society’s new guidance on generative AI emphasises that tools like Copilot or ChatGPT don’t dilute professional duties around confidentiality, competence and independence but heighten them.

If a vendor is selling AI powered features, you need a view on:

What data those features see

Whether outputs are checked before they go near a client or court

How you would explain the system’s role to a regulator or judge

Ten questions that catch 90% of vendor risk

Here are ten questions you can take into almost any legal tech procurement or renewal as a way to test whether a prospective supplier truly understands your obligations and, crucially, is prepared to help you meet them.

1. Where is our data stored and who can see it?

You’re looking for clarity on:

Jurisdictions: which countries host your data and backups?

Sub processors: which other providers are in the chain – e.g. cloud platforms, support providers, analytics tools?

Access paths: who within the vendor can access live data e.g. support teams, engineers, contractors?

Good answer: a clear data map, named hosting regions e.g. EU only, a published sub processor list with notification on changes, role based access with logging.

Red flag: No specifics.

2. Do you use our data to train models or improve your service?

For AI enabled tools, this is the central question.

Are your documents, messages or metadata used to train foundation models?

Are they used to train or fine tune product specific models?

Can you opt out?

Is training limited to your own tenancy or pooled with other customers?

Good answer: Your data is used only to provide the service. It is not used to train shared models and you can clearly see how that is enforced contractually and technically.

Red flag: Any reluctance to put their assurances into the DPA or explanations that rely entirely on marketing language.

3. What is our exit plan – in writing?

Imagine the vendor is acquired, goes under or simply stops making sense for your firm. Then ask:

In what formats can we export data and is the schema documented?

How much time do we have after termination to export?

Do we need to keep paying during a migration period?

What happens to backups and archived data? Is deletion certified?

Good answer: a contractual right to export in open, well documented formats, a defined migration window and a clear deletion and certification process.

Red flag: exports limited to PDFs, no structured data.

4. How do you support our GDPR obligations?

Beyond the boilerplate Article 28 language, look for:

Tools for access, rectification and deletion – data subject rights.

Role based access and audit logs – who did what, when.

Retention controls: can you set retention by matter, client or category?

Breach procedures: who tells whom, how fast, with what information.

Good answer: specific features that map to real obligations e.g. DSAR search tools, granular deletion, log export.

Red flag: “We’re GDPR compliant” with no examples of how that plays out in the product.

5. How do you secure our data?

You want to know:

Is data encrypted in transit and at rest?

What authentication options exist – MFA, SSO, IP restrictions?

What independent assurance is there e.g. ISO 27001, SOC 2?

How often are penetration tests run and will they share high level results?

Good answer: a concise security overview, backed by independent certifications and a willingness to answer follow up questions.

Red flag: “Our cloud provider handles all that” with no vendor side controls.

6. What happens during an inspection, discovery or investigation?

Picture any of the following:

A Law Society AML inspection

A DPC inquiry about a data subject request

A discovery exercise in litigation

Ask:

Can we export complete, timestamped records for a client, matter or date range?

Can we show who accessed or changed a record?

Can we preserve a forensically sound snapshot if needed?

Good answer: tools to export logs and records in structured form and a support process for legal/regulatory inquiries.

Red flag: Reliance on ad hoc reports, screenshots or manual reconstruction.
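A rough benchmark for “structured, timestamped records”: a usable export is one you could filter by matter and date range in a few lines of code. The field names below are hypothetical, not any vendor’s actual schema.

```python
import csv
import io
from datetime import datetime

# Hypothetical audit log export -- field names are illustrative,
# not any particular vendor's schema.
EXPORT = """timestamp,user,matter_id,action
2025-03-01T09:15:00,jmurphy,M-1042,viewed
2025-03-02T14:02:00,aoconnor,M-1042,edited
2025-04-10T11:30:00,jmurphy,M-2077,exported
"""

def records_for(matter_id: str, start: datetime, end: datetime) -> list[dict]:
    """Return all audit rows for one matter within a date range."""
    rows = csv.DictReader(io.StringIO(EXPORT))
    return [
        r for r in rows
        if r["matter_id"] == matter_id
        and start <= datetime.fromisoformat(r["timestamp"]) <= end
    ]

hits = records_for("M-1042", datetime(2025, 3, 1), datetime(2025, 3, 31))
print(len(hits))  # two M-1042 entries fall in March
```

If a vendor’s “export” is a folder of screenshots, no filter like this is possible – which is exactly the red flag above.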

7. How resilient are you commercially and operationally?

The right tech is useless if the company disappears or if outages become routine.

Look for:

A simple explanation of the vendor’s scale and runway.

Uptime history and status page.

Backup frequency and disaster recovery targets – RPO/RTO.

Escalation paths if something breaks.

Good answer: a track record of serving regulated clients, published uptime, a DR story that doesn’t sound improvised.

Red flag: no public status page, vague reassurances or reluctance to discuss resilience at all.

8. What’s your AI roadmap and how will it change how you process our data?

If the product already includes AI features, or clearly plans to, you need to know:

What new data will be processed e.g. behavioural analytics, interaction histories?

Will AI features be opt in or quietly enabled by default?

How will they support your obligations under the AI Act where relevant – documentation, logging, human oversight?

Good answer: a transparent roadmap, opt in controls and specific commitments on data use.

Red flag: No documentation, testing strategy or safeguards.

9. How do you handle incidents – and help us handle ours?

Incidents happen: outages, bugs, misconfigurations – even breaches.

Ask:

How will you notify us of incidents and on what timescales?

What information will you provide to support our own reporting e.g. to the DPC, Law Society, or clients?

Do you have a tested incident response plan?

Good answer: contractual SLAs for incident notification, a sample incident report and evidence of rehearsed processes.

10. Who in our firm actually owns this relationship?

This is the only question you can’t outsource to a vendor.

For each critical system, you want a named internal owner:

A partner or senior lawyer who understands the business use and associated risk.

Someone in operations or IT who understands the technical surface area.

A clear link to your risk register, AML Business Risk Assessment and data protection records.

Without that, vendor management drifts, and unmanaged risks are the ones that turn into problems.

Clauses and levers that actually matter

While the granular details uncovered by this questioning are all important, for most small and mid-sized firms four levers deserve your energy:

(1) Data use and training

Explicit limits on using your data for model training or analytics beyond providing the service.

Clear rules for anonymisation and aggregation, if permitted.

No sharing of identifiable client data with third parties except listed sub processors.

(2) Exit and data portability

Contractual rights to export, in usable formats, at any time – not just on termination.

A defined migration window on exit.

Deletion and certification obligations, including for backups within a realistic timeframe.

(3) Cooperation on investigations

Commitment to support GDPR obligations including DSARs, breach investigation and notifications.

Support for AML / regulatory inspections where data or logs sit in their systems.

Reasonable rights to information about security incidents affecting your data.

(4) Liability for security and misuse

Ensure data breaches caused by the vendor fall under a higher liability cap than the general one.

Clarify responsibility for IP infringement in AI generated output, if relevant.

Reserve rights to terminate on serious or repeated non-compliance.

Beyond that, perfection is the enemy of done. You’re looking for a risk profile you understand and can live with rather than a theoretical ideal.

A 30 day vendor check for tools you already use

You don’t need to wait for the next procurement cycle. A light touch review over a month can dramatically reduce the unknowns.

Week 1 – Inventory

List every system that touches client or matter data: practice management, DMS, AML/KYC, e-signature, client portals, secure messaging, AI tools, file sharing.

Note who “owns” each one internally – if you can’t, that’s your first finding.

Week 2 – Paperwork

Collect contracts, DPAs, privacy policies, security summaries.

Create a simple spreadsheet with: data locations, sub processors, export formats, and renewal dates.
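If a spreadsheet feels heavyweight, the same register can start life as a short script. The columns mirror those described above; the systems, locations and owners are placeholders, not recommendations.

```python
import csv
import io

# Starter vendor register -- entries are placeholders, not real vendors.
COLUMNS = ["system", "data_location", "sub_processors",
           "export_format", "renewal_date", "internal_owner"]

vendors = [
    {"system": "Practice management", "data_location": "EU (Dublin)",
     "sub_processors": "cloud host; support provider",
     "export_format": "CSV + documents",
     "renewal_date": "2026-01-31", "internal_owner": "Managing partner"},
    {"system": "E-signature", "data_location": "EU",
     "sub_processors": "cloud host", "export_format": "PDF only",
     "renewal_date": "2025-11-30", "internal_owner": ""},
]

# Write the register as CSV (here to a string buffer; a file works the same).
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=COLUMNS)
writer.writeheader()
writer.writerows(vendors)

# A blank owner is itself a finding (see Week 1).
unowned = [v["system"] for v in vendors if not v["internal_owner"]]
print(unowned)
```

Even this toy version surfaces two of the findings the review is meant to catch: a system whose only export format is PDF, and a system with no internal owner.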

Week 3 – The ten questions

For your top 3-5 critical vendors, walk through the ten questions in this article.

Where you don’t know the answer, ask. The quality and speed of response will tell you a lot.

Week 4 – Prioritise actions

Split what you find into three categories:

  1. Tweak – configuration changes e.g. enabling MFA, restricting regions, turning off risky defaults.
  2. Paper – amending DPAs or contracts to reflect how you actually use the system.
  3. Plan – vendors you may need to exit in 12-24 months because the risk/benefit trade off no longer works.

This is also the point to update your:

AML Business Risk Assessment, to reflect real dependencies on third party systems.

Data protection records of processing and vendor lists.

Internal AI and IT use policies so staff understand what’s in play.

Done once, this check becomes the template for a much lighter annual review.

Vendor due diligence is no longer something done once at contract signature. For Irish firms, it’s becoming a core part of professional risk management:

GDPR and the Data Protection Act 2018 make you accountable for your processors.

AML inspections increasingly expect systems and vendors to be part of your risk story rather than a blind spot.

The EU AI Act and new Law Society guidance on generative AI push you to understand how AI enabled tools actually handle client data.

The firms that adapt quickest won’t necessarily have the most sophisticated tech. They’ll be the ones who can answer, calmly and in one page:

These are the systems we use.
This is what they do with our data.
This is how we can get that data back.
And this is how we know they’re safe enough for our clients.

Vendor due diligence isn’t just admin: it’s where your real risk strategy lives, and a crucial part of the operating system for a safe modern practice.
