Microsoft positions Foundry as a controlled environment for the use of large language models within its own Azure infrastructure. The key message is clear: data remains isolated, is not shared, and is not used for training purposes.

However, a closer examination of the contractual terms reveals a more nuanced picture.

  1. Technical architecture – legally fragmented

Foundry enables the parallel use of different LLMs within a single Azure instance. This multi-model architecture means that the legal assessment cannot be limited to Microsoft.

What matters instead is:

  • which specific model provider is used
  • which functions (e.g. grounding) are activated
  • which data flows actually take place

In practice, this results in a modular risk stack that must be assessed separately.
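
The following Python sketch illustrates this idea. The model names, feature flags and contract labels are purely illustrative assumptions – not an official Microsoft schema – but they show why each layer of the stack must be assessed on its own terms:

    # Hypothetical model of a Foundry "risk stack" – all names are illustrative only.
    from dataclasses import dataclass, field

    @dataclass
    class ModelDeployment:
        provider: str                  # e.g. "Microsoft", "OpenAI", "Mistral"
        model: str
        features: list[str] = field(default_factory=list)

        def applicable_terms(self) -> list[str]:
            """Each deployment can pull in its own contractual regime."""
            terms = ["Microsoft DPA"]
            if self.provider != "Microsoft":
                terms.append(f"{self.provider} model terms")
            if "bing_grounding" in self.features:
                terms.append("Microsoft Privacy Statement (DPA not applicable)")
            if "preview" in self.features:
                terms.append("Preview supplemental terms (reduced safeguards)")
            return terms

    # One Foundry instance, three separately assessable risk layers:
    stack = [
        ModelDeployment("Microsoft", "phi-4"),
        ModelDeployment("OpenAI", "gpt-4o", features=["bing_grounding"]),
        ModelDeployment("Mistral", "mistral-large", features=["preview"]),
    ]
    for deployment in stack:
        print(deployment.model, "->", deployment.applicable_terms())

The point of the exercise: the applicable contractual regime is a property of each individual deployment, not of the Foundry instance as a whole.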

  2. Contractual framework – no uniform level of protection

The Microsoft Data Protection Addendum (DPA) suggests a uniform level of data protection. However, this is qualified by numerous exceptions.

Particularly critical:

  • When using Bing Grounding, the DPA does not apply; instead, the Microsoft Privacy Statement applies
  • Data may therefore leave the controlled environment
  • Geo-boundaries may be breached

Added to this is the use of preview services, where:

  • Data transfers take place to the USA
  • Reduced protection mechanisms apply (see the configuration sketch below)
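
Both exception paths can be surfaced technically before a configuration goes live. The following is a minimal sketch of such a gate, assuming hypothetical feature identifiers – none of these are real Foundry settings:

    # Hypothetical pre-deployment gate; feature names are illustrative assumptions.
    DPA_EXEMPT = {"bing_grounding"}      # Privacy Statement applies instead of the DPA
    REDUCED_SAFEGUARDS = {"preview"}     # e.g. transfers to the USA

    def review_configuration(enabled: set[str], signed_off: bool = False) -> list[str]:
        findings = []
        for feature in sorted(enabled & DPA_EXEMPT):
            findings.append(f"{feature}: leaves the DPA regime, geo-boundaries may be breached")
        for feature in sorted(enabled & REDUCED_SAFEGUARDS):
            findings.append(f"{feature}: preview terms with reduced protection mechanisms")
        if findings and not signed_off:
            raise RuntimeError("Legal sign-off required: " + "; ".join(findings))
        return findings

    # Raises until a documented sign-off is recorded:
    review_configuration({"chat_completion", "bing_grounding"})

Such a gate does not replace legal review, but it prevents DPA-exempt features from being activated unnoticed.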

  3. Usage data – independent use by Microsoft

Whilst traditional customer data is relatively clearly protected, this does not apply to usage data.

Microsoft reserves the right to use this data for:

  • Profiling
  • Product development
  • Commercial purposes

The description of the purpose is deliberately broad. From the perspective of Swiss data protection law, this raises significant questions regarding transparency and lawfulness.

  4. Lack of clarity regarding TOMs and SCCs

A key problem lies in the contractual formulation of the technical and organisational measures (TOMs):

  • No clearly assigned TOMs can be identified for Foundry
  • Standard Contractual Clauses (SCCs) are not consistently incorporated
  • Practical safeguards rely largely on the EU–US and Swiss–US Data Privacy Frameworks

This creates a structural risk – particularly if future case law were to call these frameworks into question, as Schrems II did with the Privacy Shield.

  5. Access by third countries and control deficiencies

Regardless of contractual assurances, residual risks arise from:

  • US access laws (CLOUD Act, FISA)
  • Unclear involvement of model providers
  • A lack of transparency in maintenance and update processes

The frequently communicated assurance that prompts are not passed on to model providers is not clearly guaranteed by contract.

  6. Encryption and technical limitations

Microsoft relies on encryption ‘in transit’ and ‘at rest’. However, the following remain unresolved:

  • Data is processed in plaintext while in use
  • Limited availability of Confidential Computing

This means that a residual risk remains, particularly with regard to sensitive data.
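
One practical mitigation, as long as processing in plaintext is unavoidable, is to pseudonymise recognisable identifiers client-side before a prompt leaves your perimeter, and to re-substitute them in the response. The sketch below is deliberately simple – the regex patterns and placeholder scheme are assumptions, and production systems would need more robust detection; the remaining prompt content still reaches the model in plaintext:

    # Minimal client-side pseudonymisation sketch; patterns are illustrative only.
    import re

    PATTERNS = {
        "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
        "AHV":   re.compile(r"756\.\d{4}\.\d{4}\.\d{2}"),  # Swiss social insurance no.
    }

    def pseudonymise(text: str) -> tuple[str, dict[str, str]]:
        mapping: dict[str, str] = {}
        for label, pattern in PATTERNS.items():
            for i, match in enumerate(pattern.findall(text)):
                token = f"<{label}_{i}>"
                mapping[token] = match
                text = text.replace(match, token, 1)
        return text, mapping

    def restore(text: str, mapping: dict[str, str]) -> str:
        for token, original in mapping.items():
            text = text.replace(token, original)
        return text

    prompt, mapping = pseudonymise("Contact jane.doe@example.com, AHV 756.1234.5678.97")
    print(prompt)  # Contact <EMAIL_0>, AHV <AHV_0>
    # send `prompt` to the model, then apply restore(answer, mapping)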

  7. Conclusion

Microsoft Foundry is technologically powerful – but legally, it is not a plug-and-play product.

The responsibility for legally compliant use lies to a significant extent with the customer.

In particular, the following are required:

  • a nuanced assessment of individual LLMs
  • careful management of additional functions
  • targeted contractual refinement

  8. How we support you

We regularly assist companies with the legally compliant implementation of AI systems, in particular:

  • Analysis of specific Foundry setups and data flows
  • Review and renegotiation of Microsoft contracts
  • Structuring of multi-LLM governance
  • Implementation of processes compliant with the Swiss DSG (FADP) and the EU AI Act

If you are using or evaluating Foundry, it is worth conducting a targeted legal assessment – often, significant risks can be reduced with a manageable amount of effort.

We would be happy to review your specific configuration and highlight where action is required.