From The Blog

June 18, 2021

Supply Chain Resilience using Confidential Computing

The pandemic introduced the global public to the very real problems of supply chain continuity. It also changed the way that supply chains approach resilience.

Together, these changes are leading to some fundamental shifts in the technologies being used for supply chain management. Of these, confidential computing is one that targets the single most important underlying problem in supply chain management: how to process business-sensitive data in use. By using confidential computing, it is now possible to protect sensitive data throughout all stages of its lifecycle, as well as provide technical assurances that it can’t be misused. This opens up several opportunities for the supply chain industry to build and take advantage of new collaborative solutions.

To make this concrete, we will walk through two categories of use cases: aligning suppliers and managing products.

Resilience doesn’t come from working harder

Crisis is not new to supply chain management. Earthquakes, port strikes, road closures, cyberattacks and cranky customs officers are par for the course. Normally, these exceptions are local, and so are the solutions. In the beginning of the pandemic, this is exactly what happened. The explosion in job openings for supply chain professionals indicates that many companies immediately responded by working harder.

But as the pandemic has stretched into 2021, supply chains are looking for ways to work smarter. Unfortunately, this isn’t always easy to do. The outdated systems currently in use aren’t designed to establish trust between counterparties or verify that an organization’s data will be protected. However, with the advent of confidential computing, it is now possible to aggregate data across multiple parties and build new solutions that align supply chain partners.

Below we outline several ideas on how this new collaboration can be implemented in highly innovative ways.

Use case I: Alignment With Supply Chain Partners

  • Cost reduction – Managing day-to-day operations in supplier relationships often necessitates a delicate balance between demands and incentives. Today’s supply chains experience high levels of inefficiency due to the inability to jointly manage cost. Suppliers are often unlikely to share cost data beyond contractual mandates simply because they fear price cuts. Likewise, related costs such as those incurred for transportation and logistics are often not reported, even when they could be optimized through collaboration.
  • Load tendering – The same applies when industry data is aggregated. Load tenders and cost per mile in transportation are instructive examples. Both can be obtained from a variety of sources to analyze not just who is over- or underpaid but also to identify key trends. It is further conceivable that we will witness the emergence of blockchain solutions that initiate transactions such as the procurement of transportation and warehousing services based on bids or auctions. This is a natural evolution and we already observe the emergence of load matching platforms and data aggregators.
  • Scorecarding – Monitoring supplier performance, or Scorecarding, is typically performed in ERP systems today, but it is hard to obtain data that allows for direct comparisons between different suppliers. If scorecards could be securely shared not just across all suppliers of a manufacturer but between several manufacturers, an entirely different picture of supplier performance would likely emerge. It would be possible to extract best-practices that all parties could directly benefit from.

Use case II: Managing Products and Demand

  • Secure data collection – An obvious example where collaboration is easy to achieve is the management of product features, observation of consumer behavior and collection of usage data. Confidential computing allows manufacturers to securely collect data about product usage in the field without intrusion of privacy or data leakage to improve the products they build. The resulting insights are incredibly valuable, especially when they are shared among all relevant parties in a supply chain.
  • Collaborative planning – Today, forecast accuracy typically ranges from 25% – 75%. Demand planning becomes substantially harder the farther we move away from the point of sale, as it is impossible for upstream suppliers to anticipate demand without downstream market knowledge. Leveraging confidential computing means everyone can pool data to derive accurate forecasts while remaining confident they aren’t giving away their competitive advantage.  This brings down inventory levels across the entire chain while also optimizing costs. The more parties that participate, the more benefits there are for everyone.
  • Inventory planning – Suppliers can also record planning, order, inventory, and production data securely on a blockchain and aggregate the data across multiple organizations on a secure confidential computing platform. All participants are assured confidentiality and anonymity in this way while they benefit from the results of analysis. The key to such a solution is that it must be trusted by all parties and guarantee that the underlying data is inaccessible to everyone, including malicious actors.

Can You Afford to Wait?

In 2021, we’re already referring to “the time before.” Solving local exceptions with brute force is part of that time. The emergence of confidential computing as a feasible technology for data sharing and processing means that supply chain management can evolve on a more fundamental level than ever before.

The use cases described here are just a peek at some of the changes that are happening as industry dynamics change with technology. The question may not be whether you can afford to get started today, but rather whether you can afford to wait any longer.

Want to learn more?

Here are some helpful resources to learn more about Confidential Computing and Conclave.

June 14, 2021

Remote Attestation: Your X-ray vision into an enclave

Confidential Computing can prove to be a game-changer in enabling multi-party computation without getting into the risk of data leakage or tampering. It would allow multiple enterprises to share confidential information and run various algorithms on it without the risk of their data being seen by each other.

If you are new to Confidential Computing or Conclave, consider taking a look at this article for a brief introduction.

Confidential Computing could lead to huge benefits in various fields. For instance, we can now develop better machine learning models using bigger pooled datasets, which was previously not possible because of the risk of the data being compromised when shared between organizations.

It all comes down to sharing your confidential data with an enclave, where it is processed and the result is returned. All well and good, but how would you know that the enclave in question is really authentic?

Remote Attestation

Remote attestation is the piece of information that lets us verify the authenticity of an enclave. It is a special data structure that contains the following information:

  • Information indicating that a genuine Intel CPU is running
  • The public key of the enclave
  • A code hash called the measurement
  • Information indicating whether the computer is up-to-date and configured correctly

The most important piece of information we are interested in here is the measurement. It is a hash of the entire module, along with its dependencies, that is loaded into the enclave.

Every time a client wants to connect to an enclave and send confidential information for processing, it must first check the remote attestation of the enclave and verify the authenticity of the enclave by comparing the measurement. The remote attestation can be requested from the host.

Below is an example of remote attestation received from the host for an enclave running in simulation mode:

Remote attestation for enclave DB2AF8DD327D18965D50932E08BE4CB663436162CB7641269A4E611FC0956C5F:
— Mode: SIMULATION
— Code signing key hash: 80A866679B567D6B27F5EF9044C13CCB057E761AB8400AD09CC8D70208579611
— Public signing key: 302A300506032B657003210052C7DFDE99D81DF7FF05A2EBED5F8E25FC659A203FAFCA5B07B18CFFD3C5915E
— Public encryption key: F3F02623B55E908C556CE17A13DF385BA621E5D5BCDCDEA8E92E30D4397E0404
— Product ID: 1
— Revocation level: 0
Assessed security level at 2021-05-10T10:09:08.107702Z is INSECURE
– Enclave is running in simulation mode.

Conclave was developed so that any two builds of the same source code always produce the same measurement. Thus developers can either generate the measurement themselves or rely on a trusted third-party service provider to provide the measurement of the enclave.

Since any update to the source code would change the measurement, it is guaranteed that the enclave does exactly what it says it does.

A note on upgrades

It’s pretty evident that any upgrade to the enclave code would change the measurement. This would cause failures, since the client would no longer recognize the enclave. A potential solution is to maintain a whitelist of acceptable hashes.

Alternatively, a signing key can be used: as long as the enclave is signed with that key, it can be deemed authentic.
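As a rough sketch of what such a check can look like with Conclave’s client library (the constraint string format with “C:”, “S:”, “PROD:” and “SEC:” prefixes is taken from the Conclave 1.x documentation; the signer hash, product ID and security level below come from the simulation-mode example above and would differ for a production enclave):

    import com.r3.conclave.client.EnclaveConstraint;
    import com.r3.conclave.common.EnclaveInstanceInfo;

    public class AttestationCheck {
        static EnclaveInstanceInfo verify(byte[] attestationBytes) {
            // attestationBytes is the serialized remote attestation obtained from the host
            EnclaveInstanceInfo attestation = EnclaveInstanceInfo.deserialize(attestationBytes);

            // Either pin an exact measurement with "C:<hash>", or accept any enclave signed
            // by a trusted key with "S:<signer hash>" so that audited upgrades keep working.
            EnclaveConstraint.parse(
                "S:80A866679B567D6B27F5EF9044C13CCB057E761AB8400AD09CC8D70208579611 PROD:1 SEC:INSECURE"
            ).check(attestation);  // throws if the enclave does not satisfy the constraint

            return attestation;
        }
    }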

Want to learn more?

Here are some helpful resources to learn more about Conclave and Confidential Computing.

June 03, 2021

A New Era of Privacy-Enhancing Technology Has Arrived

The next frontier for data privacy is fast approaching: according to analyst firm Gartner, by 2025 50% of large organizations will be looking to adopt privacy-enhancing computation (PEC) for processing data in untrusted environments and for multiparty data analytics. PEC is a cross-industry advance that will cause existing data privacy models and techniques to be radically disrupted, as it offers a new approach to protecting and sharing data across parties without actually revealing that data to anyone.

The appeal of data sharing is clear: sharing data across parties holds the key to unlocking greater analytics and insights, as well as identifying risks and detecting fraud. But if this is the case, why aren’t companies sharing data more freely? The answer is this: they are concerned about the data privacy and security risks that could come from doing so.

Fortunately, the solutions to these concerns are now at hand, with the introduction of Confidential Computing and other privacy-enhancing techniques that put firms in complete control over how their data will be used. To discuss the potential of these new privacy-enhancing technologies, R3’s Chief Technology Officer Richard Gendal Brown recently hosted a webinar where he was joined by two world-leading experts in the field: Michael Klein, Managing Director for Blockchain and Multiparty Systems Architecture at Accenture, and Paul O’Neill, Senior Director of Strategic Business Development at Intel.

Setting the scene, Richard mapped out the discussion in three stages: first, by scoping out the business problems around privacy that traditional technology can’t solve; second, by looking at some of the new technological approaches such as PEC that can solve these problems; and third, by examining how these technologies can actually be applied. According to Richard, “this isn’t a future-looking phenomenon. This is a collection of technologies that can be applied right now.”

So, what exactly is the business problem? Paul O’Neill of Intel said that looking across different industries – especially highly-regulated sectors such as healthcare and finance – the biggest challenge has been the rise of “incentivized collaboration.”

“Imagine you’re a hospital administrator, and you’re going to submit sensitive patient data and healthcare records to a research firm that’s going to perform a clinical trial with the patient’s consent,” Paul explained. “You desperately want to advance medical science. But as an enterprise, you’re worried. What happens if a rogue employee at the research firm steals that data? What if the research firm is using your patient’s data in a way that they didn’t agree to? To anybody involved in privacy, that’s really, really scary.”

What’s needed is a way for firms to know that their data remains protected at all times in a way that a third party cannot observe or even copy it – which is what technologies like Confidential Computing enable. However, these technologies are perceived as complex, and a recurrent theme during the debate was how to cut through this complexity to get to the core business issues. Accenture’s Michael Klein commented: “There are many techniques to encrypt data in use. Some are completely software-based, while some are hardware-based. And we can talk [to clients] about who they are actually trusting. Are they trusting the creator of the software or the creator of the hardware? And then, what are the features that the technique enables, and how ready is it to scale? I think those questions are probably the two biggest things that we encounter as we introduce our privacy-preserving functions or computations: helping our clients to understand that these are all valid techniques, and then choose the one that’s going to best fit their scenario and also scale to meet their needs.”

There isn’t room in this short blog to go into the full richness of the debate. To experience it, click here to watch the webinar recording in full.

Want to learn more?

Here are some helpful resources to learn more about PEC, Confidential Computing and Conclave.

  • Hear from Gartner on why PEC is a top strategic technology for 2021 in a recent report.
  • Want to learn more about Conclave? Read Richard Brown’s recent blog post titled, “Introducing Conclave.”
  • Are you an app builder? Try a free 60 day trial of Conclave today.

May 14, 2021

Solving Double Pay-out Fraud in the Insurance Industry

Insurance fraud is a huge issue for the industry, estimated to cost insurance companies and agents globally more than US$40 billion per year. A particular challenge is “double dipping” fraud, where one insured event such as a motor insurance claim is claimed twice from two different insurers. But now a solution is at hand: ClaimShare, a leading-edge fraud detection application. Built on R3’s Corda Enterprise and Conclave platforms, ClaimShare was developed through a collaboration between IntellectEU and KPMG, and recently won the global Corda Challenge: Insurtech from R3 and B3i.

Using blockchain and confidential computing technologies to help insurers collaborate and mitigate fraud, ClaimShare is generating huge interest and excitement across the insurance sector. To share more about the solution, and the benefits and opportunities it opens up, R3 recently hosted a webinar where Victor Boardman, R3’s head of insurance for EMEA & APAC, was joined by ClaimShare director Chaim Finizola and Kami Zargar, ACA Director and Head of Forensic at KPMG in Belgium.

At the beginning of the session, Victor introduced Conclave, and its revolutionary impact which springs from its ability to allow information from different companies to be pooled together for joint analysis while keeping the underlying raw information confidential. Explaining that confidential computing keeps data cryptographically secure at rest, in transit and also in processing, Victor expanded: “That means the data can be pooled between multiple insurance companies and brokers, safe in the knowledge that the underlying information can’t be seen by any party, not even the operator of the service.”

Why is such a solution needed? KPMG’s Kami took up the story: “By some estimates, insurance fraud is the second most common type of fraud after tax fraud…and it is estimated that only half of it is detected. So we have been working over several years with the insurers and regulators to define solutions that will help them detect and manage this issue within the bounds of what is possible, given that there have historically been limitations around data, etc, in terms of how to tackle this issue.”

A large proportion of the insurance fraud that’s going undetected is believed to be double-dipping fraud – and ClaimShare’s Chaim explained how the solution addresses this type of crime: “We’ve worked to create an application that would enable the sharing of claims data between different insurers to detect double dipping fraud without needing to make changes to their back office or front office systems. Previously, there has never been a technology that enabled the sharing of data in a fully compliant manner. Using Corda and Conclave as a platform was a game changer, allowing us to share data and match the sensitive data while being fully compliant.”

Chaim went on to provide the webinar attendees with a live demo of ClaimShare in action. To see this – and learn more about how ClaimShare provides insurers with a powerful new weapon in their battle against fraud – click here to view the webinar in full.

Want to learn more?

Here are some helpful resources to learn more about Privacy-Enhancing Computation (PEC), Confidential Computing and Conclave.

April 12, 2021

Privacy-Enhancing Computation: The future of cross-institutional secure data sharing

In a recent survey conducted by KPMG, 51% of banks reported a significant number of false positives resulting from their technology solutions, hampering efficiencies in fraud detection.

In an effort to reduce risk, institutions continually monitor and segment their customer data. They employ a host of different techniques to effectively do this, including software tools, precise risk scoring, behavior analytics, and more. However, they still may not get the “full picture” when it comes to a customer’s risk profile. Unknown knowns remain in part because customers often have relationships with more than one bank, insurance firm or provider. On top of this, financial crime, money laundering and other fraudulent activities remain difficult to detect because criminals purposely distribute their activities across different institutions, knowing that those institutions hesitate to share their data with one another.

Companies know that data sharing could hold the key to unlocking greater analytics and insights in their fight against fraud and financial crimes. That said, it rarely happens because of an inherent mistrust in how their data will be used. Concerns around data privacy, lack of control, and fear of proprietary information getting into the wrong hands outweigh the benefits of insights that might come from peers pooling data. But what if firms could receive technical assurances that their data will not be viewed or misused as it’s being pooled and analyzed? That they would be in complete control over how it will be used?

Well, now they can.

How Privacy-Enhancing Computation (PEC) can help fight fraud

While fraudulent activities have become more sophisticated, fortunately so has the technology that has the power to stop them. Techniques like Privacy-Enhancing Computation (PEC) provide protections around data privacy, confidentiality and integrity that make different entities feel comfortable pooling data securely across company lines, to gain insights and identify risk. This means that firms will now be able to combine both data and forces to fight financial crime. In fact, a recent Gartner report highlighted Privacy-Enhancing Computation as one of the top strategic trends for 2021 that IT leaders should assess.

PEC solves the challenge that has been facing security experts for years: how to protect data as it’s being shared, analyzed and operated without exposing or viewing the underlying data. Protecting data in use is the next frontier of data security. Data encryption today covers data at rest and in transit but has not been extended to data in use—until now.

Encrypting data in use is critical to identifying the increasingly complex methods that are evolving to commit financial crimes because it allows firms to pool and analyze sensitive data without compromising on data privacy. In the insurance industry, for example, by using privacy-enhancing techniques, firms can pool, analyze, process, and gain insight from confidential data without exposing the underlying data itself, to identify possible fraud due to multiple claims being presented for the same insured event.

This opens up a world of possibilities across industries. This concept is simple yet was nearly impossible to execute on beforehand due to data privacy concerns. R3, IntellectEU and KPMG have leveraged this technology to deliver ClaimShare, a platform developed to detect and prevent double-dipping across insurers, allowing competing insurers to collaborate to fight fraud.

What are the types of techniques included in PEC?

There are a variety of software and hardware-based methods to protect data in use. Some examples include: secure multi-party computation, homomorphic encryption, zero knowledge proofs and trusted execution environments (TEE). Each technique tackles the problem of how to securely protect data in use, with accompanying advantages and drawbacks.

Confidential Computing stands out as a stable, scalable and highly performant solution for a broad range of use cases. By performing computation in a hardware-based TEE, it prevents unauthorized access or modification of applications and data while in use, thereby increasing the security assurances for organizations that manage sensitive and regulated data.

Meet Conclave – a revolutionary new Confidential Computing platform

Conclave is a new privacy-enhancing platform from R3 that utilizes Confidential Computing. Conclave enables the development of solutions, like ClaimShare, that deliver insight from data shared across parties without actually revealing the data to any other party, thus maintaining controls over how data is used while addressing security and compliance-related obligations.

With Conclave-enabled apps, confidential data can be pooled and processed within an Intel® SGX enclave where none of the contributing parties, nor the enclave host, can access the data. End-users can audit the application’s source code and cryptographically verify the application will run as intended before providing sensitive data for joint pooling or analysis.

As a result, end-users can be confident while sending proprietary data outside of their organization, and software providers can build more predictive financial crime risk management and compliance solutions using sensitive data from multiple firms for cross-institutional insights.

What’s next?

Here are some helpful resources to learn more about PEC, Confidential Computing, and Conclave.

  • Read the Gartner report on why Privacy-Enhancing Computation is a top strategic trend for 2021
  • Learn more about Conclave in a recent blog from R3’s Richard Gendal Brown, titled ‘Introducing Conclave’
  • Hear from R3’s Victor Boardman on how data sharing can address ‘double dipping’ in insurance claims in a recent article on Insider Engage
  • Try a free 60 day trial of Conclave today

February 24, 2021

Building our First App on Conclave

Conclave is an application development platform that helps build secure applications that can process data confidentially. I gave a brief introduction to Conclave in my previous blog. Now let’s look at how we could build our very first app on Conclave.

What are we building?

We will build a secure auction app where bids from parties would remain confidential using an enclave. All bids would be processed within the enclave and the result of the auction would be revealed, without compromising the bids submitted by each participant.

Components of an application built on Conclave

Conclave has three major components: Enclave, Host, and Client. So let’s get down to business and start building each component of our application on Conclave.

Enclave

An enclave is a program that runs in a protected memory space that can’t be accessed by the OS, kernel, or BIOS. Thus you can rest assured that the data sent to the enclave is completely confidential.

To build our secure auction app, we need to write an enclave program which takes bids from participants and processes them to determine the highest bid and hence the winner.

Conclave provides the Enclave class, which can be subclassed to build our own enclaves. Data can be sent to the enclave using the Conclave Mail API, which provides end-to-end encrypted communication between the client and the enclave.
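A minimal sketch of what that enclave might look like. This assumes the Conclave 1.x receiveMail(id, mail, routingHint) override (the parameter is named userRoute here to match the discussion below) and uses a plain string bid in place of the Kryo-serialized Message object used in the full source:

    import com.r3.conclave.enclave.Enclave;
    import com.r3.conclave.mail.EnclaveMail;

    import java.security.PublicKey;
    import java.util.HashMap;
    import java.util.Map;

    public class AuctionEnclave extends Enclave {
        // Bids and sender keys, keyed by the routing string that identifies each client
        private final Map<String, Integer> userBids = new HashMap<>();
        private final Map<String, PublicKey> userKeys = new HashMap<>();

        @Override
        protected void receiveMail(long id, EnclaveMail mail, String userRoute) {
            // In the full app the body is a Kryo-serialized Message; a plain string keeps this sketch short
            int bid = Integer.parseInt(new String(mail.getBodyAsBytes()));
            userBids.put(userRoute, bid);
            userKeys.put(userRoute, mail.getAuthenticatedSender());
        }
    }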


We can use the receiveMail method to receive mail sent to the enclave. The userRoute parameter helps map mail to different clients. We keep two maps to store the bids and the senders’ public keys respectively. Note that Message is a user-defined model object used to transfer data in the full source code.

Conclave uses the GraalVM Native Image JVM to run enclave programs, which doesn’t support Java serialization. It does, however, support Kryo, so we use Kryo for serialization.

Once all the bids have been submitted, the auctioneer can ask the enclave to process the bids and send the result to all participants. The processing code is outlined below; the complete source is linked at the end of this post.
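A sketch of that processing step, assuming the enclave-side postOffice(...) and postMail(...) calls from the Conclave 1.x API; the auctionAdmin bookkeeping that remembers the auctioneer’s own key and routing string is omitted for brevity:

    // Inside AuctionEnclave: invoked when the auctioneer (auctionAdmin) asks for the result
    private void processBids() {
        int highestBid = Integer.MIN_VALUE;
        for (int bid : userBids.values()) {
            highestBid = Math.max(highestBid, bid);
        }
        byte[] result = ("Winning bid: " + highestBid).getBytes();
        // Send the result to every participant, encrypted for each of them individually
        for (Map.Entry<String, PublicKey> entry : userKeys.entrySet()) {
            sendMail(entry.getValue(), entry.getKey(), result);
        }
    }

    // Encrypt a message for a recipient with the enclave's post office and hand it to
    // the host, which forwards it to the client identified by the routing string
    private void sendMail(PublicKey recipient, String route, byte[] body) {
        byte[] encrypted = postOffice(recipient).encryptMail(body);
        postMail(encrypted, route);
    }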


We use auctionAdmin to store the auctioneer’s key and routing string, which are used later to send the result back to the auctioneer. Conclave provides a post office feature for creating mail for communication between the enclave and its clients; it is used in the sendMail method.

That completes our enclave; let’s look at building our host component next.

Host

The host program loads the enclave and also serves as a proxy between the client and the enclave. Hosts are considered untrusted at all times, and hence all communication that passes through the host to the enclave is always encrypted.


The first thing we do as we initialize our host is verify that the hardware supports running enclaves.
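Something along these lines, assuming the EnclaveHost.checkPlatformSupportsEnclaves call from Conclave 1.x (passing true asks Conclave to try switching SGX on if it is currently disabled):

    // In the host's main method
    try {
        EnclaveHost.checkPlatformSupportsEnclaves(true);
        System.out.println("This platform supports enclaves.");
    } catch (EnclaveLoadException e) {
        System.out.println("This platform cannot run hardware enclaves: " + e.getMessage());
    }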


Once we have verified the platform support, we can go ahead and load the enclave program.
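A sketch of loading and starting the enclave, assuming Conclave 1.x’s EnclaveHost.load and the start(attestationParameters, callback) overload; the enclave class name, the attestation parameters (which depend on the mode you run in) and the sendToClient helper are placeholders for this tutorial’s own code:

    // Load the enclave by class name and start it. The callback is invoked with
    // MailCommand objects whenever the enclave posts mail destined for a client.
    EnclaveHost enclave = EnclaveHost.load("com.example.auction.AuctionEnclave");
    enclave.start(new AttestationParameters.DCAP(), commands -> {
        for (MailCommand command : commands) {
            if (command instanceof MailCommand.PostMail) {
                MailCommand.PostMail post = (MailCommand.PostMail) command;
                // Forward the encrypted reply to the client identified by the routing hint
                sendToClient(post.getRoutingHint(), post.getEncryptedBytes());
            }
        }
    });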


When the enclave is started, we supply a callback which is used to send enclave responses back to the client. Each MailCommand object contains the response content and the routing parameter that maps responses to different clients.

Once the enclave has been started, we can spin up a TCP server to accept client requests.
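For example, using plain java.net sockets (the port number is an arbitrary choice):

    // Accept a single client connection for this example
    ServerSocket acceptor = new ServerSocket(9999);
    Socket connection = acceptor.accept();
    DataInputStream fromClient = new DataInputStream(connection.getInputStream());
    DataOutputStream toClient = new DataOutputStream(connection.getOutputStream());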


We are using a simple TCP connection for communication between the host and the client, purely for simplicity. You could use a more sophisticated protocol such as gRPC, or whatever suits you better.

All clients must verify the authenticity of the enclave before sending it confidential information, so we send the attestation object to clients as they connect to the host. Clients can use this information to verify the measurement and decide whether they can trust the enclave.
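Something like the following, using the serialize() method on the EnclaveInstanceInfo returned by the host:

    // Send the serialized remote attestation so the client can verify the enclave
    // before it submits a bid
    byte[] attestationBytes = enclave.getEnclaveInstanceInfo().serialize();
    toClient.writeInt(attestationBytes.length);
    toClient.write(attestationBytes);
    toClient.flush();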


Finally, the clients can send their confidential information, which the host then forwards to the enclave.
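A sketch of that forwarding step, assuming the Conclave 1.x deliverMail(id, bytes, routingHint) signature; the routing hint is simply an identifier the host chooses for this client:

    // Read the client's encrypted mail from the socket and hand it to the enclave
    byte[] mailBytes = new byte[fromClient.readInt()];
    fromClient.readFully(mailBytes);
    enclave.deliverMail(1, mailBytes, "bidder-1");  // "bidder-1" is a placeholder routing hint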


You could take a look at the final code here.

Client

Now we are left with the final piece of the puzzle to complete our first application on Conclave. Let’s build our client.


First we try to establish a connection with the host and get the attestation object.
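For example, mirroring the host’s simple TCP protocol (host name and port are placeholders):

    // Connect to the host and read the serialized attestation it sends on connect
    Socket socket = new Socket("localhost", 9999);
    DataInputStream fromHost = new DataInputStream(socket.getInputStream());
    DataOutputStream toHost = new DataOutputStream(socket.getOutputStream());

    byte[] attestationBytes = new byte[fromHost.readInt()];
    fromHost.readFully(attestationBytes);
    EnclaveInstanceInfo attestation = EnclaveInstanceInfo.deserialize(attestationBytes);
    System.out.println(attestation);  // printed for inspection only in this tutorial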


For the purpose of this blog I have just printed the attestation info to the console. For real applications, however, the attestation should be verified before sending any information to the enclave.

In real-world use cases, the client would either have access to the enclave’s source code, which they could build to reproduce the measurement, or use a trusted service provider to verify the attestation information.

The next step is to send the bid to the enclave.
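A sketch of that step, assuming the PostOffice returned by EnclaveInstanceInfo.createPostOffice() in Conclave 1.x; as above, the bid is sent as a plain string rather than the Kryo-serialized Message used in the full source:

    // Read the bid from the console, encrypt it for the enclave, and send it via the host
    Scanner scanner = new Scanner(System.in);
    System.out.print("Enter your bid: ");
    String bid = scanner.nextLine();

    PostOffice postOffice = attestation.createPostOffice();
    byte[] encryptedBid = postOffice.encryptMail(bid.getBytes());
    toHost.writeInt(encryptedBid.length);
    toHost.write(encryptedBid);
    toHost.flush();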


Notice that we use the post office feature again to create our mail to be sent to the enclave. To get the bid, we read input from the user on the console.


Finally, we need to write the code for reading the response from the enclave. We can use the post office’s decryptMail to decrypt the encrypted message from the enclave and read the response.
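Roughly as follows, reusing the same post office that encrypted the bid:

    // Read the encrypted result forwarded by the host and decrypt it inside the client
    byte[] encryptedReply = new byte[fromHost.readInt()];
    fromHost.readFully(encryptedReply);
    EnclaveMail reply = postOffice.decryptMail(encryptedReply);
    System.out.println("Auction result: " + new String(reply.getBodyAsBytes()));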



Congratulations! We have successfully built our first application on Conclave.

Source code

The entire source code for this application we built is available here. If you wish to run the application, please follow our guide in the documentation here.

I hope you liked the tutorial and thanks for reading.

February 11, 2021

A Deep Dive into Conclave 1.0

Another day, another vulnerability, another hack. Losing control of critical personal data feels random and inevitable (here’s one very recent example).

It’d be great if we could trust IT service providers, but we can’t. Even if they’re totally respectable pillars of society who have only the best intentions, the difficulty of keeping networks secure means their good will isn’t enough.

The computer industry has spent many years researching solutions. One of them is confidential computing (formerly known as trusted computing). Software running on a server can prove its identity over the internet via what’s called a remote attestation. The attestation contains the hash of the program, the fact that it’s been tamper-proofed by the hardware and an encryption key. Software outside the so-called enclave cannot get in to access its secrets, devices outside the CPU are blocked by transparent memory encryption, and clients can communicate with the enclave using the key.

It’s a simple concept yet with it you can solve many problems that previously required intractably slow or complex cryptography:

  • Any computation that combines data from multiple parties and doesn’t want a trusted intermediary.
  • Proving to users how a software-as-a-service actually operates, so the privacy policy stops being just a bunch of words and becomes a hard guarantee.
  • Building decentralized services from peer-to-peer networks of untrusted nodes.
  • Blocking attacks on your servers by keeping the data the attackers want inside an enclave whilst pushing as much software as possible to the outside.

Only one problem: the idea is simple yet using it is hard.

Enter …. 🥁 …. Conclave 1.0.

But first, let’s briefly talk about how to design enclaves the right way.

The Zen Of Enclaves

As of January 2021 the best implementation of confidential computing is Intel SGX. It follows the UNIX philosophy of small programs, each doing one task well. An enclave is meant to be as small as possible and meant to do only one thing — computation on some data. Everything else should be kept outside the enclave: network handling, database management, monitoring, metrics, administration … all of it.

Some approaches to confidential computing don’t do this, and attempt to run an entire operating system and serving stack inside the enclave. This isn’t useless, but it’s also not really maximizing the benefits of the concept. There are two problems with it:

  1. Putting a vulnerable software stack inside a protected memory space doesn’t make it secure. Enclaves erect a hard border between software components so malicious or hacked software on one side can’t get into the other, but that’s no use if the software the enclave is protecting is itself vulnerable. One way to minimize vulnerabilities is to just minimize the amount of code in the protected space that’s handling attacker-controlled data.
  2. Remote attestation is a fundamental part of the concept. Users check what’s running before they upload their data. But, attestations just give you a SHA2 hash. To know what it means someone must audit the software that hashes to that value, and check that it really does what it claims to do. If your software stack changes every day the hash will change every day too so how can your users — or external auditors — possibly keep up?

Reflecting on the zen of enclaves we reach the following conclusions:

  • An enclave is a protected sub-module of your wider application, not an entire application or serving stack by itself.
  • An enclave is only core business logic that your users care about.
  • An enclave is a security weak point: coding errors inside the enclave render the protection useless.

Therefore plumbing — stuff that’s neither here nor there from your users’ perspective — should be kept outside the enclave. Upgrades to it won’t change the hash reported to clients and thus won’t imply any additional audit work. The enclave itself should be written with tools that help us avoid coding errors.

Enter Conclave

Conclave is a simple API that lets you take a module of your app and run it inside an enclave. It uses GraalVM, so supports any language that can run on Graal. Languages we’ve tested include Java, JavaScript and Kotlin, but also possible are R, Python, Ruby, C, C++, FORTRAN, Rust, Clojure and more. You apply a plugin for the Gradle build system. Then you structure your app to pass messages in and out of that module, instead of doing function calls.

A JVM is an excellent tool for writing enclaves because of its emphasis on combining performance and safety. Garbage collected and type safe code is provably free of memory management errors, which are still one of the most common ways software gets hacked. In Conclave we use the GraalVM Native Image JVM, which produces self-contained binaries with minimal memory usage.

There are always 3 components in any enclave-oriented application:

  1. The client
  2. The host
  3. The enclave

You can learn more about how these pieces interact in this article on Conclave’s architecture.

Conclave provides a client library that can be used to send and receive messages from the enclave. It works a little differently to how other enclave APIs do, so you can read about the justifications here and here.

Writing a simple app is straightforward. Follow the hello world tutorial to learn the additional steps required: mostly, this means configuring the build system and then checking the server-side code from the client using the Conclave API.

Conclave 1.0

Today we launched the first stable version of Conclave. It’s been in beta for a while, and during that time we’ve done usability studies on the API to ensure it’s understandable and flexible. It’s free for individuals and early-stage startups who open source their code, and pricing for everybody else starts low and grows only as your solution itself gains adoption. In other words it’s free to experiment and learn. The documentation is available here.

This new release adds to beta 4 the following enhancements:

  • A better API for mail.
  • Padding for mails to ensure that message sizes can’t act as a side channel. Different padding policies are provided: you can pick between a fixed min size, max seen so far, a moving average or a custom policy.
  • The java.nio.file API is now connected to an in-memory file system (see the sketch after this list). This enhances compatibility with libraries that expect to load data or configuration from files, whilst avoiding the complexities of running a full filesystem engine inside the enclave. For persistence use the mail-to-self pattern.
  • A new script is provided to run Gradle inside a Linux container, on macOS. This can simplify running tests against a fully compiled enclave (i.e. not using mock mode).
  • Enclaves are now locked by default, i.e. multi-threaded enclaves are now opt-in rather than opt-out. This ensures a malicious host can’t multi-thread an enclave that’s not thread-safe.
  • GraalVM has been upgraded to 20.3, improving performance and compatibility. An upgrade to 21.0 will come soon which will add support for Java object serialization.
  • Various usability enhancements, bug fixes, and other safer defaults.
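As a rough illustration of the java.nio.file point, here is a hypothetical enclave that uses the standard file API; the class name and path are illustrative, and the file exists only in the enclave’s in-memory file system:

    import com.r3.conclave.enclave.Enclave;

    import java.io.IOException;
    import java.io.UncheckedIOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;

    public class FileDemoEnclave extends Enclave {
        @Override
        protected byte[] receiveFromUntrustedHost(byte[] bytes) {
            try {
                // Written to Conclave's in-memory file system, never to the host's disk
                Path scratch = Paths.get("/scratch.bin");
                Files.write(scratch, bytes);
                return Files.readAllBytes(scratch);
            } catch (IOException e) {
                throw new UncheckedIOException(e);
            }
        }
    }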
Thanks to Shams Asari and Richard Brown. 

February 11, 2021

Introducing Conclave

R3 launched Conclave this week, a new platform that makes it easy to build privacy-protecting solutions, through the use of Confidential Computing. But what is Confidential Computing, why should you care and why is Conclave the platform to watch?

The data sharing dilemma

When you click “submit data” you lose control of your data. But what if there was a technological way to retain control?

Imagine you’re a hospital administrator whose mouse is hovering over the “submit” button. When you click that button some of your patients’ most sensitive healthcare records will be uploaded to a research firm which is performing a clinical trial. You have the patients’ consent; they desperately want to help advance medical science. But you’re still worried.

What would happen if a rogue employee at the research firm stole the data? What if the research firm is using your patients’ data in a way they didn’t agree to? It’s a scary thought.

And it’s not just hospitals. Organizations in all industries face this dilemma every single time they send sensitive data to a third party. The brutal reality is that the moment you send information to somebody else, you’ve just given up any technological control over it.

You’re reliant on privacy policies, goodwill, contracts and law. Those privacy policies that nobody reads? That’s the only thing protecting your most sensitive information when you send it somewhere else!

This is truly terrifying when you think about it. Yet we somehow regard it as normal: “It’s just the way computers work.”

Ultimately, when you send data to somebody else’s computer, you’ve handed over full control to them. They could change the algorithms that are running and you would never know. They could take a copy of your data and use it for their own purposes and you would never know. They could give you false results and you would never know.

But what if there was a way that you could know? What if you could know, for sure, what algorithm your service provider was running? What if you could know if it had changed? And what if you could know that your data remains protected at all times so the service provider could not observe or copy it even if they wanted to?

It turns out you now can!

Confidential Computing: protect your data when in someone else’s hands

A new technology called “Confidential Computing” provides the answer.

Confidential Computing allows us to imagine a new weapon in our privacy arsenal: the ability to protect your data even whilst it’s in the hands of somebody else.

Imagine if the hospital administrator could examine the algorithms that the research firm will use before they upload their data, and that not even a rogue employee in the IT department of the research firm could see the patient data or change those algorithms.

Confidential Computing lets us imagine a future where you know, for sure, what will happen to your information after you click the ‘upload’ button.

And this technology isn’t limited to doctors and hospitals. Imagine a multi-national bank can now share data for analytics between its branches located in different jurisdictions, all without worrying about violating privacy laws. Or a group of insurers can detect fraudulent claims by cross-checking against each other’s private client data, yet never revealing what that data actually is.

That’s the future that Confidential Computing techniques allow us to imagine.

But there’s a problem: the technology is hard to use and few people even know it exists. 

After all, did you know this future was a possibility before you read the paragraphs above?!

Conclave makes Confidential Computing easy

This is why R3 developed Conclave: to make it easy to use this technology.

Conclave makes it possible for regular developers to write applications that their users can remotely ‘audit’. And Conclave makes it easy for these users to actually wield that power. Conclave makes the promise of “know exactly what will happen to your data when it’s in somebody else’s hands” a reality.

And we at R3 are working with our network of clients and partners to bring Conclave-enabled solutions rapidly to market so everybody can see for themselves how powerful this new technology could be.

Indeed, the insurance scenario I described above isn’t just an idea; ClaimShare is using Conclave to do just that!

Conclave is available as a 60 day trial if you’re an established firm. And if you’re an individual or an early-stage startup, you can use Conclave for free. The new world of Confidential Computing is so big and so under-explored that we at R3 want as many innovators as possible to join us in exploring the potential of this new technological super-power. All we ask is that if you take up the free option after any trial, you share what you learn by open sourcing your apps so others can follow in your footsteps. The potential of Confidential Computing is awe-inspiring. And Conclave is the key to mass adoption. Join us to make this new world a reality!

February 11, 2021

Conclave: Secure Confidential Computing

You might have heard the phrase “Data is the new currency”, and it is very true indeed. Over the past decade, we have been inventing technologies that use data to benefit humans in ways previously unimaginable. Data is precious and should be protected from misuse in every way possible. While we have ways to protect data from unauthorized access, something that has been neglected is the misuse of data by authorized personnel. In many cases we simply rely on trusting some person or organization to be honest and to handle sensitive data properly. This is not a great idea and leads to a lot of issues.

Let’s take the example of a tender processing use case. Suppose an organization issues an RFP (Request for Proposal) to invite bids for a particular project. While the interested parties submit their bids and proposals confidentially, they are still at the mercy of the person or organization handling the process to keep those bids confidential. It is entirely possible that someone with access to this confidential information might leak it for their own benefit.

This definitely seems to be a huge problem. Someone needs to have access to the data to be able to process it. The only possible way to protect data in such cases is to process it without revealing it. But is it even possible?

Yes, it is! Let me introduce you to Conclave.

What exactly is Conclave?

Conclave is an application development platform that can be used to build enclaves. In simple terms, an enclave is a small piece of software that runs in an isolated region of memory. Access to this region of memory is blocked for everyone, even privileged software like the kernel and BIOS. Thus code and data in the enclave can’t be read or tampered with by anyone, not even the owner of the computer on which it runs.

Enclaves require hardware support; Intel SGX (Software Guard Extensions) is one implementation of enclave-oriented computing. Conclave builds on SGX to give developers a toolkit for building enclaves using high-level languages like Java.

While SGX-enabled hardware is required to run apps in production, it’s not essential for development. You can run your application in simulation mode, which doesn’t require SGX hardware. Learn about the different enclave modes here: http://docs.conclave.net/tutorial.html#enclave-modes

Thus multiple parties can use a Conclave app to solve a multi-party computation problem without worrying about the data being compromised. Data is encrypted and sent to the enclave, where it is decrypted and processed, and the result is sent back. No one has access to the private data other than the enclave. And since enclaves are loaded into a protected memory space which can’t be accessed, the data can’t be tampered with.

Getting to know a Conclave powered app

Before we start building our first application on Conclave, we need to understand some of the basics so that we know how to design the app.

An app built on Conclave has 3 major components:

  • Enclave
  • Host
  • Client

Enclaves are the programs that are loaded in the protected memory space.

Hosts are programs responsible for loading enclaves and providing the resources an enclave requires. They mostly act as a proxy between the client and the enclave. Hosts are untrusted and are assumed to be malicious at all times, hence communication passing through the host to the enclave is encrypted.

Clients send encrypted data to the enclave for processing via the host. Conclave comes with the Mail API to ease the communication between enclaves and clients.

Clients don’t communicate with the enclave directly. They send encrypted messages to the host, and the host forwards them to the enclave for processing. Enclaves have their own key pair which is used for encryption. Thus, although data is transferred via the host, the host can’t read or tamper with it, since only the enclave can decrypt the messages using its private key.

But how does a client trust that a public key actually belongs to an enclave and not something pretending to be an enclave? To handle this issue, something called remote attestation is used.

Remote Attestation

Remote attestation is a piece of data that contains important information that can be used to verify an enclave. Among other things, it contains something called a measurement. A measurement is a unique hash, generated using a special tool, of the fat JAR that’s loaded into the enclave. The measurement can be verified by compiling the enclave source code, since Conclave ensures that multiple builds of the same source code produce the same measurement.

This approach, however, can get a little complicated across upgrades, so a signing key can be used as an alternative: the enclave is signed with a specific key, and information about that key is included in the remote attestation.

[Diagram: enclave application components interaction]

In addition to checking the remote attestation, clients can also request an assessment of the enclave from Intel to verify that it is secure.


That should give you a brief idea of what Conclave is and how you could benefit from building on it.

We will look at how to build your first application on Conclave in my next blog. Stay tuned, and thanks for reading!

January 21, 2021

Protect your Crown Jewels with Conclave

In 1671, a man named Thomas Blood almost managed to steal the Crown Jewels.

He achieved this by befriending the keeper of the jewels, Talbot Edwards. After gaining his trust, Blood convinced Edwards to let him into the Jewel House to show the jewels to Blood and his companions. Once the thieves had been let in, they knocked Edwards unconscious and took the jewels from right under his nose!

Why did Blood choose to steal the jewels in this way? What were his alternatives? He could have attempted to break into the Jewel House while the jewels were unattended. Alternatively, he could have waited until the jewels had to be moved, and attempted to steal them while they were being transported to their destination. In both scenarios, the jewels would have been under strong protection. It would have been nearly impossible to break into the Jewel House, and the jewels were likely to be guarded very closely while in transit. Instead, Blood took advantage of the trust he had gained to try to steal the jewels while they were most vulnerable: when the majority of their protection had been lifted.

Data exists in one of 3 states: at rest in local storage, in transit between two locations, and in use when it is being processed by applications. Jewels unattended in the vault are like data at rest, and jewels being transported to a new location can be thought of as data in transit. In both of these scenarios, modern encryption makes accessing the data infeasible.

When we want a third party to do something with our data, we have to decrypt it to allow them to use it. Third party services allow us to make the most of our data by providing software and hardware that lets us process it, but this comes with associated risk. Our data, like the jewels, is most vulnerable while in use.

Knowing this, how can we protect data that is in use while still allowing third parties to help us generate insight from it? Is there a way to leverage the power of cloud computing but protect our data from cloud providers? How can we ensure that the people using our data will not be able to misuse it in this vulnerable state?

What if Edwards had been able to place a secure, tamperproof barrier in front of the Crown Jewels, even while they were being viewed? What if he could have kept Blood bound to his promise that he only intended to view the jewels and nothing more?

In software, we can achieve this using secure enclaves, which create a trusted environment that is isolated from the operating system that hosts them. Enclaves can access and perform computations on data, while the data inside remains encrypted to the host operating system. We can cryptographically assert exactly what code is running inside our enclaves so we know that there will be no surprises.