Security and privacy are arguably the most significant concerns for enterprises and consumers using public cloud platforms. The Confidential Computing group at Microsoft Research Cambridge has been conducting pioneering research in the design of systems, hardware, and tools that guarantee strong security and privacy properties to cloud users.
As part of our research, we are designing
- new security primitives in general purpose processors and specialized accelerator hardware
- languages and language runtimes that guarantee safety by construction
- new services such as databases, blockchains, and machine learning platforms that use secure hardware and runtimes to guarantee strong security properties
- leakage analysis and mitigation techniques for privacy-preserving machine learning pipelines
We place strong emphasis on designing systems that can be deployed at scale in the real world. We work very closely with product groups at Microsoft such as Azure and SQL Server, and we have contributed to deployed services such as Azure Confidential Computing, CCF (Confidential Consortium Framework), Open Enclave, Azure ML, and Always Encrypted.
Responsibilities
Propose, explore, and analyse new ideas in security, safety, and privacy domains through theoretical, practical, and application-level contributions.
You can find more information about our group through our publications and projects (Confidential Computing - Microsoft Research), including Azure Confidential Computing, CCF, Confidential AI, and Everest.
Qualifications
Students enrolled in a PhD program or outstanding undergraduate/Master’s students with research experience.
One or two papers at top security conferences (e.g., USENIX, CCS, S&P, NDSS), or papers focusing on security, safety, or privacy that appear at top machine learning (e.g., NeurIPS, ICML, ICLR), systems (e.g., ASPLOS, OSDI, SOSP), or programming language (e.g., PLDI, POPL, OOPSLA) venues, are strongly desired.
Letter of support from a supervisor and, optionally, from another senior collaborator.