Surveillance First: How Not to Write a Contact Tracing App
Publish date: Apr 30, 2020
How do we convince everyone it is safe? Add blockchain.
NOTE: This post is not a critique of contact tracing in general; it is specific to contact tracing implementations that ignore better and more transparent user privacy mechanisms already available. Using blockchain doesn’t fix privacy/security issues such as GPS tracking, phone number collection, or infrastructure secrets posted in public repositories. Transparent design and rigorous review are a must to build public trust.
Introduction
Yet another contact tracing app is out. Touted as a privacy-first COVID-19 app, TRACY claims to help track the spread of COVID-19 without compromising privacy. There is a whitepaper and a public GitHub repository.
Let’s take a quick look under the hood to see what the app does and how it makes everyone unsafe.
Actors
The whitepaper defines three actors:
- Citizens
- Authorities, defined very broadly to include medical staff, police, customs, and “other registered volunteers”
- Control room, which is basically a dashboard to track everybody, see their locations, and check whether they have breached quarantine
Types of PII collected
Tons of PII is collected.
- Social media accounts - the whitepaper suggests the user can log in via social media accounts such as LinkedIn, Facebook, and Twitter.
- The source code in the repository shows that name, email, and mobile number are required to register. The mobile number is verified using OTP.
- Location data - the source code shows that the highest possible GPS accuracy is used to pinpoint the user.
- Recurrent location tracking every three hours in the background (see the sketch after this list for how little it takes to turn such a trace into a home address)
- User is asked to input personal health symptoms
- Emergency contact information is also collected
- The app has access to the user’s camera and microphone. The user is required to take regular selfies if quarantined.
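To make the risk concrete: with fix-level GPS recorded every few hours, re-identifying a user is almost mechanical. Below is a minimal, purely illustrative Python sketch (none of this is from the app’s code, and the coordinates are made up) that recovers a likely home location simply by averaging the night-time fixes in such a trace.

```python
from datetime import datetime

# illustrative (timestamp, latitude, longitude) fixes, one every few hours
fixes = [
    ("2020-04-20T02:00:00", 12.9716, 77.5946),
    ("2020-04-20T05:00:00", 12.9717, 77.5947),
    ("2020-04-20T14:00:00", 12.9352, 77.6245),
    ("2020-04-21T02:30:00", 12.9715, 77.5945),
]

def likely_home(fixes):
    """Average the night-time fixes: they almost always cluster at home."""
    night = []
    for ts, lat, lon in fixes:
        hour = datetime.fromisoformat(ts).hour
        if hour >= 22 or hour < 6:
            night.append((lat, lon))
    lat = sum(p[0] for p in night) / len(night)
    lon = sum(p[1] for p in night) / len(night)
    return lat, lon

print(likely_home(fixes))  # ~ (12.9716, 77.5946): the user's home, to street level
```

A handful of overnight fixes is enough; no clever attack is needed, just averaging.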
Lack of Data Protection
- Google’s cloud platform appears to be used to upload and store all the data, “signed and encrypted”.
- The data is touted as signed and immutable; that sounds great, yet it is perplexing: signing erodes privacy, because it strips the user of any ability to repudiate the collected data.
- The data is touted as encrypted, yet if it were encrypted by the user’s app, the authorities could not pull location data for all users whenever they want (a client-side alternative is sketched below). Clearly, if there is any encryption, it is decryptable on the server side, which means no real protection for the user.
- The data seems to be kept indefinitely - a great tool for authorities to track all users’ past movements, even beyond the possible infection window.
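For contrast, here is a minimal sketch of what genuine client-side protection could look like, assuming PyNaCl and a hypothetical health-authority keypair: each report is sealed to the authority’s public key on the device, so the cloud backend only ever stores ciphertext it cannot read. Nothing in the repository suggests TRACY works this way.

```python
from nacl.public import PrivateKey, SealedBox

# hypothetical health-authority keypair; the private key stays offline,
# only the public key ships inside the app
authority_private = PrivateKey.generate()
authority_public = authority_private.public_key

# on the device: encrypt the report before it ever leaves the phone
report = b'{"symptoms": ["fever"], "day": "2020-04-20"}'
ciphertext = SealedBox(authority_public).encrypt(report)

# the server stores only ciphertext; decryption requires the offline private key
assert SealedBox(authority_private).decrypt(ciphertext) == report
```

With this design, “encrypted” actually means the operator of the cloud project cannot browse the data at will.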
Linkage Attacks
- The app collects PII.
- The app doesn’t rotate identifiers regularly.
- The app keeps data indefinitely.
Together, these make linkage attacks trivial: stable identifiers plus a permanent location history let anyone with partial access join records back to real people and reconstruct their movements. The identifier rotation the app skips is sketched below.
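By contrast, decentralized exposure-notification designs broadcast short-lived pseudonyms instead of a stable identity. A minimal Python sketch of that idea, using an on-device daily key and HMAC (the interval length and key size here are illustrative assumptions, not any specific protocol):

```python
import hashlib
import hmac
import secrets
import time

ROTATION_SECONDS = 15 * 60  # rotate the broadcast identifier every 15 minutes

def new_daily_key() -> bytes:
    # fresh random key per day; it never leaves the device
    return secrets.token_bytes(16)

def ephemeral_id(daily_key: bytes, now: float) -> bytes:
    # derive a short-lived pseudonym from the daily key and the current interval;
    # observers see unlinkable 16-byte values instead of a stable user identity
    interval = int(now // ROTATION_SECONDS)
    return hmac.new(daily_key, interval.to_bytes(8, "big"), hashlib.sha256).digest()[:16]

key = new_daily_key()
print(ephemeral_id(key, time.time()).hex())
```

Roughly speaking, designs like DP-3T/GAEN only ever publish the daily keys of confirmed cases, so other devices can check for matches locally; nothing else is linkable across days or across users.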
Infrastructure Insecurity
- Infrastructure secrets are stored in the public GitHub repository (the standard alternative is sketched after this list).
- Using those credentials, anyone can dump the notification tokens for all users and send arbitrary notifications to any or all phones.
- One can only assume similar insecurity in the cloud configuration, as well as in the dashboard and the authorities’ apps, which would have access to everything.
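None of this is hard to avoid. A minimal sketch, assuming a Python backend and hypothetical variable names, of the usual pattern: credentials live in the deployment environment (or a secret manager), never in the source tree, and the service refuses to start without them.

```python
import os
import sys

def require_secret(name: str) -> str:
    # read a credential from the environment instead of the source tree;
    # in production this would typically come from a secret manager
    value = os.environ.get(name)
    if not value:
        sys.exit(f"missing required secret: {name}")
    return value

# hypothetical names; the point is that no value ever lives in the repo
PUSH_API_KEY = require_secret("PUSH_API_KEY")
DB_PASSWORD = require_secret("DB_PASSWORD")
```

Combined with revoking the already-leaked credentials, this is the bare minimum before asking an entire population to install the app.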
References
- Privacy-first COVID-19 app TRACY chooses Matic Network, https://medium.com/moibit/privacy-first-covid-19-app-tracy-chooses-matic-network-633f1bba4c66
- Tracy App - Whitepaper, https://gettracyapp.netlify.app/assets/files/TRACY-Whitepaper_NoCopy.pdf
- Tracy Mobile App - GitHub, https://github.com/moibit/tracy-mobile-app/