
Quarterly Updates

We post a newsletter-y update quarterly on security-dev@chromium.org. It's an open list, so subscribe if you're interested in updates, discussion, or feisty rants related to Chromium security.


Q1 2017


Greetings and salutations,


It's time for another update from your friends in Chrome Security, who are hard at work trying to make Chrome the most secure platform to browse the Internet. Here’s a recap from last quarter:


Our Bugs-- effort aims to find (and exterminate) security bugs. To get bugs fixed faster, we released a new tool that improves the developer experience when reproducing ClusterFuzz bugs. We have overhauled a significant part of the ClusterFuzz UI, which now features a new fuzzer statistics page, a crash statistics page, and a fuzzer performance analyzer. We’ve also continued to improve our OSS-Fuzz offering, adding numerous features requested by developers and reaching the 1,000-bug milestone with 47 projects in just five months since launch.


Members of the Chrome Security team attended the 10th annual Pwn2Own competition at CanSecWest. While Chrome was again a target this year, no team was able to demonstrate a fully working chain to Windows SYSTEM code execution in the time allowed!


Bugs still happen, so our Guts effort builds in multiple layers of defense. Chrome 56 takes advantage of Control Flow Guard (CFG) on Windows for Microsoft system DLLs inside Chrome.exe processes. CFG makes exploiting memory corruption vulnerabilities more challenging by limiting valid indirect call targets, and is available from Windows 8.1 Update 3 onward.


Site Isolation makes the most of Chrome's multi-process architecture to help reduce the scope of attacks.  The big news in Q1 is that we launched --isolate-extensions to Chrome Stable in Chrome 56! This first use of out-of-process iframes (OOPIFs) ensures that web content is never put into an extension process. To maintain the launch and prepare for additional uses of OOPIFs, we fixed numerous bugs, cleaned up old code, reduced OOPIF memory usage, and added OOPIF support for more features (e.g., IntersectionObserver, and hit testing and IME on Android). Our next step is expanding the OOPIF-based <webview> trial from Canary to Dev channel and adding more uses of dedicated processes.


Beyond the browser, our web platform efforts foster cross-vendor cooperation on developer-facing security features. Over the holidays, Google's security team gave us a holiday gift consisting entirely of interesting ways to bypass CSP's nonces. We've fixed some obvious bugs they uncovered, and we'll continue working with other vendors to harden the spec and our implementations. In other CSP news, we polished a mechanism to enforce CSP on child frames, shipped a `script-sample` property in CSP reports, and allowed hashes to match external scripts. We're also gathering data to support a few dangling markup mitigations, and dropped support for subresource URLs with embedded credentials and legacy protocols.
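

To make the hash-matching change concrete, here is a minimal sketch of computing a CSP hash source for an external script. The file name and policy are illustrative only, and the corresponding script tag also needs matching integrity metadata; this is not Chrome code.

    import base64
    import hashlib

    # Hypothetical script we want to allow without a nonce or host allowlist.
    with open("widget.js", "rb") as f:
        script_bytes = f.read()

    # A CSP hash source is the base64-encoded digest of the exact script bytes.
    digest = base64.b64encode(hashlib.sha256(script_bytes).digest()).decode()
    print("Content-Security-Policy: script-src 'sha256-{0}'".format(digest))

    # With hashes allowed to match external scripts, a tag such as
    #   <script src="/widget.js" integrity="sha256-...">
    # can execute when its integrity metadata matches the policy above.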


We also spend time building security features that users see. To protect users from Data URI phishing attacks, Chrome shows the “not secure” warning on Data URIs and intends to deprecate and remove content-initiated top-frame navigations to Data URIs. We also brought AIA fetching to Chrome for Android, and early metrics show over an 85% reduction in the fraction of HTTPS warnings caused by misconfigured certificate chains on Android. We made additional progress on improving Chrome’s captive portal detection. Chrome now keeps precise attribution of where bad downloads come from, so we can catch malware and UwS earlier. Chrome 57 also saw the launch of a secure time service, for which early data shows detection of bad client clocks when validating certificates improving from 78% to 95%.


We see migration to HTTPS as foundational to any web security whatsoever, so we're actively working to drive #MOARTLS across Google and the Internet at large. To help people understand the security limitations of non-secure HTTP, Chrome now marks HTTP pages with passwords or credit card form fields as “not secure” in the address bar, and is experimenting with in-form contextual warnings. We’ll remove support for EME over non-secure origins in Chrome 58, and we’ll remove support for notifications over non-secure origins in Chrome 61. We talked about our #MOARTLS methodology and the HTTPS business case at Enigma.


In addition to #MOARTLS, we want to ensure more secure TLS through work on protocols and the certificate ecosystem. TLS 1.3 is the next major version of the Transport Layer Security protocol. In Q1, Chrome attempted the first significant deployment of TLS 1.3 by a browser. Based on what we learned from that experiment, we hope to fully enable TLS 1.3 in Chrome in Q2.


In February, researchers from Google and CWI Amsterdam successfully mounted a collision attack against the SHA-1 hash algorithm. SHA-1 had been known to be weak for a very long time, and in Chrome 56 we dropped support for website certificates that use SHA-1. This was the culmination of a plan first announced back in 2014, which we've updated a few times since.


As ever, many thanks to all those in the Chromium community who help make the web more secure!


Cheers


Andrew, on behalf of the Chrome Security Team


For more thrilling security updates and feisty rants, subscribe to security-dev@chromium.org. You can find older updates at https://dev.chromium.org/Home/chromium-security/quarterly-updates.


Q4 2016


Greetings and salutations,


It's time for another update from your friends in Chrome Security, who are hard at work trying to make Chrome the most secure platform to browse the Internet. Here’s a recap from the last quarter of 2016:


Our Bugs-- effort aims to find (and exterminate) security bugs.


We announced OSS-Fuzz, a new Beta program developed over the past few years with the Core Infrastructure Initiative community. This program provides continuous fuzzing for select core open source software. See the full blog post here. So far, more than 50 projects have been integrated with OSS-Fuzz, and we've found ~350 bugs.


Security bugs submitted by external researchers can receive cash money from the Chrome VRP.


Last year the Chrome VRP paid out almost one million dollars! More details in a blog post we did with our colleagues in the Google and Android VRPs.


Bugs still happen, so our Guts effort builds in multiple layers of defense.


Win32k lockdown for Pepper processes, including Adobe Flash and PDFium, was shipped to Windows 10 clients on all channels in October 2016. Soon after it was enabled, a Flash 0-day that used win32k.sys as a privilege escalation vector was discovered in the wild, and the mitigation successfully blocked it! James Forshaw from Project Zero also wrote a blog post about the process of shipping this new mitigation.


A new security mitigation for Windows 8 and above hit Stable in October 2016 (Chrome 54). This mitigation disables extension points (legacy hooking), blocking a number of third-party injection vectors. It is enabled on all child processes (see the CL chain). As usual, you can find the Chromium sandbox documentation here.


Site Isolation makes the most of Chrome's multi-process architecture to help reduce the scope of attacks.


Our earlier plan to launch --isolate-extensions in Chrome 54 hit a last minute delay, and we're now aiming to turn it on in Chrome 56. In the meantime, we've added support for drag and drop into out-of-process iframes (OOPIFs) and for printing an OOPIF. We've fixed several other security and functional issues for --isolate-extensions as well. We've also started an A/B trial on Canary to use OOPIFs for Chrome App <webview> tags, and we're close to starting an A/B trial of --top-document-isolation.


Beyond the browser, our web platform efforts foster cross-vendor cooperation on developer-facing security features.


After a good deal of experimentation, we (finally) tightened the behavior of cookies' `secure` attribute. Referrer Policy moved to a candidate recommendation, we made solid progress on Clear-Site-Data, and we expect to start an origin trial for Suborigins shortly.


Looking to the future, we've started to flesh out our proposal for stronger origin isolation properties, continued discussions on a proposal for setting origin-wide policy, and began working with the IETF to expand opt-in Certificate Transparency enforcement to the open web. We hope to further solidify all of these proposals in Q1.


We also spend time building security features that users see.


Our security indicator text labels launched in Chrome 55: “Secure” for HTTPS, “Not Secure” for broken HTTPS, and “Dangerous” for pages flagged by Safe Browsing. As part of our long-term effort to mark HTTP pages as non-secure, we built address-bar warnings into Chrome 56 to mark HTTP pages with password or credit card form fields as “Not secure”.


We see migration to HTTPS as foundational to any web security whatsoever, so we're actively working to drive #MOARTLS across Google and the Internet at large.


We added a new HTTPS Usage section to the Transparency Report, which shows how the percentage of Chrome pages loaded over HTTPS increases with time. We talked externally at O’Reilly Security NYC + Amsterdam and Chrome Dev Summit about upcoming HTTP UI changes and the business case for HTTPS. We published positive stories about HTTPS migrations.


In addition to #MOARTLS, we want to ensure more secure TLS.


We concluded our experiment with post-quantum key agreement in TLS. We implemented TLS 1.3 draft 18, which will be enabled for a fraction of users with Chrome 56.


And here are some other areas we're still investing heavily in:


Keeping users safe from Unwanted Software (UwS, pronounced 'ooze') and improving the Chrome Cleanup Tool, which has helped millions remove UwS that was injecting ads, changing settings, and otherwise blighting their machines.


Working on usable, understandable permissions prompts. We're experimenting with different prompt UIs, tracking prompt interaction rates, and continuing to learn how best to ensure users are in control of powerful permissions.


As ever, many thanks to all those in the Chromium community who help make the web more secure!


Cheers


Andrew, on behalf of the Chrome Security Team


For more thrilling security updates and feisty rants, subscribe to security-dev@chromium.org. You can find older updates at https://dev.chromium.org/Home/chromium-security/quarterly-updates.


Q3 2016

Greetings and salutations!


It's time for another update from your friends in Chrome Security, who are hard at work trying to make Chrome the most secure platform to browse the Internet. Here’s a recap from last quarter:


Our Bugs-- effort aims to find (and exterminate) security bugs.

We have continued to improve our libFuzzer and AFL integration with ClusterFuzz, which now includes automated performance analysis and quarantining of bad units (such as slow units and leaks). We have scaled our code coverage to ~160 targets with help from Chrome developers, who contributed these during the month-long Fuzzathon. We have improved our infrastructure reliability and response times by adding 24x7 monitoring, fixing more than two dozen fuzzers in the process. Finally, we have refined our crash bucketization algorithm and enabled automatic bug filing to remove human latency in filing regression bugs — long live the machines!


For Site Isolation, the first uses of out-of-process iframes (OOPIFs) have reached the Stable channel in Chrome 54!

We're using OOPIFs for --isolate-extensions mode, which ensures that web content is never put into a privileged extension process. In the past quarter, we made significant progress and fixed all our blocking bugs, including enabling the new session history logic by default and supporting cross-process POST submissions and IME in OOPIFs. We also fixed bugs in painting, input events, and many other areas. As a result, --isolate-extensions mode has been enabled for 50% of M54 Beta users and is turned on by default in M55. From here, we plan to further improve OOPIFs to support --top-document-isolation mode, Chrome App <webview> tags, and Site Isolation for real web sites.


We also spend time building security features that users see.

We overhauled Chrome’s site security indicators in Chrome 52 on Mac and Chrome 53 on all other platforms, including adding new icons for Safe Browsing. These icons were the result of extensive user research which we shared in a peer-reviewed paper. Lastly, we made recovering blocked-downloads much less confusing.


We like to avoid showing unnecessarily scary warnings when we can. We analyzed data from opted-in Safe Browsing Extended Reporting users to quantify the major causes of spurious TLS warnings, like bad client clocks and misconfigured intermediate certificates. We also launched two experiments, Expect-CT and Expect-Staple, to help site owners deploy advanced new TLS features (Certificate Transparency and OCSP stapling) without causing warnings for their users.


Beyond the browser, our web platform efforts foster cross-vendor cooperation on developer-facing security features.  

We continued to lock down the security of the web platform while also expanding capabilities to developers. We helped lock down cookies by starting to ship Strict Secure Cookies. Similarly, we also shipped the Referrer Policy spec and policy header. Content Security Policy was expanded with the strict-dynamic and unsafe-hashed-attributes directives. Our work on suborigins continued, updating the serialization and adding new web platform support.
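

For readers who haven't seen 'strict-dynamic' yet, here is a rough sketch of the intended deployment pattern; the nonce generation, header, and file name below are illustrative, not taken from Chrome or the spec text.

    import base64
    import os

    # Generate a fresh, unguessable nonce for each response.
    nonce = base64.b64encode(os.urandom(16)).decode()

    # 'strict-dynamic' trusts scripts that carry the matching nonce and lets
    # them load further scripts programmatically; host-based allowlists in
    # the same directive are ignored by browsers that support it.
    print("Content-Security-Policy: script-src 'nonce-{0}' 'strict-dynamic'".format(nonce))
    print('<script nonce="{0}" src="/app.js"></script>'.format(nonce))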


We've also been working on making users feel more in control of powerful permissions.

In M55 and M56 we will be running experiments on permission prompts to evaluate how they affect acceptance and decision rates. The experiments let users make temporary decisions, auto-deny prompts that users repeatedly ignore, and make permission prompts modal.


We see migration to HTTPS as foundational to any web security whatsoever, so we're actively working to drive #MOARTLS across Google and the Internet at large.

We announced concrete steps towards marking HTTP sites as non-secure in Chrome UI — starting with marking HTTP pages with password or credit card form fields as “Not secure” starting in Chrome 56 (Jan 2017). We added YouTube and Calendar to the HTTPS Transparency Report. We’re also happy to report that www.google.com uses HSTS!


In addition to #MOARTLS, we want to ensure more secure TLS.

We continue to work on TLS 1.3, a major revision of TLS. For current versions, we’re also keeping the TLS ecosystem running smoothly with a little grease. We have removed DHE-based ciphers and added RSA-PSS support. Finally, having removed RC4 from Chrome earlier this year, we’ve now removed it from BoringSSL’s TLS logic completely.


We launched a very rough prototype of Roughtime, a combination of NTP and Certificate Transparency. In parallel we’re investigating what reduction in Chrome certificate errors a secure clock like Roughtime could give us.


We also continued our experiments with post-quantum cryptography by implementing CECPQ1 to help gather some real world data.


As ever, many thanks to all those in the Chromium community who help make the web more secure!


Cheers


Andrew on behalf of the Chrome Security Team


Q2 2016


Greetings Earthlings,


It's time for another update from your friends in Chrome Security, who are hard at work trying to make Chrome the most secure platform to browse the Internet. Here’s a recap from last quarter:


Our Bugs-- effort aims to find (and exterminate) security bugs. At the start of the quarter, we initiated a team-wide Security FixIt to trim the backlog of open issues… a bit of spring cleaning for our issue tracker, if you will :) With the help of dozens of engineers across Chrome, we fixed over 61 Medium+ severity security bugs in 2 weeks and brought the count of open issues down to 22! On the fuzzing front, we’ve added support for AFL and continued to improve the libFuzzer-ClusterFuzz integration, both of which allow coverage-guided testing on a per-function basis. The number of libFuzzer-based fuzzers has expanded from 70 to 115, and we’re processing ~500 billion test cases every day! We’re also researching new ways to improve fuzzer efficiency and maximize code coverage (example). In response to recent trends from Vulnerability Reward Program (VRP) and Pwnium submissions, we wrote a new fuzzer for V8 builtins, which has already yielded bugs. Not everything can be automated, so we started auditing parts of Mojo, Chrome’s new IPC mechanism, and found several issues (1, 2, 3, 4, 5).


Bugs still happen, so our Guts effort builds in multiple layers of defense.  Many Android apps use WebView to display web content inline within their app. A compromised WebView can get access to an app’s private user data and a number of Android system services / device drivers. To mitigate this risk, in the upcoming release of Android N, we’ve worked to move WebView rendering out-of-process into a sandboxed process. This new process model is still experimental and can be enabled under Developer Options in Settings. On Windows, a series of ongoing stability experiments with App Container and win32k lockdown for PPAPI processes (i.e. Flash and pdfium) have given us good data that puts us in a position to launch both of these new security mitigations on Windows 10 very soon!


For Site Isolation, we're getting close to enabling --isolate-extensions for everyone.  We've been hard at work fixing launch blocking bugs, and out-of-process iframes (OOPIFs) now have support for POST submissions, fullscreen, find-in-page, zoom, scrolling, Flash, modal dialogs, and file choosers, among other features.  We've also made lots of progress on the new navigation codepath, IME, and the task manager, along with fixing many layout tests and crashes. Finally, we're experimenting with --top-document-isolation mode to keep the main page responsive despite slow third party iframes, and with using OOPIFs to replace BrowserPlugin for the <webview> tag.


We also spend time building security features that users see. We’re overhauling the omnibox security iconography in Chrome -- new, improved connection security indicators are now in Chrome Beta (52) on Mac and Chrome Dev (53) for all other platforms. We created a reference interstitial warning that developers can use for their implementations of the Safe Browsing API. Speaking of Safe Browsing, we’ve extended protection to cover files downloaded by Flash apps, we’re evaluating many more file types than before, and we closed several gaps that were reported via our Safe Browsing Download Protection VRP program.


Beyond the browser, our web platform efforts foster cross-vendor cooperation on developer-facing security features. We shipped an implementation of the Credential Management API (and presented a detailed overview at Google I/O), iterated on Referrer Policy with a `referrer-policy` header implementation behind a flag, and improved our support for SameSite cookies. We're continuing to experiment with Suborigins with developers both inside and outside Google, built a prototype of CORS-RFC1918, and introduced safety nets to protect against XSS vulnerabilities caused by browser bugs[1].


We've also been working on making users feel more in control of powerful permissions. All permissions will soon be scoped to origins, and we've started implementing permission delegation (which is becoming part of feature policy). We’re also actively working to show fewer permission prompts to users, and to improve the prompts and UI we do show... subtle but critical work that makes web security more human-friendly (and thus more effective).


We see migration to HTTPS as foundational to any web security whatsoever, so we're actively working to drive #MOARTLS across Google and the Internet at large. Emily and Emily busted HTTPS myths for large audiences at Google I/O and the Progressive Web App dev summit. The HSTS preload list has seen 3x growth since the beginning of the year – a great problem to have! We’ve addressed some growth hurdles by rewriting the submission site, and we’re actively working on the preload list infrastructure and on how to scale it further in the long term.
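

For context on what preloading asks of a site, here is a sketch of the kind of header a domain serves before submitting itself; the handler, port, and max-age value are illustrative, and the authoritative requirements live on the submission site itself.

    from http.server import BaseHTTPRequestHandler, HTTPServer

    class Handler(BaseHTTPRequestHandler):
        def do_GET(self):
            self.send_response(200)
            # Roughly what the preload submission site expects a domain to
            # serve (over HTTPS, with an HTTP-to-HTTPS redirect in front):
            # a long max-age, subdomain coverage, and an explicit opt-in.
            self.send_header("Strict-Transport-Security",
                             "max-age=63072000; includeSubDomains; preload")
            self.end_headers()
            self.wfile.write(b"hello\n")

    if __name__ == "__main__":
        # In practice this sits behind TLS termination; plain HTTP here is
        # only to keep the sketch self-contained.
        HTTPServer(("localhost", 8443), Handler).serve_forever()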


In addition to #MOARTLS, we want to ensure more secure TLS. Some of us have been involved in the TLS 1.3 standardization work and implementation. On the PKI front, and as part of our Expect-CT project, we built the infrastructure in Chrome that will help site owners track down certificates for their sites that are not publicly logged in Certificate Transparency logs. As of Chrome 53, we’ll be requiring Certificate Transparency information for certificates issued by Symantec-operated CAs, per our announcement last year. We also launched some post-quantum cipher suite experiments to protect everyone from... crypto hackers of the future and more advanced worlds ;)


For more thrilling security updates and feisty rants, subscribe to security-dev@chromium.org. You can find older updates at https://dev.chromium.org/Home/chromium-security/quarterly-updates.


Happy Hacking,

Parisa, on behalf of Chrome Security


[1] Please let us know if you manage to work around them!


Q1 2016


Greetings web fans,

The Bugs-- effort aims to find (and exterminate) security bugs. On the fuzzing front, we’ve continued to improve the integration between libFuzzer and ClusterFuzz, which allows coverage-guided testing on a per-function basis. With the help of many developers across several teams, we’ve expanded our collection of fuzzing targets in Chromium (that use libFuzzer) to 70! Not all bugs can be found by fuzzing, so we invest effort in targeted code audits too. We wrote a guest post on the Project Zero blog describing one of the more interesting vulnerabilities we discovered. Since we find a lot of bugs, we also want to make them easier to manage. We’ve updated our Sheriffbot tool to simplify the addition of new rules and expanded it to help manage functional bugs in addition to security issues. We’ve also automated the assignment of security severity recommendations. Finally, we continue to run our vulnerability reward program to recognize bugs discovered by researchers outside the team. As of M50, we’ve paid out over $2.5 million since the start of the reward program, including over $500,000 in 2015. Our median payment amount for 2015 was $3,000 (up from $2,000 for 2014), and we want to see that increase again this year!


Bugs still happen, so our Guts effort builds in multiple layers of defense.  On Android, our seccomp-bpf experiment has been running on the Dev channel and will advance to the Stable and Beta channels with M50.

Chrome on Windows is evolving rapidly in step with the operating system. We shipped four new layers of defense in depth to take advantage of the latest capabilities in Windows 10, some of which patch vulnerabilities found by our own research and feedback!  There was great media attention when these changes landed, from Ars Technica to a Risky Business podcast, which said: “There have been some engineering changes to Chrome on Windows 10 which look pretty good. … It’s definitely the go-to browser, when it comes to not getting owned on the internet. And it’s a great example of Google pushing the state of the art in operating systems.”


For our Site Isolation effort, we have expanded our on-going launch trial of --isolate-extensions to include 50% of both Dev Channel and Canary Channel users!  This mode uses out-of-process iframes (OOPIFs) to keep dangerous web content out of extension processes. (See here for how to try it.) We've fixed many launch blocking bugs, and improved support for navigation, input events, hit testing, and security features like CSP and mixed content.  We improved our test coverage and made progress on updating features like fullscreen, zoom, and find-in-page to work with OOPIFs. We're also excited to see progress on other potential uses of OOPIFs, including the <webview> tag and an experimental "top document isolation" mode.


We spend time building security features that people see. In response to user feedback, we’ve replaced the old full screen prompt with a new, lighter weight ephemeral message in M50 across Windows and Linux. We launched a few bug fixes and updates to the Security panel, which we continue to iterate on and support in an effort to drive forward HTTPS adoption. We also continued our work on removing powerful features on insecure origins (e.g. geolocation).


We’re working on preventing abuse of powerful features on the web. We continue to support great “permissions request” UX, and have started reaching out to top websites to directly help them improve how they request permissions for powerful APIs. To give top-level websites more control over how iframes use permissions, we started external discussions about a new Permission Delegation API. We also extended our vulnerability rewards program to support Safe Browsing reports, in a first program of its kind.


Beyond the browser, our web platform efforts foster cross-vendor cooperation on developer-facing security features.  We now have an implementation of Suborigins behind a flag, and have been experimenting with Google developers on usage. We polished up the Referrer Policy spec, refined its integration with ServiceWorker and Fetch, and shipped the `referrerpolicy` attribute from that document. We're excited about the potential of new CSP expressions like 'unsafe-dynamic', which will ship in Chrome 52 (and is experimentally deployed on our shiny new bug tracker). In that same release, we finally shipped SameSite cookies, which we hope will help prevent CSRF. Lastly, we're working to pay down some technical debt by refactoring our Mixed Content implementation and X-Frame-Options to work in an OOPIF world.
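

For anyone who hasn't used them yet, here is roughly what opting a session cookie into SameSite looks like on the wire; the cookie name and value are made up, and the sketch just prints the header a server would send.

    # Build the Set-Cookie header by hand to keep the sketch dependency-free.
    attributes = [
        "session=opaque-token",  # illustrative cookie name and value
        "Secure",                # only sent over HTTPS
        "HttpOnly",              # not readable from script
        "SameSite=Strict",       # never attached to cross-site requests,
                                 # which is what blunts classic CSRF
    ]
    print("Set-Cookie: " + "; ".join(attributes))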


We see migration to HTTPS as foundational to any security whatsoever (and we're not the only ones), so we're actively working to drive #MOARTLS across Google and the Internet at large. We worked with a number of teams across Google to help publish an HTTPS Report Card, which aims to hold Google and other top sites accountable, as well as encourage others to encrypt the web. In addition to #MOARTLS, we want to ensure more secure TLS. We mentioned we were working on it last time, but RC4 support is dead! The insecure TLS version fallback is also gone. With help from the libFuzzer folks, we got much better fuzzing coverage on BoringSSL, which resulted in CVE-2016-0705. We ended up adding a "fuzzer mode" to the SSL stack to help the fuzzer get past cryptographic invariants in the handshake, which smoked out some minor (memory leak) bugs.

Last, but not least, we rewrote a large chunk of BoringSSL's ASN.1 parsing with a simpler and more standards-compliant stack.


For more thrilling security updates and feisty rants, subscribe to security-dev@chromium.org. You can find older updates at https://dev.chromium.org/Home/chromium-security/quarterly-updates.


Happy Hacking,

Parisa, on behalf of Chrome Security


Q4 2015


Happy 2016 from the Chrome Security Team!


For those that don’t know us already, we do stuff to help make Chrome the most secure platform to browse the Internet. Here’s a recap of some work from last quarter:


The Bugs-- effort aims to find (and exterminate) security bugs. We’ve integrated libFuzzer into ClusterFuzz, which means we can do coverage-guided fuzz testing on a per-function basis. The result, as you may have guessed, is several new bugs. The Bugs-- team has a larger goal this year to help Chromium developers write a ClusterFuzz fuzzer alongside every unittest, and libFuzzer integration is an important step toward achieving that goal. Separately, we’ve made security improvements and cleanups in the PDFium codebase and fixed lots of open bugs. We also started some manual code auditing efforts and discovered several high-severity bugs (here, here, and here) and one critical-severity bug.


Bugs still happen, so our Guts effort builds in multiple layers of defense. On Android, we’re running an experiment that adds an additional seccomp-bpf sandbox to renderer processes, like we already do on Desktop Linux and Chrome OS. On Windows 8 (and above), a Win32k lockdown experiment has been implemented for PPAPI plugins including Flash and Pdfium to help reduce the kernel attack surface for potential sandbox escapes. Also on Windows 8 (and above), an AppContainer sandbox experiment has been introduced, which further reduces kernel attack surface and blocks network communication from renderers.


Our Site Isolation effort reached a large milestone in December: running trials of the --isolate-extensions mode on real Chrome Canary users! This mode uses out-of-process iframes to isolate extension processes from web content for security. (Give it a try!) The trials were made possible by many updates to session history, session restore, extensions, painting, focus, save page, popup menus, and more, as well as numerous crash fixes. We are continuing to fix the remaining blocking issues, and we aim to launch both --isolate-extensions and the broader Site Isolation feature in 2016.


We also spend time building security features that users see. The Safe Browsing team publicly announced a new social engineering policy, expanding Chrome’s protection against deceptive sites beyond phishing. One major milestone is the launch of Safe Browsing in Chrome for Android, protecting hundreds of millions of additional users from phishing, malware, and other web threats! This is on by default and is already stopping millions of attacks on mobile Chrome users. The next time you come across a Safe Browsing warning, you can search for the blocked website in the new Site Status section of the Transparency Report to learn why it’s been flagged by our systems. On the other hand, we’re also trying to show users fewer security warnings in the first place by decreasing our false positive rate for HTTPS warnings. We spent a large part of the quarter analyzing client errors that contribute to false alarm HTTPS errors; check out our Real World Crypto talk for more details.


Beyond the browser, our web platform efforts foster cross-vendor cooperation on developer-facing security features. We've made good progress with folks in the IETF to make some meaningful changes to cookies; cookie prefixes and locking down 'secure' cookies will be shipping shortly. Subresource Integrity and Mixed Content are trucking along the W3C Recommendation path, we've solidified our Suborigins proposal, and have our eyes on some new hotness like HSTS Priming, CSP3 bits and pieces, and limiting access to local network resources.
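

As a sketch of what those cookie changes mean for servers (names are illustrative and the semantics are paraphrased from the drafts, not Chrome code):

    # Cookie prefixes are naming conventions that the browser enforces:
    #   __Secure-  must be set from a secure origin and carry Secure.
    #   __Host-    additionally requires Path=/ and forbids a Domain
    #              attribute, pinning the cookie to exactly one host.
    # Locking down 'secure' cookies means a plain-HTTP response can no
    # longer set or overwrite a cookie marked Secure.
    print("Set-Cookie: __Secure-id=123; Secure; Path=/")
    print("Set-Cookie: __Host-csrf=456; Secure; Path=/")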


We see migration to HTTPS as foundational to any security whatsoever (and we're not the only ones), so we're actively working to drive #MOARTLS across Google and the Internet at large. We've continued our effort to deprecate powerful features on insecure origins by readying to block insecure usage of geolocation APIs. We also took to the stage at the Chrome Dev Summit to spread the word, telling developers about what we’re doing in Chrome to make deploying TLS easier and more secure.


In addition to more TLS, we want to ensure more secure TLS, which depends heavily on the certificate ecosystem. Via Certificate Transparency, we detected a fraudulent Symantec-issued certificate in September, which subsequently revealed a pattern of additional misissued certificates. Independent of that incident, we took proactive measures to protect users from a Symantec Root Certificate that was being decommissioned in a way that puts users at risk (i.e. no longer complying with the CA/Browser Forum’s Baseline Requirements). Other efforts include working with Mozilla and Microsoft to phase out RC4 ciphersuite support, and continuing the deprecation of SHA-1 certificates, which were shown to be even weaker than previously believed. To make it easier for developers and site operators to understand these changes, we debuted a new Security Panel that provides enhanced diagnostics and will continue to be improved with richer diagnostics in the coming months.


For more thrilling security updates and feisty rants, subscribe to security-dev@chromium.org. You can find older updates at https://dev.chromium.org/Home/chromium-security/quarterly-updates.


Happy Hacking,

Parisa, on behalf of Chrome Security


Q3 2015


Hello from the Chrome Security Team!

For those that don’t know us already, we do stuff to help make Chrome the most secure platform to browse the Internet. Here’s a recap of some work from last quarter:


The Bugs-- effort aims to find (and exterminate) security bugs. We’ve continued our collaboration with the Android Security team and now have a fully functional AddressSanitizer (ASan) build configuration of AOSP master (public instructions here). ClusterFuzz is helping the Android Security team triage and verify bugs, including incoming vulnerability reward submissions, and now supports custom APK uploads and the ability to launch commands. Back on the Chrome front, we’re working on enabling Control Flow Integrity (CFI) checks on Linux, which convert invalid vptr accesses into non-exploitable crashes; 8 bugs discovered so far! We’ve made numerous improvements to how we fuzz Chrome on Android with respect to speed and accuracy. We also made some progress toward our goal of expanding ClusterFuzz platform support to include iOS. In our efforts to improve Chrome stability, we added LeakSanitizer (LSan) to our list of supported memory tools, which has already found 38 bugs.


Bugs still happen, so our Guts effort builds in multiple layers of defense. Plugin security remains a very important area of work. With the final death of unsandboxed NPAPI plugins in September, we’ve continued to introduce mitigations for the remaining sandboxed PPAPI (Pepper) plugins. First, we implemented support for Flash component updates on Linux, a long-standing feature request, which allows us to respond to Flash 0-day incidents without waiting to qualify a new release of Chrome. We’ve also been spending time improving the code quality and test coverage of PDFium, the now open-source version of the Foxit PDF reader. In addition, we have been having some success with enabling Win32k syscall filtering on Windows PPAPI processes (PDFium and Adobe Flash). This makes it even tougher for attackers to get out of the Chromium Flash sandbox, and can be enabled on Windows 8 and above on the Canary channel right now by toggling the setting in chrome://flags/#enable-ppapi-win32k-lockdown.


We’ve been making steady progress on Site Isolation, and are preparing to enable out-of-process iframes (OOPIFs) for web pages inside extension processes. You can test this mode before it launches with --isolate-extensions.  We have performance bots and UMA stats lined up, and we'll start with some early trials on Canary and Dev channel.  Meanwhile, we've added support for hit testing in the browser process, scrolling, context menus, and script calls between all reachable frames (even with changes to window.opener).


Not all security problems can be solved in Chrome’s guts, so we work on making security more user-friendly too. To support developers migrating to HTTPS, starting with M46, Chrome is marking the “HTTPS with Minor Errors” state using the same neutral page icon as HTTP pages (instead of showing the yellow lock icon). We’ve started analyzing invalid (anonymized!) TLS certificate reports gathered from the field, to understand the root causes of unnecessary TLS/SSL warnings. One of the first causes we identified and fixed was certificate hostname mismatches due to a missing ‘www’. We also launched HPKP violation reporting in Chrome, helping developers detect misconfigurations and attacks by sending a report when a pin is violated. Finally, in an effort to support the Chrome experience across languages and locales, we made strides in improving how the omnibox is displayed in RTL languages.


Beyond the browser, our web platform efforts foster cross-vendor cooperation on developer-facing security features. We shipped Subresource Integrity (SRI), which defends against resource substitution attacks by allowing developers to specify a hash against which a script or stylesheet is matched before it's executed. We’re excited to see large sites, like Github, already deploying SRI! We've sketched out a concept for a Clear Site Data feature which we hope will make it possible for sites to reset their storage, and we're hard at work on the next iteration of Content Security Policy. Both of these will hopefully start seeing some implementation in Q4.
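

To make the SRI mechanism concrete, here is a small sketch of generating integrity metadata for a script; the file and CDN URL are made up, while the digest format follows the SRI spec.

    import base64
    import hashlib

    with open("library.js", "rb") as f:  # hypothetical script to protect
        content = f.read()

    # Integrity metadata is "<alg>-<base64 digest>" of the exact bytes the
    # browser will fetch; if the server returns anything else, the script
    # is refused rather than executed.
    digest = base64.b64encode(hashlib.sha384(content).digest()).decode()
    print('<script src="https://cdn.example/library.js" '
          'integrity="sha384-{0}" crossorigin="anonymous"></script>'.format(digest))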


We see migration to HTTPS as foundational to any security whatsoever (and we're not the only ones), so we're actively working to drive #MOARTLS across Google and the Internet at large. We shipped Upgrade Insecure Requests, which eases the transition to HTTPS by transparently correcting a page's spelling from `http://` to `https://` for all resources before any requests are triggered. We've also continued our effort to deprecate powerful features on insecure origins by solidifying the definition of a "Secure Context", and applying that definition to block insecure usage of getUserMedia().
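

For a sense of what opting in looks like, here is a tiny sketch; the markup is illustrative, and the directive can equally be delivered in a Content-Security-Policy response header.

    # A legacy page with hard-coded http:// subresources can opt in like
    # this; the browser then rewrites those fetches to https:// before any
    # request is sent, instead of flagging them as mixed content.
    page = (
        '<meta http-equiv="Content-Security-Policy" '
        'content="upgrade-insecure-requests">\n'
        '<img src="http://example.com/logo.png">  <!-- fetched as https:// -->'
    )
    print(page)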


For more thrilling security updates and feisty rants, subscribe to security-dev@chromium.org.


Happy Hacking,

Parisa, on behalf of Chrome Security


Q2 2015



Hello from the Chrome Security Team!


For those that don’t know us already, we do stuff to help make Chrome the most secure platform to browse the Internet. Here’s a recap of some work from last quarter:


The Bugs-- effort aims to find (and exterminate) security bugs. At the start of the quarter, we initiated a Security FixIt to trim back the fat backlog of open issues. With the help of dozens of engineers across Chrome, we fixed over 40 Medium+ severity security bugs in 2 weeks and brought the count of issues down to 15! We also collaborated with the Android Security Attacks Team and added native platform fuzzing support to ClusterFuzz (and imported their fuzzers), which resulted in ~30 new bugs discovered. ClusterFuzz now supports fuzzing on all devices of the Nexus family (5, 6, 7, and 9) and Android One, and is running on a few dozen devices in the Android Lab. On top of this, we have doubled our fuzzing capacity in Compute Engine to ~8,000 cores by leveraging Preemptible VMs. Lastly, we have upgraded all of our sanitizer builds on Linux (ASan, MSan, TSan, and UBSan) to report edge-level coverage data, which is now aggregated in the ClusterFuzz dashboard. We’re using this coverage information to expand the data bundles used by existing fuzzers and to improve our corpus distillation.


Bugs still happen, so our Guts effort builds in multiple layers of defense.  Our Site Isolation project is getting closer to its first stage of launch: using out-of-process iframes (OOPIFs) for web pages inside extension processes.  We've made substantial progress (with lots of help from others on the Chrome team!) on core Chrome features when using --site-per-process: OOPIFs now work with back/forward, DevTools, and extensions, and they use Surfaces for efficient painting (and soon input event hit-testing).  We've collected some preliminary performance data using Telemetry, we've fixed lots of crashes, and we've started enforcing cross-site security restrictions on cookies and passwords.  Much work remains, but we're looking forward to turning on these protections for real users!


On Linux and Chrome OS, we’ve made changes to restrict each renderer process to its own PID namespace, which strengthens and cleans up our sandbox (shipping in Chrome 45). We also finished a major cleanup needed to deprecate the setuid sandbox, which should happen soon. Work continued to prepare for the launch of Windows 10, which offers some opportunities for new security mitigations; the new version looks like the most secure Windows yet, so be sure to upgrade when it comes out!


Not all security problems can be solved in Chrome’s guts, so we work on making security more user-friendly too. We’ve continued our efforts to avoid showing unnecessary TLS/SSL warnings: decisions are now remembered for a week instead of a session, and a new checkbox on TLS/SSL warnings allows users to send us invalid certificate chains that help us root out false-positive warnings. Since developers and power users have been asking for more tools to debug TLS/SSL issues, we’ve started building more security information into DevTools and plan to launch a first version in Q3!


Another large focus for the team has been improving how users are asked for permissions, like camera and geolocation. We’ve finalized a redesign of the fullscreen permission flow that we hope to launch by the end of the year, fixed a number of bugs relating to permission prompts, and launched another round of updates to PageInfo and Website Settings on Android.


Beyond the browser, our web platform efforts foster cross-vendor cooperation on developer-facing security features. The W3C's WebAppSec working group continues to be a fairly productive venue for a number of important features: we've polished the Subresource Integrity spec and shipped an implementation in Chrome 46, published first drafts of Credential Management and Entry Point Regulation, continue to push Content Security Policy Level 2 and Mixed Content towards "Recommendation" status, and fixed some longstanding bugs with our Referrer Policy implementation.


Elsewhere, we've started prototyping Per-Page Suborigins with the intent of bringing a concrete proposal to WebAppSec, published a new draft of First-Party-Only cookies (and are working through some infrastructure improvements so we can ship them), and poked at sandboxed iframes to make it possible to sandbox ads.


We see migration to HTTPS as foundational to any security whatsoever (and we're not the only ones), so we're actively working to drive #MOARTLS across Google and the Internet at large. As a small practical step on top of the HTTPS webmasters fundamentals section, we’ve added some functionality to Webmaster Tools to provide better assistance to webmasters when dealing with common errors in managing a site over TLS (launching soon!). Also, we're now measuring the usage of pre-existing, powerful features on non-secure origins, and are now printing deprecation warnings in the JavaScript console. Our ultimate goal is to make all powerful features, such as Geolocation and getUserMedia, available only to secure origins.


For more thrilling security updates and feisty rants, subscribe to security-dev@chromium.org.


Happy Hacking,

Parisa, on behalf of Chrome Security


Q1 2015



Hello from the Chrome Security Team!


For those that don’t know us already, we do stuff to help make Chrome the most secure platform to browse the Internet. Here’s a recap of some work from last quarter:


The Bugs-- effort aims to find (and exterminate) security bugs. Last quarter, we rewrote our IPC fuzzer, which resulted in lots more bugs discovered by ClusterFuzz! We also expanded fuzzing platform support (Android Lollipop, Linux with Nvidia GPU), added archived builds for proprietary media codecs testing on all platforms, and used more code annotations to find bugs (like this or this). We auto-add previous crash tests to our data corpus, which helps to catch regressions even if a developer forgets to add a test (example). We’ve also started experimenting with enabling and leveraging code coverage information from fuzzing. Contrary to what some reports may imply, we don’t think vulnerability counting is a good standalone metric for security, and more bugs discovered internally (653 bugs in 2014 vs. 380 bugs in 2013), means more bugs fixed, means safer software! Outside of engineering, inferno@ gave a talk at nullcon about Chrome fuzzing (slides) and we launched never-ending Pwnium with a rewards pool up to $∞ million!


Bugs still happen, so our Guts effort builds in multiple layers of defense. On Linux and Chrome OS, we did some work to improve the seccomp-BPF compiler and infrastructure. On modern kernels, we finally completed the switch from the setuid sandbox to a new design using unprivileged namespaces. We’re also working on a generic, re-usable sandbox API on Linux, which we hope can be useful to other Linux projects that want to employ sandboxing. On Android, we’ve been experimenting with single-threaded renderer execution, which can yield performance and security benefits for Chrome. We’ve also been involved with the ambitious Mojo effort. On OSX, we shipped crashpad (which was a necessary project to investigate those sometimes-security-relevant crashes!). Finally, on Windows, the support to block Win32k system calls from renderers on Windows 8 and above is now enabled on Stable - and renderers on these systems are also running within App Containers on Chrome Beta, which blocks their access to the network. We also ensured all Chrome allocations are safe - and use less memory (!) - by moving to the Windows heap.


On our Site Isolation project, we’ve made progress on the underlying architecture so that complex pages are correct and stable (e.g. rendering any combination of iframes, evaluating renderer-side security checks, sending postMessage between subframes, keeping script references alive). Great progress has also been made on session history, DevTools, and test/performance infrastructure, and other teams have started updating their features for out-of-process iframes after our Site Isolation Summit.


Not all security problems can be solved in Chrome’s guts, so we work on making security more user-friendly too. In an effort to determine the causes of SSL errors, we’ve added a new checkbox on SSL warnings that allows users to send us invalid certificate chains for analysis. We’ve started looking at the data, and in the coming months we plan to introduce new warnings that provide specific troubleshooting steps for common causes of spurious warnings. We also recently launched the new permissions bubble UI, which solves some of the problems we had with permissions infobars (like better coalescing of multiple permission requests). And for our Android users, we recently revamped PageInfo and Site Settings, making it easier than ever for people to manage their permissions. Desktop updates to PageInfo and Site Settings are in progress, too. Finally, we just launched a new extension, Chrome User Experience Surveys, which asks people for in-the-moment feedback after they use certain Chrome features. If you’re interested in helping improve Chrome, you should try it out!


Beyond the browser, our web platform efforts foster cross-vendor cooperation on developer-facing security features. We're working hard with the good folks in the W3C's WebAppSec working group to make progress on a number of specifications: CSP 2 and Mixed Content have been published as Candidate Recommendations, Subresource Integrity is implemented behind a flag and the spec is coming together nicely, and we've fixed a number of Referrer Policy issues. First-Party-Only Cookies are just about ready to go, and Origin Cookies are on deck.


We see migration to HTTPS as foundational to any security whatsoever (and we're not the only ones), so we're actively working to define the properties of secure contexts, deprecate powerful features on insecure origins, and to make it simpler for developers to Upgrade Insecure Requests on existing sites.


For more thrilling security updates and feisty rants, subscribe to security-dev@chromium.org.


Happy Hacking,

Parisa, on behalf of Chrome Security


P.S. Go here to travel back in time and view previous Chrome security quarterly updates.


Q4 2014



Hello from the Chrome Security Team!


For those that don’t know us already, we do stuff to help make Chrome the most secure platform to browse the Internet. Here’s a recap of some work from last quarter:


The Bugs-- effort aims to find (and exterminate) security bugs. Last quarter, we incorporated more coverage data into our ClusterFuzz dashboard, especially for Android. With this, we hope to optimize our test cases and improve fuzzing efficiency. We also incorporated 5 new fuzzers from the external research community as part of the fuzzer reward program. This has resulted in 33 new security vulnerabilities. Finally, we wrote a multi-threaded test case minimizer from scratch based on delta debugging (a long-standing request from blink devs!) which produces clean, small, reproducible test cases. In reward program news, we've paid over $1.6 million for externally reported Chrome bugs since 2010 ($4 million total across Google). In 2014, over 50% of reward program bugs were found and fixed before they hit the stable channel, protecting our main user population. Oh, and in case you didn’t notice, the rewards we’re paying out for vulnerabilities went up again.


Bugs still happen, so our Guts effort builds in multiple layers of defense. We’re most excited about progress toward a tighter sandbox for Chrome on Android (via seccomp-bpf), which required landing seccomp-bpf support in Android and enabling TSYNC on all Chrome OS and Nexus kernels. We’ve continued to improve our Linux / Chrome OS sandboxing by (1) adding full cross-process interaction restrictions at the BPF sandbox level, (2) making API improvements and some code refactoring of //sandbox/linux, and (3) implementing a more powerful policy system for the GPU sandbox.


After ~2 years of work on Site Isolation, we’re happy to announce that out-of-process iframes are working well enough that some Chrome features have started updating to support them! These include autofill (done), accessibility (nearly done), <webview> (prototyping), devtools, and extensions. We know how complex a rollout this will be, and we’re ready with testing infrastructure and FYI bots. As we announced at our recent Site Isolation Summit (video, slides), our goal for Q1 is to finish up OOPIF support with the help of all of Chrome.


Not all security problems can be solved in Chrome’s Guts, so we work on making security more user-friendly too. For the past few months, we’ve been digging deeper into the causes of SSL errors by examining UMA stats and monitoring user help forums. One source of SSL errors is system clocks with the wrong time, so we landed a more informative error message in Chrome 40 to let users know they need to fix their clock. We’ve also started working on a warning interstitial for captive portals to distinguish those SSL errors from the rest. Finally, we proposed a plan for browsers to migrate their user interfaces toward marking insecure origins (i.e. HTTP) as explicitly insecure; the initial discussion and external attention have been generally positive.
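

As a toy illustration of the clock issue (a sketch only, not Chrome code; it assumes network access and simply checks whether the local clock falls inside a server certificate's validity window):

    import socket
    import ssl
    import time

    ctx = ssl.create_default_context()
    with socket.create_connection(("example.com", 443)) as sock:
        with ctx.wrap_socket(sock, server_hostname="example.com") as tls:
            cert = tls.getpeercert()

    not_before = ssl.cert_time_to_seconds(cert["notBefore"])
    not_after = ssl.cert_time_to_seconds(cert["notAfter"])
    now = time.time()

    # If the local clock falls outside [notBefore, notAfter], a perfectly
    # valid certificate looks expired or not yet valid. (In a real bad-clock
    # scenario the handshake above would already fail, which is why the
    # browser needs its own heuristics to explain the error to users.)
    print("clock inside validity window:", not_before <= now <= not_after)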


Over the past few years, we’ve worked on a bunch of isolated projects to push security on the Open Web Platform forward and make it possible for developers to write more secure apps. We recognized we can move faster if we get some of the team fully dedicated to this work, so we formed a new group that will focus on web platform efforts.


As usual, for more thrilling security updates and feisty rants, subscribe to security-dev@chromium.org.


To a safer web in 2015!

Parisa, on behalf of Chrome Security



Q3 2014


Hello from the Chrome Security Team!


For those that don’t know us already, we do stuff to help make Chrome the most secure platform to browse the Internet. Here’s a recap of some work from last quarter:


The Bugs-- effort aims to find (and exterminate) security bugs. We increased ClusterFuzz cores across all platforms (Mac, Android, Windows, and Linux), resulting in 155 security and 275 functional bugs since the last update! We also started fuzzing D-Bus system services on Chrome OS, which is our first attempt at leveraging ClusterFuzz for the operating system. One of the common security pitfalls in C++ is bad casting (often rooted in aggressive polymorphism). To address this, one of our interns tweaked the UBSan (Undefined Behavior Sanitizer) vptr check to detect bad casts at runtime, which resulted in 11 new security bugs! We’ve continued to collaborate with external researchers on new fuzzing techniques to find bugs in V8, PDFium, Web Workers, IDB, and more. Shout out to attekett, cloudfuzzer, decoder.oh, and therealholden for their attention and bugs over the past quarter!


Finding bugs is only half the battle, so we also did a few things to make it easier to get security bugs fixed, including (1) a new security sheriff dashboard and (2) contributing to the FindIt project, which helps narrow down suspected CL(s) for a crash (given a regression range and stacktrace), thereby saving manual triage cycles.


Bugs still happen, so our Guts effort builds in multiple layers of defense. We did a number of things to push seccomp-bpf onto more platforms and architectures, including: (1) adding support for MIPS and ARM64, (2) adding a new capability to initialize seccomp-bpf in the presence of threads (bringing us a big step closer to a stronger sandbox on Android), (3) general tightening of the sandboxes, and (4) writing a domain-specific language to better express BPF policies. We also helped ensure a safe launch of Android apps on Chrome OS, and continued sandboxing new system services.


On Windows, we launched Win64 to Stable, giving users a safer, speedier, and more stable version of Chrome! On Windows 8, we added Win32k system call filtering behind a switch, further reducing the kernel attack surface accessible from the renderer. We also locked down the alternate desktop sandbox tokens and refactored the sandbox startup to cache tokens, which improves new tab responsiveness.


Finally, work continues on site isolation. Over the past few months, we’ve started creating RemoteFrames in Blink's frame tree to support out-of-process iframes (OOPIF) and got Linux and Windows FYI bots running tests with --site-per-process. We’ve also been working with the Accessibility team as our guinea pig feature to support OOPIF, and since that work is nearly done, we’re reaching out to more teams over the next few months to update their features (see our FAQ about updating features).


Not all security problems can be solved in Chrome’s guts, so we work on making security more user-friendly too. SSL-related warnings are still a major source of user pain and confusion. Over the past few months, we’ve been focused on determining the causes of false positive SSL errors (via adding UMA stats for known client / server errors) and investigating pinning violation reports. We’ve also been experimenting with cert memory strategies and integrating relevant detail when we detect a (likely) benign SSL error due to captive portal or a bad clock.


Developers are users too, so we know it’s important to support new web security features and ensure new features are safe to use by default. In that vein, we recently landed a first pass at Subresource Integrity support behind a flag (with useful console errors), we’re shipping most of CSP 2 in M40, we’ve continued to tighten up handling of mixed content, and we’re working to define and implement referrer policies. We’ve also been providing security consulting for Service Worker; kudos to the team for making changes to handle plugins more securely, restrict usage to secure origins, and address some memory caching issues. If you want to learn more about what’s going on in the Blink security world, check out the Blink-SecurityFeature label.


And then there’s other random things, like ad-hoc hunting for security bugs (e.g. local privilege escalation bug in pppd), giving Chromebooks to kids at DEFCON, and various artistic endeavors, like color-by-risk diagramming and security-inspired fashion.


For more thrilling security updates and feisty rants, subscribe to security-dev@chromium.org.


Happy Hacking (and Halloween),

Parisa, on behalf of Chrome Security



Q2 2014


Hello from the Chromium Security Team!


For those that don’t know us already, we do stuff to help make Chrome the most secure platform to browse the Internet. Here’s a recap of some work from last quarter:


One of our primary responsibilities is acting as security advisers, and the main way we do this is via security reviews. A few weeks ago, jschuh@ announced a new and improved security review process that helps teams better assess their current security posture and helps our team collect more meaningful data about Chrome engineering. All features for M37 went through the new process, and we’ll be shepherding new projects and launches through this process going forward.


The Bugs-- effort aims to find (and exterminate) security bugs. One of our best ways of finding bugs and getting them fixed quickly is fuzz testing via ClusterFuzz. This quarter, we started fuzzing Chrome on Mac OS (extending the existing platform coverage on Windows, Linux, and Android). We also added code coverage stats to the ClusterFuzz UI, which some teams have been finding helpful as a complement to their QA testing, as well as fuzzer stats, which the V8 team now checks with each new rollout. Finally, we added some new fuzzers (WebGL, GPU commands) and integrated a number of memory debugging tools to find new classes of bugs (e.g. AddressSanitizer on Windows found 22 bugs, Dr. Memory on Windows found 1 bug, MemorySanitizer on Linux found 146 bugs, and LeakSanitizer on Linux found 18 bugs).


Another source of security bugs is our vulnerability reward program, which saw a quiet quarter: only 32 reports opened in Q2 (lowest participation in 12 months) and an average payout of $765 per bug (lowest value in 12 months). This trend is likely due to (1) fuzzers, both internal and external, finding over 50% of all reported bugs in Q2, (2) the increasing difficulty of finding bugs combined with outdated, less competitive reward amounts, and (3) researcher fatigue / lack of interest or stimulus. Plans for Q3 include reinvigorating participation in the rewards program through a more generous reward structure and coming up with clever ways to keep researchers engaged.


Outside of external bug reports, we spent quite a bit of time improving the security posture of PDFium (Chrome's recently open-sourced PDF renderer) by finding / fixing ~150 bugs, removing risky code (e.g. a custom allocator), and using a secure integer library for overflow checks. Thanks to ifratric@, mjurczyk@, and gynvael@ for their PDF fuzzing help!


Bugs still happen, so our Guts effort builds in multiple layers of defense. We did lots of sandboxing work across platforms last quarter. On OS X, rsesek@ started working on a brand new bootstrap sandbox (//sandbox/mac), and on Android he got a proof-of-concept renderer running under seccomp-bpf. On Linux and Chrome OS, we continued to improve the sandboxing testing framework and wrote dozens of new tests; all our security tests are now running on the Chrome OS BVT. We also refactored all of the NaCl-related “outer” sandboxing to support a new and faster Non-SFI mode for NaCl. This is being used to run Android apps on Chrome, as you may have seen demoed at Google I/O.


After many months of hard work, we’re ecstatic to announce that we released Win64 on dev and canary to our Windows 7 and Windows 8 users. This release takes advantage of High Entropy ASLR on Windows 8, and the extra bits help improve the effectiveness of heap partitioning and mitigate common exploitation techniques (e.g. JIT spraying). The Win64 release also eliminated roughly a third of the crashes we were seeing on Windows, so it’s more stable too!


Finally, work continues on site isolation: lots of code written / rewritten / rearchitected and unknown unknowns discovered along the way. We're close to having "remote" frames for each out-of-process iframe, and you can now see subframe processes in Chrome's Task Manager when visiting a test page like this with the --site-per-process flag.


Not all security problems can be solved in Chrome’s guts, so we work on making security more user-friendly too. The themes of Q2 were SSL and permissions. For SSL, we nailed down a new "Prefer Safe Origins for Powerful Features" policy, which we’ll transition to going forward; kudos to palmer@ and sleevi@ for ironing out all the details and getting us to a safer default state. We’ve also been trying to improve the experience of our SSL interstitial, which most people ignore :-/ Work includes launching new UX for SSL warnings and incorporating captive portal status (ongoing). Congrats to agl@ for launching boringssl - if boring means avoiding Heartbleed-style hysteria, sounds good to us!


On the permissions front, we’re working on ways to give users more control over application privileges, such as (1) reducing the number of install-time CRX permissions, (2) running UX experiments on the effectiveness of permissions, and (3) working on building a security and permissions model to bring native capabilities to the web.


For more thrilling security updates and feisty rants, subscribe to security-dev@chromium.org.


In the meantime, happy hacking!

Parisa, on behalf of Chrome Security


P.S. A big kudos to the V8 team, and jkummerow@ in particular, for their extra security efforts this quarter! The team rapidly responded to and fixed a number of security bugs on top of doing some security-inspired hardening of V8 runtime functions.


Q1 2014


Hello from the Chrome Security Team!


For those that don’t know us already, we help make Chrome the most secure platform to browse the Internet. In addition to security reviews and consulting, running a vulnerability reward program, and dealing with security surprises, we instigate and work on engineering projects that make Chrome safer. Here’s a recap of some work from last quarter:


The Bugs-- effort aims to find (and exterminate) exploitable bugs. A major accomplishment from Q1 was getting ClusterFuzz coverage for Chrome on Android; we’re aiming to scale up resources from a few devices on inferno@’s desk to 100 bots over the next few months. On the fuzzer front, mbarbella@ wrote a new V8 fuzzer that helped shake out 30+ bugs; kudos to the V8 team for being so responsive in fixing these issues and prioritizing additional proactive security work this quarter. Spring welcomed a hot new line of PoC exploits at Pwn2Own and Pwnium 4: highlights included a classic ensemble of overly broad IPC paired with a Windows “feature,” and a bold chain of 5 intricate bugs for persistent system compromise on Chrome OS; more details will be posted here soon. Beyond exploit contests, we’ve rewarded $52,000 for reports received this year (from 16 researchers for 23 security bugs) via our ongoing vulnerability reward program. We also started rewarding researchers for bugs in Chrome extensions developed “by Google.” Outside of finding and fixing bugs, jschuh@ landed a safe numeric class to help prevent arithmetic overflow bugs from being introduced in the first place; use it and you'll sleep better too!
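If you're wondering what a safe numeric class buys you, here's a toy sketch of the idea; the class and method names below are illustrative, not the actual Chromium API. The point is arithmetic that remembers whether it overflowed, so size calculations derived from untrusted input can be checked before they're used to allocate or index.

```cpp
// Toy checked-arithmetic sketch (illustrative names, GCC/Clang builtins).
#include <cstdint>
#include <cstdio>

class CheckedInt {
 public:
  explicit CheckedInt(int32_t value) : value_(value), valid_(true) {}

  CheckedInt operator+(CheckedInt other) const {
    CheckedInt result(0);
    result.valid_ = valid_ && other.valid_ &&
                    !__builtin_add_overflow(value_, other.value_, &result.value_);
    return result;
  }
  CheckedInt operator*(CheckedInt other) const {
    CheckedInt result(0);
    result.valid_ = valid_ && other.valid_ &&
                    !__builtin_mul_overflow(value_, other.value_, &result.value_);
    return result;
  }

  bool IsValid() const { return valid_; }
  int32_t ValueOrDefault(int32_t fallback) const { return valid_ ? value_ : fallback; }

 private:
  int32_t value_;
  bool valid_;
};

int main() {
  // A typical pattern: a size computed from untrusted input.
  CheckedInt bytes = CheckedInt(0x10000) * CheckedInt(0x10000);  // overflows int32
  if (!bytes.IsValid()) {
    puts("refusing to allocate: size calculation overflowed");
    return 1;
  }
  printf("allocating %d bytes\n", bytes.ValueOrDefault(0));
  return 0;
}
```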


Bugs still happen, so we build in multiple layers of defense. One of our most common techniques is sandboxing, which helps to reduce the impact of any single bug. Simple in theory, but challenging to implement, maintain, and improve across all platforms. On Linux and Chrome OS, we spent a lot of the quarter paying back technical debt: cleaning up the GPU sandbox, writing and fixing tests, and replacing the setuid sandbox. On Android, we reached consensus with the Android Frameworks team on a path forward for seccomp-bpf sandboxing for Clank. We've started writing the CTS tests to verify this in Android, landed the baseline policy in upstream Clankium, and are working on the required upstream Linux kernel changes to be incorporated into Chrome Linux, Chrome OS, and Android L. The site isolation project (i.e. sandboxing at the site level) landed a usable cross-process iframe implementation behind --site-per-process, which supports user interaction, nested iframes (one per document), sad frames, and basic DevTools integration. Major refactoring of Chrome and Blink, performance testing, and working with teams that need to update for site isolation continues this quarter. On Windows, we shipped Win64 canaries, landed code to sandbox the auto-update mechanism, and improved the existing sandboxing, reducing the win32k attack surface by ~30%. Thanks to the Windows Aura team, we’ve also made tremendous progress on disabling win32k entirely in the Chrome sandbox, which will eventually eliminate most Windows-specific sandbox escapes.


Not all security can be solved in Chromium’s Guts, so we work on making security more user-friendly too. We finally landed the controversial change to remember passwords, even when autocomplete='off', in M34, which is a small but significant change to return control back to the user. We also made some tweaks to the malware download UX in M32; previously users installed ~29% of downloads that were known malware, and that number is now down to <5%! We’ve recently been thinking a lot about how to improve the security of Chrome Extensions and Apps, including experimenting with several changes to the permission dialog to see if we can reduce the number of malicious CRXs users install without reducing installs of non-malicious items. Separately, we want to make it easier for developers to write secure APIs, so meacer@ wrote up some security tips to help developers avoid common abuse patterns we’ve identified from bad actors.


Finally, since Heartbleed is still on the forefront of many minds, a reminder that Chrome and Chrome OS were not directly affected. And if you're curious about how and why Chrome does SSL cert revocation the way it does, agl@ wrote a great post explaining that too.


For more thrilling security updates and feisty rants, subscribe to security-dev@chromium.org.


Happy Hacking,
Parisa, on behalf of Chrome Security

Q4 2013


Hello from the Chrome Security Team!

For those that don’t know us already, we help make Chromium the most secure browsing platform in the market. In addition to security reviews and consulting, running a vulnerability reward program, and dealing with security surprises, we instigate and work on engineering projects that make Chrome more secure.

The end of last year flew by, but here are a couple of things we’re most proud of from the last quarter of 2013:

Make security more usable: We made a number of changes to the malware download warning to discourage users from installing malware. We also worked on a reporting feature that lets users upload suspicious files to Safe Browsing, which will help Safe Browsing catch malicious downloads even faster.
Since PDFs are a common vehicle for exploit delivery, we’ve modified PDF handling in Chrome so that PDFs are all opened in Chrome’s PDF viewer by default. This is a huge security win because we believe Chrome’s PDF viewer is the safest, most hardened, and most security-tested viewer available. Malware delivered via Microsoft .doc files is also common, so we’re eagerly awaiting the day we can open Office docs in Quickoffice by default.

Find (and fix) more security bugs: We recently welcomed a new member to the team, Sheriffbot. He’s already started making the mortal security sheriffs’ lives easier by finding new owners, adding Cr- area labels, helping apply and fix bug labels, and reminding people about open security bugs they have assigned to them.

Our fuzzing mammoth, ClusterFuzz, is now fully supported on Windows and has helped find 32 new bugs. We’ve added a bunch of new fuzzers to cover Chromium IPC (5 high severity bugs), networking protocols (1 critical severity bug from a certificate fuzzer, 1 medium severity bug from an HTTP protocol fuzzer), and WebGL (1 high severity bug in ANGLE). Want to write a fuzzer to add security fuzzing coverage to your code? Check out the ClusterFuzz documentation, or get in touch.
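If you've never written one, a fuzzer doesn't have to be fancy. Here's a deliberately tiny, self-contained sketch of the mutational idea; the parser below is a stand-in, not real Chromium code. Start from a valid input, flip some bytes, and hammer the code under test while sanitizers watch for memory errors; ClusterFuzz's job is then to run this kind of thing at scale and triage the crashes.

```cpp
// Toy mutational fuzzer sketch (illustrative only).
#include <cstdint>
#include <cstdlib>
#include <vector>

// Stand-in for the code under test (e.g. an HTTP or certificate parser).
// Real targets are where the crashes come from; this one just walks the input.
void ParseUntrustedInput(const uint8_t* data, size_t size) {
  size_t spaces = 0;
  for (size_t i = 0; i < size; ++i)
    spaces += (data[i] == ' ');
  (void)spaces;
}

int main() {
  const char seed[] = "GET /index.html HTTP/1.1\r\nHost: example.com\r\n\r\n";
  const std::vector<uint8_t> input(seed, seed + sizeof(seed) - 1);

  for (int iteration = 0; iteration < 1000000; ++iteration) {
    std::vector<uint8_t> mutated = input;
    // Flip a few random bytes. Real fuzzers use much smarter mutations and
    // grammar-aware generation, but the feedback loop is the same.
    for (int i = 0; i < 4; ++i)
      mutated[rand() % mutated.size()] ^= 1 << (rand() % 8);
    ParseUntrustedInput(mutated.data(), mutated.size());
  }
  return 0;
}
```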

In November, we helped sponsor a Pwn2Own contest at the PacSec conference in Tokyo. Our good friend, Pinkie Pie, exploited an integer overflow in V8 to get reliable code execution in the renderer, and then exploited a bug in a Clipboard IPC message to get code execution in the browser process (by spraying multiple gigabytes of shared memory). We’ll be publishing a full write-up of the exploit on our site soon, and are starting to get excited about our upcoming Pwnium in March.

Secure by default, defense in depth: In Chrome 32, we started blocking NPAPI by default and have plans to completely remove support by the end of the year. This change significantly reduces Chrome’s exposure to browser plugin vulnerabilities. We also implemented additional heap partitioning for buffers and strings in Blink, which further mitigates memory exploitation techniques. Our Win64 port of Chromium is now continuously tested on the main waterfall and is on track to ship this quarter. Lastly, we migrated our Linux and Chrome OS sandbox to a new policy format and did a lot of overdue sandbox code cleanup.

On our site isolation project, we’ve started landing infrastructure code on trunk to support out-of-process iframes. We are a few CLs away from having functional cross-process iframes behind a flag and expect it to be complete by the end of January!

Mobile, mobile, mobile: We’ve started focusing more attention on hardening Chrome on Android. In particular, we’ve been hacking on approaches for strong sandboxing (e.g. seccomp-bpf), adding Safe Browsing protection, and getting ClusterFuzz tuned for Android.

For more thrilling security updates and feisty rants, catch ya on security-dev@chromium.org.

Happy Hacking,
Parisa, on behalf of Chrome Security

Q3 2013


An early boo and (late) quarter update from the Chrome Security Team!


For those that don’t know us already, we help make Chromium the most secure browsing platform in the market. In addition to security reviews and consulting, running a vulnerability reward program, and dealing with security surprises, we instigate and work on engineering projects that make Chrome more secure.


Last quarter, we reorganized the larger team into 3 subgroups:


Bugs--, a group focused on finding security bugs, responding to them, and helping get them fixed. The group is currently working on expanding ClusterFuzz coverage to other platforms (Windows and Mac), adding fuzzers to cover IPC, networking, and WebGL, and adding more security ASSERTs to catch memory corruption bugs. They're also automating some of the grungy and manual parts of being security sheriff to free up human cycles for more exciting things.


Enamel, a group focused on usability problems that affect end user security or the development of secure web applications. In the near-term, Enamel is working on: improving the malware download warnings, SSL warnings, and extension permission dialogs; making it safer to open PDFs and .docs in Chrome; and investigating ways to combat popular phishing attacks.


Guts, a group focused on ensuring Chrome’s architecture is secure by design and resilient to exploitation. Our largest project here is site isolation, and in Q4, we’re aiming to have a usable cross-process iframe implementation (behind a flag ;) Other Guts top priorities include sandboxing work (stronger sandboxing on Android, making Chrome OS’s seccomp-bpf easier to maintain and better tested), supporting NPAPI deprecation, launching 64bit Chrome for Windows, and Blink memory hardening (e.g. heap partitioning).


Retrospectively, here are some notable security wins from recent Chrome releases:


In Chrome 29, we tightened up the sandboxing policies on Linux and added some defenses to the Omaha (Chrome Update) plugin, which is a particularly exposed and attractive target in Chrome. The first parts of Blink heap partitioning were released, and we’ve had “backchannel” feedback that we made an impact on the greyhat exploit market.


In Chrome 30 we fixed a load of security bugs! The spike in bugs was likely due to a few factors: (1) we started accepting fuzzers (7 total) from invited external researchers as part of a Beta extension to our vulnerability reward program (resulting in 26 new bugs), (2) we increased reward payouts to spark renewed interest from the public, and (3) we found a bunch of new buffer (over|under)flow and casting bugs ourselves by adding ASSERT_WITH_SECURITY_IMPLICATIONs in Blink. In M30, we also added a new layer of sandboxing to NaCl on Chrome OS, with seccomp-bpf.


Last, but not least, we want to give a shout out to individuals outside the security team who made an extraordinary effort to improve Chrome security:


  • Jochen Eisinger for redoing the pop-up blocker... so that it actually blocks pop-ups (instead of hiding them). Beyond frustrating users, this bug was a security liability, but due to the complexity of the fix, languished in the issue tracker for years.
  • Mike West for his work on CSP, as well as tightening downloading of bad content types.
  • Avi Drissman for fixing a longstanding bug where PDF password input was not masked.
  • Ben Hawkes and Ivan Fratric for finding four potentially exploitable Chrome bugs using WinFuzz.
  • Mateusz Jurczyk for finding a ton of bugs in the VP9 video decoder.

Happy Hacking,
Parisa, on behalf of Chrome Security

Q2 2013


Hello from the Chrome Security Team!


For those that don’t know us, we’re here to help make Chrome a very (the most!) secure browser. That boils down to a fair amount of work on security reviews (and other consulting), but here’s some insight into some of the other things we were up to last quarter:


Bug Fixin’ and Code Reviews

At the start of the quarter, we initiated a Code 28 on security bugs to trim back the fat backlog of open issues. With the help of dozens of engineers across Chrome, we fixed over 100 security bugs in just over 4 weeks and brought the count of Medium+ severity issues to single digits. (We’ve lapsed a bit in the past week, but hopefully will recover once everyone returns from July vacation :)


As of July 1st, ClusterFuzz has helped us find and fix 822 bugs! Last quarter, we added a new check to identify out-of-bounds memory accesses and bad casts (ASSERT_WITH_SECURITY_IMPLICATION), which resulted in ~72 new bugs identified and fixed. We’re also beta testing a “Fuzzer Donation” extension to our vulnerability reward program.
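For reference, the usage pattern looks roughly like the sketch below; the macro here is a local stand-in for Blink's real one, simplified for illustration. The idea is to state the security-relevant invariant right where an out-of-bounds access or bad cast would otherwise happen, so fuzzing turns a latent bug into a loud, obviously security-relevant crash.

```cpp
// Simplified stand-in for Blink's macro; real builds route this differently.
#include <cassert>
#include <cstddef>
#include <vector>

#define ASSERT_WITH_SECURITY_IMPLICATION(condition) assert(condition)

struct LayoutItem { int width = 0; };

// A typical usage pattern: assert an index invariant whose violation would
// mean an out-of-bounds read.
const LayoutItem& ItemAt(const std::vector<LayoutItem>& items, size_t index) {
  ASSERT_WITH_SECURITY_IMPLICATION(index < items.size());
  return items[index];
}

int main() {
  std::vector<LayoutItem> items(3);
  return ItemAt(items, 2).width;  // In range: the assertion holds.
}
```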


Anecdotally, this quarter we noticed an increase in the number of IPC reviews and a marked decrease in security issues! Not sure if our recent security tips doc is to credit, but well done to all the IPC authors and editors!


Process hardening

We’ve mostly wrapped up the binding integrity exploit mitigation changes we started last quarter, and they've now landed on all desktop platforms and Clank. Remaining work entails making additional V8 wrapped types inherit from ScriptWrappable so more Chrome code benefits from this protection. We also started a new memory hardening change that aims to place DOM nodes inside their own heap partition. Why would we want to do that? Use-after-free memory bugs are common. By having a separate partition, the attacker gets a more limited choice of what to overlap on top of the freed memory slot, which makes these types of bugs substantially harder to exploit. (It turns out there is some performance improvement in doing this too!)
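To make the “separate partition” argument concrete, here's a deliberately simplified sketch; the names and structure are hypothetical and nothing like the real allocator. Each partition recycles freed slots only among its own objects, so a freed DOM node's memory can't be reoccupied by, say, an attacker-controlled buffer.

```cpp
// Toy partitioned-allocation sketch (illustrative only).
#include <cstddef>
#include <vector>

class Partition {
 public:
  explicit Partition(size_t slot_size) : slot_size_(slot_size) {}

  void* Alloc() {
    if (!free_slots_.empty()) {
      void* slot = free_slots_.back();  // Reuse happens only within this partition.
      free_slots_.pop_back();
      return slot;
    }
    return ::operator new(slot_size_);
  }

  // Freed slots are kept for reuse (never handed back to other partitions).
  void Free(void* slot) { free_slots_.push_back(slot); }

 private:
  size_t slot_size_;
  std::vector<void*> free_slots_;
};

Partition g_node_partition(/*slot_size=*/128);    // e.g. DOM nodes
Partition g_buffer_partition(/*slot_size=*/128);  // e.g. attacker-sized buffers

int main() {
  void* node = g_node_partition.Alloc();
  g_node_partition.Free(node);  // Imagine a dangling pointer still refers here.
  // An allocation the attacker controls is served from a different partition,
  // so it cannot land in the freed node's slot.
  void* buffer = g_buffer_partition.Alloc();
  (void)buffer;
  return 0;
}
```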


Sandboxing++

We’re constantly trying to improve Chrome sandboxing. On Chrome OS and Linux, the GPU process is now sandboxed on ARM (M28), and we’ve been working on sandboxing NaCl under seccomp-bpf. We’ve also increased seccomp-bpf test coverage and locked down sandbox parameters (i.e. less attack surface). Part of the Chrome seccomp-bpf sandbox is now used in google3 (//third_party/chrome_seccomp), and Seccomp-legacy and SELinux have been deprecated as sandboxing mechanisms.


Chrome work across platforms

  • Mobile platforms pose a number of challenges to replicating some of the security features we’re most proud of on desktop, but with mobile usage only expected to grow, we know we need to shift some security love there. We’re getting more people ramped up to help on consulting (security and code reviews) and making headway on short and long-term goals.

  • On Windows, we’re still chugging along sorting out tests and build infrastructure to get a stable Win64 release build for canary tests.

  • On Chrome OS, work on kernel ASLR is ongoing, and we continued sandboxing system daemons.


Site Isolation Efforts

After some design and planning in Q1, we started building the early support for out-of-process iframes so that Chrome's sandbox can help us enforce the Same Origin Policy. In Q2, we added a FrameTreeNode class to track frames in the browser process, refactored some navigation logic, made DOMWindow own its Document (rather than vice versa) in Blink, and got our prototype to handle simple input events.  We'll be using these changes to get basic out-of-process iframes working behind a flag in Q3!
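For those following along at home, here's a conceptual sketch of what “tracking frames in the browser process” means; the class and field names are invented for illustration and don't match the real code. The browser keeps its own tree mirroring the page's frame structure, which is what later allows different frames to live in different renderer processes.

```cpp
// Conceptual frame-tree sketch (illustrative names only).
#include <memory>
#include <string>
#include <utility>
#include <vector>

class FrameTreeNodeSketch {
 public:
  FrameTreeNodeSketch(std::string frame_name, int renderer_process_id)
      : frame_name_(std::move(frame_name)),
        renderer_process_id_(renderer_process_id) {}

  FrameTreeNodeSketch* AddChild(std::string frame_name, int renderer_process_id) {
    children_.push_back(std::make_unique<FrameTreeNodeSketch>(
        std::move(frame_name), renderer_process_id));
    return children_.back().get();
  }

 private:
  std::string frame_name_;
  // With --site-per-process, a cross-site child frame can point at a
  // different renderer process than its parent.
  int renderer_process_id_;
  std::vector<std::unique_ptr<FrameTreeNodeSketch>> children_;
};

int main() {
  FrameTreeNodeSketch main_frame("main", /*renderer_process_id=*/1);
  main_frame.AddChild("cross-site-iframe", /*renderer_process_id=*/2);
  return 0;
}
```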


Extensions & Apps

This quarter, we removed ~N bad extensions from the Web Store that were either automatically detected or manually flagged as malicious or violating our policies. We’ve started transitioning manual CRX malware reviews to a newly formed team, who are staffing and ramping up to handle this significant workload. Finally, we’ve been looking at ways to improve the permission dialog for extensions so that it’s easier for users to understand the security implications of what they’re installing, and working on a set of experiments to understand how changes to the permissions dialog affect user installation of malware.

Happy Q3!
Parisa, on behalf of Chrome Security

Q1 2013


Hi from the Chrome Security Team!


For those that don’t know us already, we’re here to help make Chrome the most secure browser in the market. We do a fair bit of work on security reviews of new features (and other consulting), but here’s a summary of some of the other things we were up to last quarter:


Bugs, bugs, bugs

Though some time is still spent handling external security reports (mainly from participants of our vulnerability reward program), we spent comparatively more time in Q1 hunting for security bugs ourselves. In particular, we audited a bunch of IPC implementations after the two impressive IPC-based exploits from last year - aedla found some juicy sandbox bypass vulnerabilities (161564, 162114, 167840, 169685) and cdn and cevans found / fixed a bunch of other interesting memory corruption bugs (169973, 166708, 164682). Underground rumors indicate many of these internally discovered bugs collided with discoveries from third party researchers (who were either sitting on them or using them for their own purposes). At this point, most of the IPCs that handle file paths have been audited, and we’ve started putting together a doc with security tips to keep in mind when writing IPC.
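For a flavor of what those audits look for, here's a hedged sketch of browser-side validation of a renderer-supplied file path; the handler, helper, and directory below are hypothetical, not actual Chromium IPC code. The rule of thumb: the browser must assume a compromised renderer will send hostile paths, and re-check them against policy before touching the filesystem.

```cpp
// Hypothetical browser-side path validation sketch (illustrative only).
#include <filesystem>
#include <iostream>
#include <string>
#include <system_error>

namespace fs = std::filesystem;

// The directory this (hypothetical) IPC is allowed to touch.
const fs::path kAllowedRoot = "/home/user/Downloads";

bool IsPathAllowed(const fs::path& untrusted) {
  // Canonicalize to collapse "..", ".", and symlinks, then require that the
  // result still lives under the allowed root.
  std::error_code ec;
  const fs::path canonical = fs::weakly_canonical(untrusted, ec);
  if (ec)
    return false;
  const fs::path relative = canonical.lexically_relative(kAllowedRoot);
  return !relative.empty() && *relative.begin() != "..";
}

// Hypothetical browser-side handler for a "delete this download" IPC.
void OnDeleteDownloadRequest(const std::string& renderer_supplied_path) {
  if (!IsPathAllowed(renderer_supplied_path)) {
    std::cerr << "rejecting path escape attempt: " << renderer_supplied_path << "\n";
    return;  // A real handler might also treat this as a bad-renderer signal.
  }
  // ... safe to proceed with the filesystem operation ...
}

int main() {
  OnDeleteDownloadRequest("/home/user/Downloads/../.ssh/id_rsa");  // rejected
  OnDeleteDownloadRequest("/home/user/Downloads/file.zip");        // allowed
  return 0;
}
```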


On the fuzzing front, we updated and added a number of fuzzers to ClusterFuzz: HTML (ifratric, mjurczyk), Flash (fjserna), CSS (bcrane), V8 (farcasia), Video VTT (yihongg), extension APIs (meacer), WebRTC (phoglund), Canvas/Skia (aarya), and Flicker/media (aarya); aarya also taught ClusterFuzz to look for dangerous ASSERTs with security implications, which resulted in even more bugs. Kudos to ClusterFuzz and the ASan team for kicking out another 132 security bugs last quarter! One downside to all these new bugs is that our queue of open security bugs across Chrome has really spiked (85+ as of today). Please help us fix these bugs!


Process hardening

We’re constantly thinking about proactive hardening we can add to Chrome to eliminate or mitigate exploitation techniques. We find inspiration not only from cutting edge security defense research, but also industry chatter around what the grey and black hats are using to exploit Chrome and other browsers. This past quarter jln implemented more fine-grained support for sandboxing on Linux, in addition to some low level tcmalloc changes that improve ASLR and general allocator security on 64-bit platforms. With jorgelo, they also implemented support for a stronger GPU sandbox on Chrome OS (which we believe was instrumental in avoiding a Pwnium 3 exploit). tsepez landed support for V8 bindings integrity on Linux and Mac OS, a novel feature that ensures DOM objects are valid when bound to JavaScript; this avoids exploitation of type confusion bugs in the DOM, which Chrome has suffered from in the past. palmer just enabled bindings integrity for Chrome on Android, and work is in progress on Windows.
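Conceptually, bindings integrity looks something like the sketch below; every type and name here is invented for illustration, and Blink's real implementation differs. Each wrapped object carries a pointer to its type info, and the binding layer verifies that tag before downcasting, so a JS wrapper that somehow ends up pointing at the wrong native object produces a safe crash instead of type confusion.

```cpp
// Conceptual bindings-integrity sketch (illustrative types and names only).
#include <cstdio>
#include <cstdlib>

struct WrapperTypeInfo {
  const char* interface_name;
};

class ScriptWrappableLike {
 public:
  explicit ScriptWrappableLike(const WrapperTypeInfo* type_info)
      : type_info_(type_info) {}
  const WrapperTypeInfo* type_info() const { return type_info_; }

 private:
  const WrapperTypeInfo* type_info_;
};

const WrapperTypeInfo kNodeTypeInfo = {"Node"};

class NodeLike : public ScriptWrappableLike {
 public:
  NodeLike() : ScriptWrappableLike(&kNodeTypeInfo) {}
};

// What a generated binding might do before trusting a native pointer pulled
// out of a JS wrapper object.
NodeLike* ToNodeChecked(ScriptWrappableLike* native) {
  if (native->type_info() != &kNodeTypeInfo) {
    // Type confusion detected: crash safely rather than reinterpret memory
    // as the wrong type.
    fprintf(stderr, "bindings integrity check failed for %s\n",
            native->type_info()->interface_name);
    abort();
  }
  return static_cast<NodeLike*>(native);
}

int main() {
  NodeLike node;
  ToNodeChecked(&node);  // Passes: the tag matches.
  return 0;
}
```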


Work across platforms

One of our key goals is to get Chrome running natively on 64-bit Windows, where the platform mitigations against certain attacks (such as heap spray) are stronger than when running within a WOW64 process. (We’ve also seen a performance bump for graphics and media on 64-bit Windows!) We made serious progress on this work in Q1, coordinating with engineers on a dozen different teams to land fixes in our codebase (and dependencies), working with Adobe on early Flapper builds, porting components of the Windows sandbox to Win64, and landing 100+ generic Win64 build system and API fixes. Thanks to all that have made this possible!


As Chrome usage on mobile platforms increases, so too must our security attention. We’ve set out some short and long-term goals for mobile Chrome security, and are excited to start working with the Clank team on better sandboxing and improved HTTPS authentication.


Site isolation

Work continues on the ambitious project to support site-per-process sandboxing, which should help us prevent additional attacks aimed at stealing or tampering with user data from a specific site. Last quarter, we published a more complete design for out-of-process iframes, set up performance and testing infrastructure, and hacked together a prototype implementation that helped confirm the feasibility of this project and surface some challenges and open questions that need more investigation.


Extensions

When not feeding the team fish, meacer added a lot of features to Navitron to make flagged extensions easier to review and remove from the Web Store. To put this work in perspective, each week ~X new items are submitted to the Web Store, ~Y of them are automatically flagged as malware (and taken down), and ~Z malware escalations come from extension reviewers (and are then reviewed again by security). meacer also added a fuzzer for extensions and apps APIs, and has been fixing the resulting bugs.


Until we meet again (probably in the issue tracker)...
Parisa, on behalf of Chrome Security