Tremendous effort is underway to make the Web faster. Quantifying speed on the Web, however, is complex: we are usually attempting to capture human perception with a computer-generated metric. In many studies, participants are simply shown a page loading, in person, in a controlled environment, an approach with an obvious scalability problem.

MAMI partners at Telefonica Research (in collaboration with Carnegie Mellon University) took a different approach and built Eyeorg, an automated system for crowdsourcing Web Quality of Experience (QoE) measurements. Eyeorg recruits crowdsourced participants to scale, and shows them videos of pages loading so that every participant has a consistent experience, regardless of their network connection or device configuration.

In their paper, to be published at CoNEXT 2016, they present hands-on experience from using Eyeorg to 1) study the quality of several Page Load Time (PLT) metrics, 2) compare HTTP/1.1 and HTTP/2 performance, and 3) assess the impact of online advertisements and ad blockers on user experience. A key result is that many videos yield two modes: one for participants who consider a page “ready” once the primary content is in place, and one for those who wait for auxiliary content such as advertisements (see below).

These results show the potential of Eyeorg to measure how changes to the Web affect users. For example, Eyeorg could be used to evaluate TCP vs. QUIC, TLS 1.2 vs. TLS 1.3, HTTP/2 push and priority strategies, web design techniques such as domain sharding or image spriting, browser plugins, or even in-network services such as Google’s Flywheel compression proxy.
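The two-mode pattern can be made concrete with a small sketch: given the “page ready” timestamps that crowd workers report for one video, a simple two-cluster split separates the “primary content” answers from the “full load” answers. This is only an illustration under assumed inputs: the `clicks` values and the `two_means` helper below are hypothetical, not Eyeorg’s actual data or analysis code.

```python
# Hypothetical "page ready" timestamps (in seconds) reported by crowd
# workers for one page-load video. Illustrative values, not Eyeorg data.
clicks = [1.9, 2.1, 2.0, 2.2, 1.8, 5.0, 5.3, 4.9, 5.1, 5.2]

def two_means(samples, iters=20):
    """Simple 1-D k-means with k=2: split answers into two modes."""
    c = [min(samples), max(samples)]  # start centroids at the extremes
    groups = ([], [])
    for _ in range(iters):
        groups = ([], [])
        for x in samples:
            # assign x to the nearer centroid (True indexes group 1)
            groups[abs(x - c[1]) < abs(x - c[0])].append(x)
        # recompute each centroid as the mean of its group
        c = [sum(g) / len(g) if g else c[i] for i, g in enumerate(groups)]
    return sorted(c), groups

centroids, groups = two_means(clicks)
print(centroids)  # → [2.0, 5.1]: one mode near primary content, one near full load
```

On this toy input the split recovers a mode around 2 s (primary content in place) and one around 5 s (auxiliary content loaded); a real analysis would of course use many more judgments per video.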