For the last year or so, I have been solely focused on wrapping up my Ph.D. work. As such, I often find myself lost in the gritty, low-level details, missing opportunities to take in the wider view of what we are trying to accomplish in the greater research community.

Therefore, a recent interview took me by surprise (although it should not have) by posing the same question in multiple sessions, namely “how would you fix the Internet?”. My first thought: “am I expected to answer this differently in every session?”. Since then, I have pondered it a bit, coming away with a rather dire view of the Internet and its future.

I began by attempting to identify what I believe to be the largest issues facing the Internet. Then I envisioned the Internet, its protocols, actors, and the relationships among them, as a complex system. The study of such abstractions is applied in many domains and is most useful for identifying leverage points: components where small changes have significant impact. Finally, I made an effort to classify the catalysts that might enact the changes in each proposal.

So, in my view, to fix the Internet we need to:

Transfer data more effectively and efficiently

I am, at my roots, a 1’s and 0’s type of guy. I think any discussion about Internet improvements must include a (hopefully brief) low-level analysis. There are glaring inefficiencies in the underlying protocols driving data transfer. For example, TCP over Ethernet typically delivers well below the theoretical link speed once framing, header overhead, and congestion control are accounted for. In my understanding, there was plenty of work in the 90’s proposing new transport protocols to drastically improve performance. Unfortunately, like similar networking improvements, these face significant deployment issues, foremost of which may be interoperability with existing solutions, which makes adoption at scale nearly impossible, or slow-moving at best (yeah, we’re looking at you, IPv6). However, we shouldn’t be overly critical. These protocols, at least their foundations, were developed in the 70’s and are still working relatively well today.
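
To make the overhead point concrete, here is a back-of-the-envelope sketch (Python, since a few lines suffice) of the best-case goodput of TCP over gigabit Ethernet, counting only framing and header costs. The constants are standard header sizes, not measurements, and real throughput falls further once ACK traffic, retransmissions, and congestion control enter the picture.

```python
# Back-of-the-envelope: best-case efficiency of TCP over Ethernet,
# counting only per-frame framing and header overhead (no ACK traffic,
# retransmissions, or congestion-control effects).

PREAMBLE_SFD = 8      # bytes on the wire before every frame
ETH_HDR_FCS = 18      # 14-byte Ethernet header + 4-byte FCS
INTERFRAME_GAP = 12   # mandatory idle time, expressed in byte times
IP_HDR = 20           # IPv4 header, no options
TCP_HDR = 20          # TCP header, no options (timestamps would add 12 more)
MTU = 1500            # standard Ethernet payload size

def best_case_goodput_fraction(mtu: int = MTU) -> float:
    """Fraction of raw link bandwidth left for application data."""
    payload = mtu - IP_HDR - TCP_HDR
    on_wire = mtu + PREAMBLE_SFD + ETH_HDR_FCS + INTERFRAME_GAP
    return payload / on_wire

frac = best_case_goodput_fraction()
print(f"Best case: {frac:.1%}")                      # ~94.9%
print(f"On a 1 Gbps link: ~{frac * 1000:.0f} Mbps")  # ~949 Mbps
```

So even in the ideal case a few percent of the link is gone before a single application byte moves, and that is the ceiling, not the typical experience.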

There are countless analyses of infrastructural deficiencies. To provide a brief synopsis: many rural and/or economically disadvantaged regions have very poor or limited access to the Internet. This is a real problem. As the Internet becomes a more integral societal medium, these communities are left behind, and the disadvantage compounds across social, educational, and economic opportunities. Seriously, it is the 21st century; access to high-speed Internet should be a right.

For these changes to be enacted, ISPs must be the catalyst. Unfortunately, a combination of weak competition and weak economic incentives allows these problems to be continually overlooked. Instead, the near-sightedness of shareholders and the pull of profit margins block advancements in protocol adoption and large-scale infrastructural improvements.

Lower the walls that stymie innovation at scale

Technology, especially computerization in the Internet era, benefits from fast-moving adoption of innovative solutions. We refer to the pinnacle of these efforts as disruptive applications: new technology that upends the status quo. Examples in the technology sector include Uber, Airbnb, and Netflix. A foundational issue the Internet currently faces is that existing entities have a vested interest in deterring disruptive applications in their domain, lest their user base migrate. As a result, a variety of “walls” have been erected to deter disruptive applications and competition.

In the United States there is an ongoing debate over net neutrality. At its core, net neutrality is the principle that ISPs may not favor some Internet traffic over other traffic. In its absence, an ISP can throttle competing applications below the threshold of usability, effectively locking customers into whichever applications the ISP chooses to make available. At the risk of over-politicizing this piece, net neutrality should not be a contested issue. Nationwide polls consistently show more than 75% of the population favors net neutrality, so one would think politicians, who are elected to voice the will of the people, might align on it. Unfortunately, money often has a strong pull on regulatory bodies. Concisely, net neutrality is essential to a competitive Internet.

Interoperability is the idea that one application may seamlessly interface with another through standardized, public APIs. As a budding young developer, I had the idea to build a mobile application that aggregated social media feeds into one place. As it turns out, I was far from the first. In fact, many social media platforms include specific verbiage restricting third-party use of their APIs and have pursued legal action in many contested situations. I believe users should be afforded the freedom to change platforms as desired. The current lack of interoperability effectively locks users into a specific platform or application and bars the innovation offered by new and creative solutions.
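
For a flavor of what interoperability could enable, here is a minimal sketch of the aggregator I had in mind, assuming each platform exposed a standardized public feed endpoint. The URLs, endpoint paths, and JSON shape below are all illustrative assumptions; no real platform offers exactly this today.

```python
# Hypothetical feed aggregator: merges posts from several platforms,
# assuming each exposes a standardized public JSON feed endpoint.
# The endpoints and response shape here are illustrative assumptions.
import json
from urllib.request import urlopen

FEEDS = [
    "https://platform-a.example/api/v1/feed",  # hypothetical endpoints
    "https://platform-b.example/api/v1/feed",
]

def fetch_feed(url: str) -> list[dict]:
    """Fetch one feed; assumed shape: [{"author", "text", "posted_at"}]."""
    with urlopen(url) as resp:
        return json.load(resp)

def aggregate(urls: list[str]) -> list[dict]:
    """Merge all feeds into a single reverse-chronological timeline."""
    posts = [post for url in urls for post in fetch_feed(url)]
    return sorted(posts, key=lambda p: p["posted_at"], reverse=True)

if __name__ == "__main__":
    for post in aggregate(FEEDS):
        print(f'{post["posted_at"]}  {post["author"]}: {post["text"]}')
```

The point is how little code a unified timeline takes once a common API exists; the hard part is not engineering, it is that the platforms have no incentive to offer one.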

These restrictions on innovation are uniquely positioned for regulatory oversight. Unfortunately, as noted above, such issues are typically politicized and subject to financial influence manifesting as campaign or party donations. Fortunately, there are a number of steadfast organizations (the EFF among others) advocating for a healthy and competitive Internet.

Fix the toxic culture of data hoarding

Just recently, we have seen public acknowledgement of, and subsequent outcry over, the data hoarding practices of tech giants. The implications of this easy availability and lack of oversight are still being evaluated, but it certainly doesn’t look good. The current landscape amounts to “he who has the most data has the most influence”. This means that companies with troves of user data are able to shape large-scale social behavior, a dystopic horror. Additionally, marginalized groups are disproportionately affected.

The degradation of user data ownership, and consequently of privacy protections, has put us in this situation. We see that technologies with virtuous aspirations are continually leveraged for more nefarious ends. Take, say, the entire Internet. Think about the breadth of humanity’s knowledge, the ability to contact and instantaneously exchange ideas with diverse individuals on the other side of the globe, and the magnificent societal development that could follow. Instead, we have an advertising mega-machine and cat memes. On a smaller scale, third-party cookies are discussed at length in the media. They are certainly not all bad: single sign-on, for example, leans on the same cross-site mechanisms, and it is a solution I think everybody agrees with (who wants to track 1000s of passwords?).
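
For readers unfamiliar with the mechanism, here is a toy model (all names hypothetical) of why a third-party cookie enables cross-site tracking: one tracker embedded on many sites sees the same cookie on every visit, letting it join a user’s activity across otherwise unrelated sites.

```python
# Toy model of third-party tracking via cookies. A tracker embedded on
# many sites receives the same cookie each time the browser loads its
# content, so it can link visits across unrelated sites into a profile.
import uuid
from collections import defaultdict

class Tracker:
    """Stands in for a third party whose content many sites embed."""
    def __init__(self):
        self.profiles = defaultdict(list)  # cookie id -> sites visited

    def handle_request(self, cookie: str | None, embedding_site: str) -> str:
        if cookie is None:                 # first visit anywhere: mint an id
            cookie = str(uuid.uuid4())
        self.profiles[cookie].append(embedding_site)
        return cookie                      # browser stores and re-sends this

tracker = Tracker()
cookie = tracker.handle_request(None, "news.example")
cookie = tracker.handle_request(cookie, "shop.example")
cookie = tracker.handle_request(cookie, "health.example")
print(tracker.profiles[cookie])  # ['news.example', 'shop.example', 'health.example']
```

The same re-sent identifier is what makes some cross-site single sign-on flows convenient; the mechanism is neutral, the business model built on it is not.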

Fortunately, there are a large number of efforts, academic and industrial, to transition toward self-hosted data. Currently, I’m exploring a variety of work focused on using DFSs (Distributed File Systems) and DLTs (Distributed Ledger Technologies) to provide access control over globally distributed, self-hosted data stores. The implications cannot be overstated. Developers have a real ability to impact this issue, serving as the catalyst for more privacy-preserving technologies. It is a double-edged sword, though: the work often requires open-source contributions where financial incentives are unclear at best, and widespread adoption remains a persistent challenge.
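
To give a flavor of that pattern, here is a minimal sketch, assuming a content-addressed blob store standing in for the DFS and an append-only grant log standing in for the DLT. Every class and method here is a hypothetical stand-in for illustration, not a real library API.

```python
# Minimal sketch: access control over self-hosted data, with blobs in a
# content-addressed store (the DFS role) and permissions recorded on an
# append-only log (the DLT role). All classes are hypothetical stand-ins.
import hashlib

class ContentStore:
    """Stands in for a distributed file system: blobs keyed by hash."""
    def __init__(self):
        self._blobs = {}

    def put(self, data: bytes) -> str:
        cid = hashlib.sha256(data).hexdigest()  # content identifier
        self._blobs[cid] = data
        return cid

    def get(self, cid: str) -> bytes:
        return self._blobs[cid]

class AccessLedger:
    """Stands in for a distributed ledger: an append-only grant log."""
    def __init__(self):
        self._log = []  # (owner, cid, grantee) entries, never mutated

    def grant(self, owner: str, cid: str, grantee: str) -> None:
        self._log.append((owner, cid, grantee))

    def is_allowed(self, cid: str, requester: str) -> bool:
        return any(g == requester for _, c, g in self._log if c == cid)

def read(store: ContentStore, ledger: AccessLedger, cid: str, requester: str) -> bytes:
    """Serve a blob only if the ledger records a grant for the requester."""
    if not ledger.is_allowed(cid, requester):
        raise PermissionError(f"{requester} has no grant for {cid}")
    return store.get(cid)

# Usage: alice self-hosts a document and grants bob read access.
store, ledger = ContentStore(), AccessLedger()
cid = store.put(b"alice's research notes")
ledger.grant("alice", cid, "bob")
print(read(store, ledger, cid, "bob"))        # succeeds
# read(store, ledger, cid, "eve") would raise PermissionError
```

The appeal of the real systems is that neither role requires a central party: the store can be replicated by anyone, and the grant log is tamper-evident, so the user, not a platform, holds the keys to their own data.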