It is no secret that many of the protocols that make up the Web were not designed to accommodate a network such as the one that emerged in the early 90s. Protocols such as TCP/IP were built in an environment of trust, so they suffer from inherent vulnerabilities and security flaws that allow distributed denial of service attacks, connection hijacking, IP spoofing, and other well-documented problems. But could we rebuild it better?
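The spoofing problem mentioned above comes down to the fact that nothing in an IPv4 packet authenticates its source address: the sender simply writes whatever address it likes into the header. As an illustrative sketch (the addresses below are documentation-range examples, not real hosts), here is how a forged IPv4 header can be assembled in pure Python:

```python
import struct
import socket


def checksum(data: bytes) -> int:
    # Standard Internet checksum: one's-complement sum of 16-bit words.
    if len(data) % 2:
        data += b"\x00"
    total = sum(struct.unpack(f"!{len(data) // 2}H", data))
    while total >> 16:
        total = (total & 0xFFFF) + (total >> 16)
    return ~total & 0xFFFF


def build_ipv4_header(src: str, dst: str) -> bytes:
    # IPv4 fields: version/IHL, TOS, total length, ID, flags/fragment,
    # TTL, protocol (6 = TCP), checksum placeholder, then the addresses.
    header = struct.pack(
        "!BBHHHBBH4s4s",
        (4 << 4) | 5, 0, 20, 0, 0, 64, 6, 0,
        socket.inet_aton(src), socket.inet_aton(dst),
    )
    # Fill in the checksum (bytes 10-11); note it only guards against
    # transmission errors -- it proves nothing about who sent the packet.
    csum = checksum(header)
    return header[:10] + struct.pack("!H", csum) + header[12:]


# The source field is entirely under the sender's control:
forged = build_ipv4_header("203.0.113.99", "198.51.100.1")
```

The header checksum is the only integrity check the protocol performs, and anyone can recompute it for a forged source address, which is precisely why spoofing is so cheap for an attacker.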
Browsing: Andres Guadamuz
Does anyone else get the feeling that Internet regulation is starting to resemble a dystopian novel? Widespread surveillance (both public and private). Censorship. Lack of transparency. Prosecution of whistle-blowers. Secret courts. Governments intent on hard-wiring morality into the network. Open standards under attack. Excessive IP enforcement. Centralised architectures.
The Internet is supposed to be a distributed architecture, designed to withstand large-scale attacks. The decentralised nature of the Web allows individual systems to be taken out while the network as a whole continues to operate, rerouting traffic through the remaining nodes. This makes the Internet resilient to random attacks. Or so the theory goes.
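That resilience-through-redundancy claim can be made concrete with a toy simulation (the topology below is an invented example, not real Internet data): model the network as a graph, knock out some nodes, and check whether the survivors still form a single connected component.

```python
from collections import deque


def largest_component(nodes, edges):
    # BFS over the surviving graph to find the biggest connected component.
    adj = {n: set() for n in nodes}
    for a, b in edges:
        if a in adj and b in adj:  # edges to removed nodes disappear
            adj[a].add(b)
            adj[b].add(a)
    seen, best = set(), 0
    for start in adj:
        if start in seen:
            continue
        queue, size = deque([start]), 0
        seen.add(start)
        while queue:
            n = queue.popleft()
            size += 1
            for m in adj[n] - seen:
                seen.add(m)
                queue.append(m)
        best = max(best, size)
    return best


# A toy mesh: a ring of 10 routers plus chord links for alternate paths.
nodes = set(range(10))
edges = [(i, (i + 1) % 10) for i in range(10)] + \
        [(i, (i + 5) % 10) for i in range(5)]

# Knock out two routers; traffic reroutes around them and all
# eight survivors remain mutually reachable.
survivors = nodes - {3, 7}
print(largest_component(survivors, edges))  # prints 8
```

With the redundant chord links, random failures leave the remaining nodes connected; strip the graph down to a centralised hub-and-spoke, by contrast, and a single well-chosen removal partitions it, which is the architectural worry the column goes on to raise.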