
Keeping the Internet Independent, Universal and a Global Means of Communication

Author: Donald Heath
Title: President and Chief Executive
Organisation: Internet Society
Issue: Asia-Pacific I 1999
Article no.: 6
Topic: Keeping the Internet Independent, Universal and a Global Means of Communication
PDF size: 24KB

About author

Not available

Article abstract

More people have been influenced by the dynamic expansion of the global Internet in the past three years than perhaps by any other event in the twentieth century. With the dramatic impact of this phenomenon on so many varied aspects of our lives, governments throughout the world have either been startled into action or been drawn into Internet issues that they feel infringe on their historical and natural domains. Of course, the impact of the Internet extends beyond those elements that governments deem theirs to control or regulate; it also affects business, education, finance, religion, media, law, publishing, and many aspects of the world's diverse cultures. The level of involvement by all factions is intense and at times passionate, owing to the perceived value or the ultimate control at stake.

Full Article

With the extreme exposure the Internet commands, and with its innate capability to exacerbate or amplify otherwise benign issues, the Internet is increasingly becoming the focus of discussion in areas such as ownership and control. As we enter the third millennium, it seems most likely that for the Internet to reach its fullest potential, it will require systems of self-governance or self-regulation. We know how to create self-regulating institutions – they abound in the religious, financial, and welfare spheres. Yet the development of self-regulating systems for many functions of the Internet has become complex, at times unpleasant, and in most cases nothing short of contentious, because of what we believe is at stake. There are many reasons why establishing self-regulatory systems is so difficult, and they have at their foundation the more basic elements of human behaviour.

The organisation most often looked to as a model for the development of self-regulating systems is the Internet Engineering Task Force (IETF), which operates under the auspices of the Internet Society (ISOC). The IETF has been a marvellous example of the effectiveness of self-regulation put into practice, amazingly free of those baser human elements. Yet when the same principles are applied in attempts to evolve systems for other aspects of Internet governance, the burden of their application seems to crush the good intentions of the participants. The goals of the IETF are, naturally, the development of the technical underpinnings of the Internet and their interoperability. IETF participants are connected in a common bond dedicated to the global scaling, efficiency, security, and performance of the Internet, and their simple yet sophisticated policies and procedures ensure the best outcome.
Governments have in some cases threatened participants who are trying to create a system of self-regulation, demanding a satisfactory consensus; otherwise, the governments themselves will step in and assume the responsibility. In most cases there is an unstated threat, which serves as a great motivator for proponents of self-regulation to reach consensus. A process known as rough consensus has evolved within the IETF, originating from the earliest traditions of Internet culture, that defines which argument wins the day. The Internet reached its present robust state for many reasons:

A brilliant protocol that scales beautifully – Approximately 200,000 independent networks make up the world-wide Internet. Each network works together with the others to form the whole of the Internet by using the efficient IP protocol.

An early founding period in which the participants went virtually unnoticed – Amazing results can come about when productive, well-intentioned people are left alone with a common goal to do their best.

An environment extremely conducive to co-operation between and among the participants – The early founders needed to find ways and means – technology and operational procedures, if you will – to make interconnections between their sites. Without direction from any organisation, they were compelled to find their own way. It was out of this environment that the IETF formed and became the international standards body for the Internet.

The freedom to experiment in an uncontrolled environment – This was an interesting period, because funding was relatively easy to obtain and few requirements were imposed upon the recipient. It is very rare that any government would allow the expenditure of fairly substantial sums of money without requiring detailed explanations of what for, how, and why the money would be used. As a result, responsible individuals simply used the money in the most efficient and productive way, without interference or regulation.
An open forum using a grass-roots approach to standards development – This, of course, was the seed for the formation of the IETF, which remains the finest model of Internet self-governance we have in the inner workings of the Internet.

The one common element in the principles of the IETF and in the factors that dominated the early days of the Internet's evolution is co-operation. If co-operation is taken out of the Internet today, the Internet will cease to be anything like what we have thus far experienced. Rough consensus, which is little more than co-operation realised, is in fact arrived at through a very rigorous process that weeds out the weak proposals and encourages the best. It does not mean unanimity, but it does mean broad-based acceptance. As previously mentioned, approximately 200,000 independent networks make up the Internet. By its very nature, no one can control or own the Internet, and so we constantly face a void in determining who makes the rules. This dilemma becomes further complicated when the question is asked: who decides who makes the rules? These questions can only be satisfied through the establishment of systems for self-governance. Without being specific about what self-governance is, it is perhaps instructive to state what it is not: it is not one entity asserting or exerting too much control or authority. If such an environment arises, others who believe they should have an equal say will balk, protest, or otherwise take a contrary position. If co-operation is removed, the Internet will begin a downward spiral into oblivion. The diversity of the Internet, its basic architecture, its virtually ubiquitous geographic spread, and the fact that it truly does touch all cultures together demand flexibility in the definition and implementation of rules and self-governance systems.
This issue becomes particularly relevant when considering matters that vary significantly from country to country and culture to culture. It is unreasonable to expect that one system could be applied to all situations – at least where ethical, social, or political issues are concerned. However, issues that are more directly tied to the basic technology of the Internet, and aspects of Internet operation in which its most desirable attributes are at risk, can and should have self-governance systems that conform to the rough-consensus principle – that is, broad-based support from all major Internet stakeholders. This situation exists in the Domain Name System. The DNS, coupled with the root name servers, is at the heart of the Internet, assuring continuity in universal addressability. Today, using the techniques proven in the workings of the IETF, we (the global Internet community) have defined a new international corporation that will assume the responsibilities of assigning Internet names, numbers, and protocol parameters. These functions, or assignments, are at the very centre of Internet operation, and their central control assures that the Internet will function in a cohesive and non-fragmented manner. An organisation that evolved out of the IETF, within the ISOC, performed this function from the beginnings of Internet technology. It was an informal, non-legal entity headed by one man, Dr. Jon Postel of the University of Southern California, and was called the Internet Assigned Numbers Authority (IANA). Under the direction of Dr. Postel, the initial plan for the evolution of IANA into an internationally recognised, legally constituted organisation was created. This plan has evolved through the rough-consensus process and is now the Internet Corporation for Assigned Names and Numbers (ICANN).
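The universal addressability the DNS provides rests on a simple hierarchy: every name is resolved from the root name servers down through the top-level domain to the host itself. As a minimal sketch of that ordering – assuming a simplified model, with the helper name `resolution_path` being purely illustrative and not part of any real resolver – the zones consulted for a name can be listed like this:

```python
def resolution_path(domain: str) -> list[str]:
    """Return the DNS zones consulted, in order, when resolving `domain`."""
    labels = domain.rstrip(".").split(".")
    path = ["."]  # every lookup starts at the root name servers
    # Walk from the TLD (rightmost label) toward the full name.
    for i in range(len(labels) - 1, -1, -1):
        path.append(".".join(labels[i:]) + ".")
    return path

print(resolution_path("www.example.com"))
# ['.', 'com.', 'example.com.', 'www.example.com.']
```

The ordering makes the governance stakes concrete: whoever operates the root and the TLD registries sits above every name beneath them, which is why central but neutral control of these assignments matters.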
ICANN is deep into the process of establishing the administrative and organisational policies and procedures needed to effect the transition to ICANN of the functions that IANA has performed throughout the history of the Internet. The current focus is on the creation of three key bodies that will form part of the overall ICANN purview: the so-called Supporting Organisations, or SOs. Three SOs are being created: one for IP address issues, one for protocol issues – the protocol support organisation – and one for domain name issues – the DNSO, or domain name support organisation. Of these, the most difficult to form is the DNSO. Issues surrounding this function have triggered controversy for the three years expended so far in attempts to create this self-regulating structure. The reasons are many, and include potential conflicts over intellectual property rights between domain name holders and trade mark holders; the holding of domain names for the purposes of extortion or resale; and the right to become a registry or a registrar. This last point seems to be at the root of much of the contention. It is perceived by many that a registry – that is, an organisation that controls a top-level domain (TLD) such as .com – has the potential to make a great deal of money. As a result, many want to make certain that the policies, procedures, and organisations put in place to decide who can become a registry are perfectly unbiased, and that no undue influence can be exerted on their decisions.

Conclusion

Anyone with an interest in the evolution of the Internet should keep their eyes on developments within ICANN. This entire process is perhaps more far-reaching than it may appear. It is a real test of the Internet community's ability to actually effect self-governance, or self-regulation.
If we, together, can create an effective self-regulating organisation for this relatively well-defined portion of Internet administration, it will bode well for the development of future self-governance initiatives. If we fail in this relatively simple attempt – simple at least compared with such things as regulation of content – then the likelihood of our ever meeting the critical requirement of keeping the Internet an independent, universal, and global means of communication will be dealt a severe blow. The principles of the Internet Society and the procedures of the IETF have been injected thoroughly into the processes forming ICANN and its associated SOs, and should assure that we implement an administrative system, through ICANN, that is consistent with the attributes that have made the Internet successful to date. It is a story whose end is not yet written.

