Chapter 3 Technology of Electronic Commerce 3.1 A short history of the Internet and the Web The Internet was the result of some visionary thinking by people in the early 1960s who saw great potential value in allowing computers to share information on research and development in scientific and military fields. J.C.R. Licklider of MIT first proposed a global network of computers in 1962, and moved over to the Defense Advanced Research Projects Agency (DARPA) in late 1962 to head the work to develop it. Leonard Kleinrock of MIT and later UCLA developed the theory of packet switching, which was to form the basis of Internet connections. Lawrence Roberts of MIT connected a Massachusetts computer with a California computer in 1965 over dial-up telephone lines. It showed the feasibility of wide area networking, but also showed that the telephone line's circuit switching was inadequate. Kleinrock's packet switching theory was confirmed. Roberts moved over to DARPA in 1966 and developed his plan for ARPANET. These visionaries and many more left unnamed here are the real founders of the Internet. The Internet, then known as ARPANET, was brought online in 1969 under a contract let by the renamed Advanced Research Projects Agency (ARPA), which initially connected four major computers at universities in the southwestern US (UCLA, Stanford Research Institute, UCSB, and the University of Utah). The contract was carried out by BBN of Cambridge, MA under Bob Kahn, and the network went online in December 1969. By June 1970, MIT, Harvard, BBN, and Systems Development Corp. (SDC) in Santa Monica, California were added. By January 1971, Stanford, MIT's Lincoln Labs, Carnegie-Mellon, and Case-Western Reserve U were added. In the months to come, NASA/Ames, Mitre, Burroughs, RAND, and the U of Illinois plugged in. After that, there were far too many to keep listing here. 
The Internet was designed in part to provide a communications network that would work even if some of the sites were destroyed by nuclear attack. If the most direct route was not available, routers would direct traffic around the network via alternate routes. The early Internet was used by computer experts, engineers, scientists, and librarians. There was nothing friendly about it. There were no home or office personal computers in those days, and anyone who used it, whether a computer professional or an engineer or scientist or librarian, had to learn to use a very complex system. E-mail was adapted for ARPANET by Ray Tomlinson of BBN in 1972. He picked the @ symbol from the available symbols on his teletype to link the username and address. The telnet protocol, enabling logging on to a remote computer, was published as a Request for Comments (RFC) in 1972. RFCs are a means of sharing developmental work throughout the community. The ftp protocol, enabling file transfers between Internet sites, was published as an RFC in 1973, and from then on RFCs were available electronically to anyone who had use of the ftp protocol. The Internet matured in the 1970s as a result of the TCP/IP architecture, first proposed by Bob Kahn at BBN and further developed by Kahn and Vint Cerf at Stanford and others throughout the decade. It was adopted by the Defense Department in 1980, replacing the earlier Network Control Protocol (NCP), and universally adopted by 1983. The Unix to Unix Copy Protocol (UUCP) was invented in 1978 at Bell Labs. Usenet was started in 1979 based on UUCP. Newsgroups, which are discussion groups focusing on a topic, followed, providing a means of exchanging information throughout the world. While Usenet is not considered part of the Internet, since it does not share the use of TCP/IP, it linked UNIX systems around the world, and many Internet sites took advantage of the availability of newsgroups. 
It was a significant part of the community building that took place on the networks. In 1986, the National Science Foundation funded NSFNet as a cross-country 56 Kbps backbone for the Internet. The NSF maintained its sponsorship for nearly a decade, setting rules for its non-commercial government and research uses. As the commands for e-mail, FTP, and telnet were standardized, it became a lot easier for non-technical people to learn to use the nets. It was not easy by today's standards by any means,
but it did open up use of the Internet to many more people, in universities in particular. Other departments besides the libraries, computer, physics, and engineering departments found ways to make good use of the nets--to communicate with colleagues around the world and to share files and resources. While the number of sites on the Internet was small, it was fairly easy to keep track of the resources of interest that were available. But as more and more universities and organizations--and their libraries--connected, the Internet became harder and harder to track, and there was more and more need for tools to index the resources that were available. In 1991, the first really friendly interface to the Internet was developed at the University of Minnesota. The University wanted to develop a simple menu system to access files and information on campus through their local network. A debate followed between mainframe adherents and those who believed in smaller systems with client-server architecture. The mainframe adherents "won" the debate initially, but since the client-server advocates said they could put up a prototype very quickly, they were given the go-ahead to do a demonstration system. The demonstration system was called a gopher after the U of Minnesota mascot--the golden gopher. The gopher proved to be very prolific, and within a few years there were over 10,000 gophers around the world. Using a gopher requires no knowledge of UNIX or computer architecture: you simply type or click on a number to select the menu item you want. Gopher's usability was enhanced further when the University of Nevada at Reno developed the VERONICA searchable index of gopher menus. It was purported to be an acronym for Very Easy Rodent-Oriented Netwide Index to Computerized Archives. A spider crawled gopher menus around the world, collecting links and retrieving them for the index. 
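The numbered-menu mechanic that made gopher so approachable can be sketched in a few lines of Python. The menu entries and addresses below are hypothetical (real gopher menus also carried item-type, host, and port data), but the interaction is the same: the user only ever types a number, and the system resolves it to an address the user never has to see.

```python
def render_menu(items):
    """Return the numbered listing a gopher client would display."""
    return "\n".join(f"{i}. {title}" for i, (title, _) in enumerate(items, start=1))

def select(items, choice):
    """Resolve the number the user typed to the target hidden behind it."""
    title, target = items[choice - 1]
    return target

# Hypothetical campus menu, in the spirit of the U of Minnesota system.
menu = [
    ("About the University", "gopher://host.example/0/about"),
    ("Campus Phone Book", "gopher://host.example/7/phones"),
    ("Other Gopher Servers", "gopher://host.example/1/others"),
]
print(render_menu(menu))
```

Typing "2" at such a menu would fetch the phone book without the user ever handling its address.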
VERONICA was so popular that it was very hard to connect to, even though a number of other VERONICA sites were developed to ease the load. Similar indexing software was developed for single sites, called JUGHEAD (Jonzy's Universal Gopher Hierarchy Excavation And Display). In 1989, another significant event took place in making the nets easier to use. Tim Berners-Lee and others at the European Laboratory for Particle Physics, more popularly known as CERN, proposed a new protocol for information distribution. This protocol, which became the World Wide Web in 1991, was based on hypertext--a system of embedding links in text to link to other text, which you have been using every time you selected a text link while reading these pages. Although started before gopher, it was slower to develop. The development in 1993 of the graphical browser Mosaic by Marc Andreessen and his team at the National Center for Supercomputing Applications (NCSA) gave the protocol its big boost. Later, Andreessen moved on to become the brains behind Netscape Corp., which produced the most successful graphical browser and server until Microsoft declared war and developed its own Microsoft Internet Explorer. Since the Internet was initially funded by the government, it was originally limited to research, education, and government uses. Commercial uses were prohibited unless they directly served the goals of research and education. This policy continued until the early 90s, when independent commercial networks began to grow. It then became possible to route traffic across the country from one commercial site to another without passing through the government-funded NSFNet Internet backbone. Microsoft's full-scale entry into the browser, server, and Internet Service Provider market completed the major shift over to a commercially based Internet. 
The release of Windows 98 in June 1998, with the Microsoft browser well integrated into the desktop, shows Bill Gates' determination to capitalize on the enormous growth of the Internet. Microsoft's success over the past few years has brought court challenges to its dominance. A current trend with major implications for the future is the growth of high-speed connections. 56K modems and the providers who support them are spreading widely, but this is just a small step compared to what will follow. 56K is not fast enough to carry multimedia, such as sound and video, except in low quality. But new technologies many times faster, such as cable modems, digital subscriber lines (DSL), and satellite broadcast, are available in limited locations now, and will become widely available in the next few years. These technologies present problems, not just in the user's connection, but in maintaining high-speed data flow reliably from source to user. Those problems are being worked on, too. During this period of enormous growth, businesses entering the Internet arena scrambled to find economic models that work. Free services supported by advertising shifted some of the direct
costs away from the consumer--temporarily. Services such as Delphi offered free web pages, chat rooms, and message boards for community building. Online sales have grown rapidly for such products as books, music CDs, and computers, but the profit margins are slim when price comparisons are so easy, and public trust in online security is still shaky. Business models that have worked well are portal sites, which try to provide everything for everybody, and live auctions. AOL's acquisition of Time-Warner was the largest merger in history when it took place and shows the enormous growth of Internet business! The stock market has had a rocky ride, swooping up and down as the new technology companies, the dot-coms, encountered good news and bad. The decline in advertising income spelled doom for many dot-coms, and a major shakeout and search for better business models is underway by the survivors. It is becoming more and more clear that many free services will not survive. While many users still expect a free ride, there are fewer and fewer providers who can find a way to provide it. The value of the Internet and the Web is undeniable, but there is a lot of shaking out to do, and management of costs and expectations, before it can regain its rapid growth. 3.2 Internet Services and Tools 3.2.1 World Wide Web (WWW) The World Wide Web (WWW) is the latest development in network technologies for information presentation. It allows people to publish a vast array of information using text documents, pictures, sounds, movies, and other types of documents. The documents that are published are placed on a networked computer (the server). People can then use other computers (usually desktop-based Macintosh and PC computers, called clients) to connect to the server and look at the documents that reside there. When a person wants to look at WWW documents, they have to use a browsing program that knows how to connect to the server and how to display the information that is stored there. 
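Underneath, the exchange between browser and server is a short plain-text request followed by the document in reply. A minimal sketch in Python of the request side (the host name and path are hypothetical, and no connection is actually opened; this simply composes an HTTP/1.0-style request of the kind an early browser would send):

```python
def build_get_request(host, path="/"):
    """Compose the plain-text request a browser sends to a web server."""
    return (
        f"GET {path} HTTP/1.0\r\n"   # which document the client wants
        f"Host: {host}\r\n"          # which server it is asking
        "\r\n"                       # a blank line ends the request headers
    )

request = build_get_request("www.example.com", "/index.html")
```

Sent over a TCP connection to the server, a request like this would come back with the HTML of the page, which the browser then renders.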
Two popular browsing programs are Microsoft Internet Explorer and Netscape. The greatest innovation of the Web is that it eliminates the need to know the network address of a computer - rather, computer addresses are stored as hypertext links that one can click on with a mouse to move from one document to another. Each of the links can point to documents that are on the same server or on a different one across the world. Thus, getting from a computer here at UCSB to one in Australia or Germany is simply one mouse-click away - no complicated addresses to remember or complex instructions to execute. In addition, the Web allows transparent access to both text documents and binary documents (such as graphics, sound, video, and other binary data types). This permits liberal use of pictures and sounds in the information presentations that are created. Another advantage of the Web is its ability to recognize a special formatting language called HTML (Hyper-Text Markup Language). This permits fine control of the appearance of documents that are sent over the net - thus, authors can use boldfaced or italicized type for emphasis, insert tables in their documents, and place pictures and other graphic elements throughout their documents. Finally, many Web browsers (including Netscape) are capable of communicating via the older network protocols like ftp and gopher as well. Thus, from within a browser like Netscape one can view Web documents, transfer files via FTP, read Net News, send e-mail, and look at Gopher sites. All that from one program! 3.2.2 Electronic Mail (Email) 1. General Description of Electronic Mail Electronic mail, sometimes called email, is a computer-based method of sending messages from one computer user to another. These messages usually consist of individual pieces of text which you can send to another computer user even if the other user is not logged in (i.e. using the computer) at the time you send your message. 
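The structure of such a message - a sender, a recipient, a subject, and a body, with the @ sign separating username from host in each address - can be sketched with Python's standard email library. The addresses below are hypothetical, and actually delivering the message would additionally require a mail server:

```python
from email.message import EmailMessage

def compose_mail(sender, recipient, subject, body):
    """Build an email message; the @ sign links username and host."""
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = recipient
    msg["Subject"] = subject
    msg.set_content(body)   # the text the recipient will read
    return msg

msg = compose_mail("alice@example.org", "bob@example.org",
                   "Greetings", "Hello Bob, read this whenever you log in.")
```

Handing such a message to a mail server for delivery is the job of a separate protocol (SMTP), sketched further below only in outline.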
The message can then be read at a later time. This procedure is analogous to sending and receiving a letter. Originally, email messages were restricted to simple text, but now many systems can handle more complicated formats, such as graphics and word-processed documents. Just as you can send attachments with your letters in the normal post (snail mail), you can also send computer
documents, graphics, software, or anything else that can be turned into a digital form attached to an email message. When mail is received on a computer system, it is usually stored in an electronic mailbox for the recipient to read later. Electronic mailboxes are usually special files on a computer which can be accessed using various commands. Each user normally has their own individual mailbox. 2. How does it work? In order for messages to be sent from one computer to another, your message needs to be converted into a digital form and forwarded to a computer that acts as a mail server or post office. This mail server sorts and directs your mail for you. A mail server can only direct mail to all users, though, if it is connected to a network that all the users are also connected to. This network can be internal (a stand-alone network), which means you can only send email to other users on that network. If your mail server is connected to the Internet, you can also send your email messages to any other computer user connected to the Internet anywhere in the world, because the Internet is a network of all the smaller networks of organizations around the world. The mail server can be within your organization or with an Internet Service Provider, so you would connect to it by logging into your email account. When you send your email message, the mail server decides whether the message is addressed to a user on its immediate network or should be passed on to the nearest mail server on another network; each mail server keeps passing the message on until it reaches its intended destination. This is known as the store-and-forward system: your message is stored at various points on the path to its receiver, waiting for the link to be free so it can be forwarded on the next part of its journey. 
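The local-or-forward decision each server makes can be sketched as a toy function. The domain names below are hypothetical, and a real mail server would consult DNS to find the next hop, which is omitted here; the sketch only shows the branching logic described above:

```python
def route(message, dest_addr, local_domain, mailboxes, forward_queue):
    """Store the message in a local mailbox if the recipient is ours,
    otherwise queue it for forwarding to the next mail server."""
    user, domain = dest_addr.split("@")
    if domain == local_domain:
        mailboxes.setdefault(user, []).append(message)   # deliver locally
    else:
        forward_queue.append((dest_addr, message))       # store and forward

# A server for the hypothetical domain campus.edu handling two messages:
mailboxes, queue = {}, []
route("lunch?", "ann@campus.edu", "campus.edu", mailboxes, queue)
route("report", "bob@elsewhere.org", "campus.edu", mailboxes, queue)
```

The first message lands in ann's local mailbox; the second waits in the forward queue until a link to a server for elsewhere.org is available.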
The digital data can be broken up and follow different paths (go through different mail servers) to get to its destination, but it always meets up when it gets to the post office or mail server of the recipient. The mail will stay at the post office until the recipient decides to collect it, which is done by logging into their email account. 3.2.3 Bulletin Board Systems (BBS) 1. Bulletin Board Systems Bulletin Board Systems (BBS) are self-contained online communities. A BBS is almost like a miniaturized Internet. A BBS has a variety of things for users to do - Read and Write messages in Discussion Forums, Upload and Download files, and Play online games. Most BBS systems offer a general perspective - they offer a little bit of everything. General perspective BBS systems do not have a topical theme. Some are dedicated to carrying large numbers of message forums covering a wide variety of topics and interests, using networks that you will never see on the Internet. Other BBS systems are specialized and follow specific themes such as Genealogy, Sports, Programming, Religion, or just about anything you can imagine. These systems usually carry message forum conferences related to their themes. Others might host a large selection of online games for their users to enjoy. Though dwarfed by the amount of software now found on the Internet, some BBS systems are large repositories of a wide variety of software - and, in some cases, software not found anywhere on the Web. As with any community, you will find some systems that appeal to you and others that may not. After checking out various BBS systems, you will find several you will call "home". Then you will have yet another place to happily waste time having fun with your computer - just like the Web! 2. Types of BBS Systems There are generally three types of BBS systems: Dial-Up, Telnet and Web-based. Most BBS Systems fall under the first two categories.
Once you have connected to a Dial-Up or Telnet BBS system, you will notice things are different from when you are surfing the World Wide Web. The graphics are not of the picture quality you are familiar with on the Web. Another big difference is that on most BBS systems your pointing device (mouse) does not work. Some
BBS users notice these differences and never see all that the BBS has to offer. These users have truly missed out. BBS software is currently under major re-development. Many developers are working towards making their BBS applications look and feel just like the Web. Until these applications are ready, the traditional ANSI video graphic interface will just have to do. ▪ Traditional Dial-Up Based BBS Systems. Until recently, most Bulletin Board Systems used the concept of connecting personal computers together via modems using regular telephone lines. There are still a large number of "dial-up" based BBS systems in use worldwide. Accessing these dial-up BBS systems is different from accessing your dial-up Internet provider. You will need something known as Terminal Software that allows you to use your computer modem to "call" these BBSes via the regular telephone network. There are many kinds to choose from. ▪ Telnet BBS Systems - BBS systems on the Internet. The Internet has added greater accessibility to Bulletin Boards. Instead of having to dial BBSes individually, many systems are now available via the Internet using the Internet protocol called Telnet. Telnet BBS systems allow greater flexibility for the BBS user, since users call their Internet Service Provider as usual, then use special Telnet Client Software to access Bulletin Boards on the Internet. ▪ Web-based BBS Systems. New technology is arriving to bring the Bulletin Board System to the Internet with full point-and-click accessibility and full color graphics. These systems are still in their infancy and their numbers are few. As this technology improves you will see more of these systems in use. 3.2.4 File Transfer Protocol (FTP) 1. An Introduction to FTP FTP (File Transfer Protocol) is the Internet standard file transfer program.
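Two of the details covered in the rest of this section, anonymous versus assigned-password logins and automatic resume, can be sketched in a few lines. The host names below are hypothetical, and the functions illustrate the ideas rather than implement a working FTP client (`urllib.parse` and `os` are Python standard-library modules):

```python
# Sketch: how a client might derive FTP login credentials from an ftp:// URL,
# and how automatic resume computes the offset to restart a download from.
# Host names are hypothetical; this illustrates the ideas, not a real client.
import os
import tempfile
from urllib.parse import urlparse

def ftp_credentials(url):
    """Return (user, password), defaulting to the anonymous FTP convention."""
    parts = urlparse(url)
    user = parts.username or "anonymous"
    # Anonymous logins traditionally send an email address as the password.
    password = parts.password or "guest@example.com"
    return user, password

def resume_offset(partial_file):
    """Bytes already on disk: where a resuming client would continue from."""
    return os.path.getsize(partial_file) if os.path.exists(partial_file) else 0

# A URL with no credentials falls back to an anonymous login...
print(ftp_credentials("ftp://ftp.example.com/pub/demo.zip"))
# ...while an assigned user name and password can be carried in the URL itself.
print(ftp_credentials("ftp://carol:secret@ftp.example.com/private/report.pdf"))

# Simulate an interrupted download: only 600 bytes made it to disk.
with tempfile.NamedTemporaryFile(delete=False) as partial:
    partial.write(b"x" * 600)
print(resume_offset(partial.name))   # the next attempt continues from byte 600
```

A real FTP client would send these values with the protocol's USER and PASS commands, and would send the computed offset with the REST command before asking the server to resume the transfer.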
FTP enables you to transfer nearly any kind of file between most computers connected to the Internet by logging on to a remote computer solely for the purpose of file transfer. In current times, if you want to transfer a file to a friend, you can just attach it to an email and send it off. With high-speed bandwidth being so cheap and plentiful to the home user, transferring a file in this manner is usually more than adequate. But what if you needed to transfer the file to someone immediately, with no delays, as fast as possible, and the files you were transferring were very large? In a scenario like this, email will most likely not be adequate. This is because most email providers limit the size of your mailbox on their servers and the size of attachments you may receive, and because there is no guarantee, when you send email, how long it will take for the recipient to receive it or whether it will get there at all. This is where FTP comes in. FTP stands for File Transfer Protocol and is used to transfer files between an FTP server and another computer. In the past, FTP was much more common than it is today and was the dominant file transfer mechanism on the Internet. If you needed to transfer files between two computers, you would use FTP to do so. FTP is still very popular today when a service requires that a lot of files be hosted for other people to download. FTP also tends to be faster than other contemporary methods of transferring files because it was designed to be. Even more important, FTP supports automatic resume. This means that if you are downloading the latest new game demo that is over 600 megs, and for some reason the download stops in the middle of the transfer, the FTP client will attempt, on the next download of the same file, to continue from where you left off. This feature can save you a huge amount of time, but it is generally only found in specialized FTP client software and not in your browser software. 2.
How to Connect to an FTP Server There are two approaches to allowing users to connect to an FTP server. The first is to let anyone log in anonymously, otherwise known as anonymous FTP; the second is to assign user names and passwords that people must use to log in to the server. The two most common ways to connect to an FTP server are with your Web Browser or with a specialized FTP Client. To connect to an FTP server with your browser, you prefix the hostname you are connecting to with the ftp:// protocol statement. For example, ftp://www.bleepingcomputer.com. It would then try to connect anonymously. If the server you