SIGGRAPH PUBLIC POLICY

Vol.33 No.2 May 1999
ACM SIGGRAPH

“Last-Mile” Bandwidth Recap and Committee Survey Activity



Bob Ellis



For people interested in graphics and graphical user interfaces, the greatest shortcoming of the Internet is bandwidth, particularly on the lines to consumer premises. Although most computing professionals have access to high-bandwidth connections, most consumers do not. Consumer access to computing and the Internet is probably the most significant development in computing since 1980. This market now drives computing, and hence computer graphics, so it behooves all of us to understand the “last-mile” issues, both technical and political.

I have enlisted the assistance of SIGGRAPH Public Policy Committee member Myles Losch to co-author this report with me. I’d also like to give advance notice of the forthcoming Public Policy BOF at the SIGGRAPH 99 conference in Los Angeles, where Myles and I, perhaps joined by an outside speaker, will speak on this topic. At the time of writing, the BOF has yet to be scheduled, but traditionally it has been held in the early afternoon on the Thursday of the conference.

Following up on our announcement of the first on-line public policy survey in the last issue, committee members Laurie Reinhart and David Nelson present the initial results. Don’t forget that sometime in April the second survey replaced the first one. The second survey looks at computer graphics developments that will be important in research and commercial areas, and is our contribution to the forward-looking events of the 30th-anniversary celebration at SIGGRAPH 99.

— Bob Ellis

“Last-Mile” Bandwidth Recap



Bob Ellis and Myles Losch

Although bandwidth and computing capacity are issues throughout the Internet, their importance is perhaps felt most acutely by people who connect through an Internet service provider (ISP). Apart from content, these are the issues that figure most heavily in the public’s perception of the Internet. In the following sections we take a look at current technologies, survey technical issues and review policy issues associated with the new, high-bandwidth data communications services.

Computer professionals all know that response time is a critical aspect of user-computer interaction. Although often reported, it has most recently been described by Jakob Nielsen (Nielsen, Jakob, “User Interface Directions for the Web,” Communications of the ACM, January 1999/Vol. 42, No. 1, pp. 65-72 and http://www.useit.com/) in relation to the Web. He gives the necessary response times as less than 0.1 second for feedback to appear instantaneous, less than 1.0 second for the user’s flow of thought to stay uninterrupted and 10 seconds as the limit for keeping the user’s attention focused on the dialogue.
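Nielsen’s three thresholds can be expressed as a simple classifier. This is an illustrative sketch only; the function and band names are our own, not part of any standard library.

```python
def feedback_band(seconds: float) -> str:
    """Map a measured response time to one of Nielsen's perceptual bands."""
    if seconds < 0.1:
        return "instantaneous"       # feels like direct manipulation
    if seconds < 1.0:
        return "uninterrupted flow"  # delay is noticed, but thought stays on task
    if seconds <= 10.0:
        return "attention kept"      # user needs feedback, e.g. a progress bar
    return "attention lost"          # user is likely to switch tasks


print(feedback_band(0.05))  # instantaneous
print(feedback_band(4.0))   # attention kept
```

A designer might use such a classification to decide, for example, when a page must show a progress indicator rather than simply waiting.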

Why is this so important? It is clear that high-speed communications provide a more positive graphics experience on the Internet. But just the ability to download graphics faster is not the primary benefit. One way to enable more people to use the Internet effectively is to present it as a graphical experience, both in terms of content and user interface (the latter requiring response times as described above). 

Two examples come to mind. All of the new models for health care assume that individuals will take greater responsibility for their own interactions with health care professionals. The key to this is access to medical records. Imagine if these records were all on line (with suitable security and privacy protections) and patients could inspect their record by graphically exploring the data, looking at a chart showing the timeline of significant events. See the LifeLines project of Ben Shneiderman and colleagues at the University of Maryland (http://www.cs.umd.edu/hcil/lifelines/).

The second example is the problem we have all faced of being “lost in cyberspace.” How many times have you worked your way through a website (or series of websites) only to discover that you had no idea of where you were in the structure or how to get back to a previous point? Although many sites try to provide some textual information on location, what’s missing is the computer equivalent of the familiar “You Are Here” maps found in physical environments such as museums, zoos, etc.

Why don’t these capabilities exist? They do, but not on the Internet. And they don’t exist on the Internet because the bandwidth needed to deliver the greatly increased amounts of data simply is not available to most users. In fact, many people who access the Internet over dial-up lines still do so with graphics turned off!

For a summary of the potential, see the first SIGGRAPH public policy white paper (SIGGRAPH White Paper, “Computer Graphics, Visualization, Imaging and the GII: Technical Challenges and Public Policy Issues,” http://www.siggraph.org/pub-policy/whitepaperGII.html). 

Because our knowledge is primarily based on the situation in the United States, this description is of necessity oriented towards the U.S. However, some of the technologies described (e.g. the G.lite version of ADSL) are being globally standardized by the International Telecommunication Union (ITU) (http://www.itu.int/), and others (e.g. for cable TV networks) will likely be adapted to conditions elsewhere. 

Current and Near Term Future

Technologies

Currently and in the near term future, few residences and small businesses will be directly served by optical fiber, as the construction and equipment costs of this technology can seldom be recovered within a reasonable time from such consumers. Thus we don’t discuss “fiber-to-the-home.” Starting in the 1980s, a growing number of fiber-based competitive local telephone companies (telcos) were launched to serve businesses in metropolitan areas.  A few electric and/or natural gas utilities (including municipally owned ones) also began to use their rights of way for new fiber networks. The 1996 Telecom Act encouraged such trends, and this will gradually expand the number of available broadband data services, often in combination with other technologies we examine more closely below. 

It should be noted that broadband services for the residential market are usually priced at flat monthly rates between about U.S. $40 and U.S. $60, vs. typical current ISP charges for dialup access of around U.S. $20/month. We feel that these prices are generally attractive to consumers, given the improved performance.
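The pricing gap looks different when normalized to bandwidth. The following sketch uses the monthly rates quoted above; the downstream speeds paired with them are our own illustrative assumptions (a 56 Kbit/s dialup modem vs. hypothetical 1.5 and 8 Mbit/s broadband tiers), not figures from any particular provider.

```python
# (monthly price in USD, assumed downstream rate in Mbit/s)
SERVICES = {
    "dialup ISP":           (20.0, 0.056),
    "broadband, low tier":  (40.0, 1.5),
    "broadband, high tier": (60.0, 8.0),
}

for name, (price, mbps) in SERVICES.items():
    # cost per Mbit/s of downstream capacity per month
    print(f"{name}: ${price / mbps:.2f} per Mbit/s per month")
```

By this rough measure the broadband tiers cost far less per unit of bandwidth than dialup, which supports the view that the higher flat rates are attractive given the improved performance.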

Digital Subscriber Line

The telcos, plus some ISPs and new local carriers using telco wires, are starting to deploy and offer digital subscriber line (DSL) service in a number of metropolitan areas. DSL employs a high-frequency carrier signal on existing copper lines to provide high-bandwidth digital communications, and it can coexist with normal analog phone service on the same line. DSL is a continuously connected service, and DSL connections will be supported by many of the usual ISPs.

DSL comes in many variations, hence the acronym is frequently written as xDSL. The following paragraphs describe the key parameters. Note that some forms of DSL are asymmetric, meaning that the upstream bandwidth is lower than the downstream bandwidth. Depending on the application, this may or may not be a problem. Except as noted, potential data rates decrease on longer loops (with the supported limit being about 18,000 feet from the telco switching office). Line quality and configuration also affect performance, and some providers impose speed limits for marketing reasons, e.g. to enable tiered offerings priced for different customer groups.

  1. HDSL: The earliest DSL service was developed by Bellcore in the 1980s. HDSL uses two wire-pairs (all other xDSLs use one) to provide T1 or fractional T1 service at lower cost than T1’s original 1960s technology. Data rates are 1.5 Mbit/s in each direction. It is now being superseded by newer technologies.
  2. SDSL: Advanced DSP chips enable T1 service (like HDSL) to be transported on only one wire-pair, so the limited capacity of telcos’ copper cables can be used more efficiently and costs lowered. 
  3. IDSL: IDSL provides “narrowband” ISDN’s basic rate service (144 Kbit/s bi-directionally for the full 2B + D channel configuration), transported via DSL technology so as to lower the telco’s costs. 
  4. ADSL: ADSL is an asymmetric service designed for MPEG-2 compressed TV movies-on-demand, and is now used for Web surfing. It trades off upstream bandwidth (capped between 640 Kbit/s and 1 Mbit/s, depending on the implementation) so that up to 8 Mbit/s can be sent downstream. The asymmetry also deters businesses from replacing T1s with less-costly ADSL.
  5. “G.lite” or “splitterless” ADSL: This is ITU-T’s new standard for home Internet access. Unlike all other DSLs (for which an installer must visit the customer’s site), consumers simply plug their “modem” into a home phone jack. To achieve this labor-cost and usability benefit, data rates are capped at 1.5 Mbit/s downstream and 384 to 512 Kbit/s upstream.
  6. VDSL: This still-evolving short-range service is ADSL’s upward migration path. On a 1000-ft. loop (from a fiber terminal), up to the OC-1 rate of 52 Mbit/s can be sent downstream, vs. 1.5 to 6.4 Mbit/s upstream.  A symmetrical business version is to carry the E3 rate of 34 Mbit/s bi-directionally.  At VDSL’s 4500-ft. loop length limit, speed roughly equals ADSL’s. Telcos plan “fiber to the neighborhood” in support of VDSL wherever ADSL’s use is heavy enough to justify the cost of running fiber. If multichannel HDTV movies-on-demand become popular, that would also justify early VDSL rollouts. 
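The downstream rates listed above span three orders of magnitude, which translates directly into how long a graphics-rich page takes to arrive. The sketch below computes best-case transfer times at several of the quoted rates; it ignores protocol overhead and loop-length derating, so real-world figures would be worse.

```python
# Downstream rates in Mbit/s, taken from the service descriptions above.
RATES_MBPS = {
    "analog modem (56 Kbit/s)": 0.056,
    "IDSL (144 Kbit/s)":        0.144,
    "G.lite ADSL (1.5 Mbit/s)": 1.5,
    "full ADSL (8 Mbit/s)":     8.0,
    "VDSL (52 Mbit/s)":         52.0,
}


def transfer_seconds(megabytes: float, mbit_per_s: float) -> float:
    """Best-case time to move a payload: 8 bits per byte, no overhead."""
    return megabytes * 8 / mbit_per_s


for name, rate in RATES_MBPS.items():
    print(f"{name}: {transfer_seconds(1.0, rate):.1f} s per 1 MB")
```

A 1 MB image that takes over two minutes on an analog modem arrives in about a second on full-rate ADSL, which is roughly the difference between Nielsen’s attention limit being blown and being met.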

Good places to find out more about xDSL on the Web are http://www.adsl.com, http://www.ieee-occs.org/dsl_lite and http://www.uawg.org/.

Cable Data Services

Cable television companies are also rolling out their answer to the telephone companies’ xDSL services.  A special interface (misleadingly called a cable “modem”) connects a computer to the cable for data services. These interfaces are typically packaged as an external peripheral housed in its own case, like an external dial-up modem, and connect to the computer via 10 Mbit/s Ethernet.

Bandwidths are typically up to 10 Mbit/s downstream and are capped between 200 Kbit/s and 2 Mbit/s upstream. Cable data service is a shared service for several subscribers using the same channel and traveling through all these subscribers’ interface cards. Packets are encrypted as a privacy measure. It is also a continuously connected service. Cable services, while regulated, are not common carrier services so Internet access is typically provided by captive service providers, such as @Home or RoadRunner.
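Because a cable channel is shared, the bandwidth each subscriber actually sees depends on how many neighbors are active at once. The sketch below models a best-case even split of one 10 Mbit/s downstream channel; the subscriber counts are illustrative assumptions, not figures from any cable operator.

```python
def per_user_mbps(channel_mbps: float, active_users: int) -> float:
    """Best-case even split of one shared downstream channel."""
    return channel_mbps / max(active_users, 1)  # guard against zero users


for users in (1, 10, 50, 200):
    print(f"{users:>3} active users -> {per_user_mbps(10.0, users):.2f} Mbit/s each")
```

With 200 active users on a segment, each gets only about 50 Kbit/s: no better than a dialup modem. This is why operators subdivide segments with fiber as take-up grows, as discussed under Technical Issues below.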

Good places to find out more about cable data services are http://www.cablelabs.com/ and http://www.home.net/ or http://www.rr.com/.

Satellite

Satellite television providers have some limited data services. For example, DirecPC provides downstream data rates of 400 Kbit/s.  As there is no practical way to provide uplink capability to subscribers, upstream data services must use another medium, such as the telephone. Note that for some applications, such as Web surfing, the upstream data-rate requirements are quite low and could be satisfied by analog modem connections.

Good places to find out more about satellite data services are http://www.direcpc.com/ and http://www.loral.com/ — especially the latter site’s details on CyberStar and Loral Orion.

Terrestrial Wireless

Cellular telephone service has had the ability to transmit data for a long time. But the bandwidths are low and the costs are high, so the service has primarily been used by a few individuals who wish to take advantage of Internet connections (mostly for email only) while traveling.

Companies like Ricochet are now providing medium speed data connections by blanketing an area with low power transmitters/receivers to provide an Ethernet-like connectivity without wires. It is only available in a very few metropolitan areas and is used mainly by traveling professionals who want Internet access.

High-speed broadband data services are starting to be provided by companies such as WinStar, Teligent and Advanced Radio Technology. The plan is to have fiber connected local hubs that feature line-of-sight to nearby buildings. These services are provided on frequency bands as high as 38GHz. Current economics result in targeting business customers located in buildings of 10,000 square feet or greater. Consumer service is not yet available, but multi-unit dwellings like apartment houses and condominium complexes have costs of service comparable to businesses of similar size. Thus, homes of this type are an attractive future market for service providers like those named above. 

Other firms have been freed by the U.S. 1996 telecom reform to offer broadband wireless data services, and some have signaled interest in doing so.  Among the potential players are:

  • Established wireline telcos, which have for years used wireless local loop (WLL) radio equipment to serve very remote or inaccessible customers’ premises.  AT&T has been actively developing a new generation of such technology.
  • Digital cellular and PCS carriers, who are standardizing a new “third generation” (3G) family of services that carry data at up to the E1 rate of 2 Mbit/s. Note that fixed-site cellular data service dates from the 1980s, when it came into use for telemetry in scientific, industrial and security surveillance applications.
  • “Wireless cable” television service providers, and other owners of radio frequencies newly-auctioned by the FCC. 
  • Established terrestrial TV broadcasters, who may use parts of their newly granted digital TV channels for data transmission to paying customers.

Good places to find out more about terrestrial wireless data services are http://www.art-net.net/, http://www.netro-corp.com/, http://www.winstar.com/ and http://www.teligent.com/.

Technical Issues

Digital Subscriber Line

Digital subscriber line services have two major technical limitations: they are distance-limited and crosstalk-limited. The telcos’ plans for incremental extension of fiber into residential areas (noted above under VDSL) will slowly ease these constraints by shortening the copper loops used for the older forms of DSL. How serious these shortcomings turn out to be cannot really be assessed until the service is more widely deployed. DSL services may be asymmetric, with upload speeds significantly lower than download speeds. Whether this is a problem is largely application-dependent. Web surfing, for example, is generally assumed to involve large downloads triggered by a few characters of uploaded user-interaction data. One of us (Ellis) has noted in his own personal Web surfing that the ratio of uploaded to downloaded data may be as high as 1:4, a somewhat unexpected result.

Speeds are expected to be 52 Mbit/s for VDSL or (for ADSL) far less, varying with loop length and quality. New “modems” are also required, at a cost of $200 or so. There’s also the question of installation, although some services (G.lite ADSL) are user-installable.

Of course the high-speed “last-mile” service of xDSL (or any other broadband data service) is only as good as the complete path from subscriber to ISP. Subscribers should be aware that their path after reaching the telco switching office may include being aggregated or multiplexed onto a circuit with many other subscribers. This might lower their throughput and response time if not properly engineered to handle large loads.

Cable Data Services

Although many people probably do not realize it, the tree-and-branch topology of existing cable TV networks means that these services are a shared resource so there are concerns of lowered performance as more subscribers join. Cable operators plan to address this, like the telcos cited above, through incremental fiberization guided by continuous performance monitoring. Thus, the cable network’s topology will slowly evolve so that ever-fewer customers (ultimately, just one — if traffic rises by enough) use each cable segment. In addition, there are security and privacy concerns, which cable industry standards will address through per-user encryption.

Speeds are also asymmetric, with faster download than upload speeds as previously noted. Non-standard modems are required although standards are being developed. Some early ones are one-way, thus needing phone lines for uploading data. Also, lack of “plug and play” user installation has been a constraint.  And since cable and PC technicians have limited cross training in each other’s fields, they’ve had to be sent out in pairs, doubling labor costs.

Connecting the cable data interface to the computer via Ethernet requires an Ethernet interface on the computer. Home computers typically do not include one as standard equipment, and business users may not want to tie up an Ethernet port this way.

Satellite

Currently, services are provided on geosynchronous direct broadcast satellites (DBS), which are strictly one-way because the equipment used on customer premises is receive-only.  A second data service is needed for uploading. The new low- and medium-earth-orbit (LEO/MEO) communications satellites used by Iridium and other new mobile-service operators may enable two-way service, including mobile use.

Non-standard “modems” and radios are required. And the shared resource nature of the service limits capacity in the short term.

Terrestrial Wireless

Bandwidth and cost are the primary concerns, plus the need for new (and non-standard) “modems.” Other technical issues include the need for somewhat new computing technology, such as radios. Novel concerns include “line-of-sight” requirements and possible signal blockage by rain or other environmental factors. For “wireless cable” operators and terrestrial TV broadcasters, a further issue is their one-way radio infrastructures, which (like cable TV) were optimized to deliver non-interactive entertainment.

Policy Issues

Digital Subscriber Line

Although telcos, ISPs and others are beginning to advertise DSL services widely, cost and availability are definite problems. The ISDN debacle also continues to haunt the minds of potential subscribers. Some have suggested (see the 1998 Harvard workshop at http://www.ksg.harvard.edu/iip/ngct/ngct.html) that the telcos would really like their DSL to be a mostly unregulated, non-common-carrier service.

DSL availability is also tied up in the deregulation confusion from the 1996 Telecommunications Act that sets the scene for a clash between the normally state regulated telephone services and federal laws and regulations.  Access to these services by ISPs and competitive local exchange carriers (CLECs) continues to be a problem. Telcos, despite recent Supreme Court decisions against them, continue to litigate against (and thus delay) competitive service providers. 

Cable Data Services

Cable services, while regulated, are not common carrier services. This means there is no ISP choice without extra cost. Some fear that small ISPs who shelter controversial free speech could thus be forced out of business. The issue is important enough to existing ISPs and “portals” that AOL is actively campaigning for these services to be “open,” although it is not suggesting that they be assigned common carrier status.  A coalition of advocacy groups working to open cable networks to any ISP may be found at http://www.nogatekeepers.org/. Availability also continues to be a problem, partly because cable operators feel little competitive pressure to offer broadband data service quickly.

Some feel that the best way to get meaningful competition to the telcos is to let cable operators “do their thing” without requiring them to open their services to competitors. Others feel that because the telcos are under pressure to open their systems to competition, it is only fair to place the same requirement on cable operators.

While subscribers may always discontinue service, the use of non-standard “modems” means that the required equipment investment inhibits transfer to another type of service. The cable companies would also prefer to rent the interface device to you, which further limits competition. Possible FCC-forced standardization looms, which would bring an element of competition to sales of the interface equipment. But until universal (DSL, cable, satellite, etc.) interface devices become available (similar to the universal handsets that have been lacking in the United States’ non-standard digital cellular market), the investment required in fairly costly interface devices constrains ISP choice.

Another current policy issue is industry mergers, as typified by the AT&T/TCI merger.  AT&T will get access to homes over TCI’s cables and can begin to offer competitive telephone and other services.  A side issue has been the attempts of some local governments to deny approvals for the “new” TCI until they open their systems to competition. 

Satellite

Costs and competitive issues are important. Satellite television services are about where cable television was a decade or more ago, but the adoption curve has reached the point of rapid increase. Satellite data services are in their infancy. The newness makes it difficult to predict where policy issues might surface. 

As usual with a new technology, esoteric issues such as interactions between FCC antenna regulations and local regulations limiting their placement will be discovered.  Antenna placement regulations may be a particular problem for apartment and condominium dwellers whose antenna cannot “see” a desired satellite from their own window, balcony, patio or similar attachment point.

Terrestrial Wireless

Consumer terrestrial wireless data access is even less developed than satellite services. Issues include spectrum allocation, though a solution may in time emerge from the FCC’s new debate on allowing ultra-wideband, micropower CDMA services. FCC rules (also cited above for satellite services) that restrict outdoor line-of-sight radio antennas are another concern in some housing complexes. 

Summary

Deployment of broadband data services is well underway. Significant, but not insurmountable problems exist. Perhaps the biggest is that this is all caught in the deregulation of telecommunications. For example, many industry analysts agree that telcos’ congressional lobbying before passage of the 1996 Telecommunications Act, and their litigation afterward, have impeded competition in the markets they dominate. In 1998, MCI and others suggested that to counter this, telcos should be made to divest their copper loops. While there is now little interest in such drastic measures, continued failure to achieve local service choice for most customers could spur demand for them, and even for parallel steps against cable TV operators. 

A special symposium sponsored by the Harvard Information Infrastructure Project, “Next-Generation Communication Technologies: Lessons from ISDN” (http://www.ksg.harvard.edu/iip/ngct/ngct.html), was held in June 1998. While the overall results were inconclusive (everyone basically said “it wasn’t our fault”), there was a serious attempt to learn the lessons of this failed three-decade effort to digitize the “last mile.”

First, stable standards were very late, and their implementation later still ... yet for the residential customer who must rely on ISDN for both voice and data, important features are lacking even now.  Among these are convenient extension telephones and easy interfaces to consumers’ existing analog customer premises equipment (CPE) such as phones, answerers, auxiliary “ringers” (e.g., lights and gongs), wiring devices, etc. The market seemed to offer windows of opportunity in the early 1980s and again in the mid-1990s, but telcos were too slow-moving to exploit these — and some say that utility regulation left them little reason to do so.

Many states’ ISDN tariffs also wrongly priced the service, as customers saw it. Telcos’ chronically weak grasp of users’ data needs, and their poor record in order handling and tech support, were further obstacles to success.  And, AT&T’s 1984 divestiture sowed dissension among the companies that had to collaborate in order for ISDN to do well in the U.S. (The service won greater acceptance overseas.)

DSL — and cable modem service as well — are immune to some of the problems that beset ISDN, but vulnerable to others. Telcos and cable TV firms have had little success at selling and supporting advanced data services for consumers, which suggests that ISPs, CLECs and others should be helped to do so. But the incumbent owners and operators of wired infrastructure in the U.S. typically disfavor such ideas.

Experience in Canada may be useful on this point. That country’s counterpart agency to the FCC ordered that cable TV networks be opened to all data communication service providers who wish to serve cable modem customers. Many in the U.S. will observe these arrangements with interest.


Public Policy Survey Results



David Nelson and Laurie Reinhart

1. Policy Issues directly affecting computer graphics
2. General policy issues
3. Using computer graphics to resolve technical challenges of the Internet
4. Using the Internet to resolve graphics and visualization challenges
Tables 1–4 (ranked from 1 = very important to 4 = not important)

At the time of writing, the first survey has been on the website for about three weeks, so these are preliminary results.  Anticipating an increase in responses after this publication, we plan to publish the final results of the survey in another issue and on the website when the survey period is finished.

Each question had four possible rankings: very important (1), important (2), neutral (3) and not important (4).

There were 21 responses to the survey. No question in any group ranked as low as neutral. Subjects which ranked the highest included accessibility, interoperability of hardware and software, increased delivery speed and enabling access to up-to-date information for the user. Questions receiving highest ranking seem to be related to ease of use, speed and up-to-date information. Lowest ranked questions were: convergence of TV and computers, support for international diversity (lowest of all, with rank of 2.43, not quite to neutral 3.0), use of graphics to reduce information deluge from searches and catering to diverse user population by delaying production to the desktop.

The number of people answering is too small for this to be very meaningful. For example, they could have been from the same age group or from only one continent. We would appreciate it if you would please go to the survey and add your responses, so that we can have a better sampling from throughout the graphics community.

The tables on the left show details on the rankings of the questions, and of each question group overall.

1.1 Availability of cost-effective computer/communications bandwidth for graphics
1.2 Supporting diverse, general public users
1.3 Accessibility including availability, affordability, & usability
1.4 Convergence of TV & computers, including compatible standards
1.5 Political & technical support for life style changes such as telemedicine
2.1 Support for international diversity
2.2 Legal and security issues such as privacy, censorship & intellectual property rights
2.3 Applicability of existing laws vs. new laws
2.4 Interoperability of hardware and software
2.5 Scaling to support the global community
3.1 Providing more effective interfaces beyond point-and-click menus
3.2 Accommodating diverse users (i.e. scientists/educators/politicians/general public)
3.3 Organizing information effectively for the user
3.4 Increasing delivery speed by use of better image compression & other techniques
3.5 Using graphics to reduce the information deluge from searches & queries
4.1 Enabling up-to-date information availability to the user
4.2 Catering to diverse user populations by allowing users to customize their view of information
4.3 The need for additional graphics and computing R&D




Robert Ellis retired in 1993 as Sun Microsystems’ representative on the technology committee of the Computer Systems Policy Project (CSPP) and co-manager of Sun’s university research program. Previously, he held computer graphics software development and management positions with Sun, GE-Calma, Atari, Boeing and Washington University (St. Louis). He received B.S. and M.S. degrees in electrical engineering and computer science from Washington University (St. Louis). Ellis currently serves as the Chair of the Public Policy Committee of ACM’s Special Interest Group on Computer Graphics and Interactive Techniques (SIGGRAPH). 

Myles Losch is a telecommunications planner and information technology analyst. He previously held software development and telecom technology positions at Atlantic Richfield and Southern California Gas. His program planning work for the Los Angeles ACM chapter encompasses telecom and public policy topics.  As volunteer LA SIGGRAPH historian, he is tracing the chapter’s early days, in time for its 30th academic year of service (starting 9/99). In cooperation with the SIGGRAPH Public Policy Committee he planned sessions at CFP98 and CFP99. His B.S. in geology is from the City College of New York. 


Laurie Reinhart is employed by ACS Government Solutions Group as a contractor to the U.S.  Air Force Health Protection and Surveillance organization at Brooks Air Force Base in San Antonio. She has worked on databases for the Human Brain Consortium (U Minn), on family studies doing genetic epidemiological analyses and generating family trees at the Health Science Center at San Antonio, on nutritional data on volunteers at the USDA Human Nutrition Research Laboratory (U of North Dakota) and on the Menstrual Reproductive History Survey, the Alaskan Menarche Survey and several cancer studies (U Minn). She has degrees in biometry and health information systems (M.S., B.A.) and medical technology (B.S.) from the University of Minnesota, and recently in computer science (B.S.) from Montana State University where she specialized in computer graphics and image processing. Reinhart is a member of ACM and of SIGGRAPH, was a student volunteer at several conferences and has been on the SIGGRAPH Public Policy Committee since 1996.


The copyright of articles and images printed remains with the author unless otherwise indicated.

David Richard Nelson (aka Doogie) has worked extensively in medical research laboratories on immunology and aging projects using biochemistry techniques for gene sequencing, confocal microscopy, imaging and molecular visualization. He has presented several abstracts at the Federation of American Scientists in Experimental Biology and is published in a number of scientific immunological journals. David continues to do freelance computer graphics and has developed a wide variety of websites, multimedia, 3D animation and design for universities, start-ups, scientific organizations and music record labels from New York City to Miami. He is currently a member of the Association for Computing Machinery, ACM SIGGRAPH and the SIGGRAPH Public Policy Committee. David is currently finishing a degree in digital media at Full Sail Center for the Recording Arts in Florida.