Computer Science

By The Editors / November / December 2000

Norman K. Meyrowitz ’81

People who complain about keeping up with the pace of technology should have talked to Norm Meyrowitz ten years ago: he predicted all of it. In a keynote address at the Hypertext 1989 Conference in Pittsburgh, Meyrowitz said: “Sometime soon there will be an infrastructure, national and international, that supports a network and community of knowledge linking together myriad types of information for an enormous variety of audiences.” Sound familiar?

Now the president of Macromedia Ventures, Meyrowitz is the first to object to being singled out for his early prognostications of the World Wide Web. Much of his 1989 speech, he points out, was an homage to ideas expressed by Vannevar Bush in the 1940s. But as his career has made clear, Meyrowitz has a knack for figuring out how to make the future happen in the present.

In 1983 Meyrowitz co-founded Brown’s Institute for Research in Information Systems (IRIS), where he was principal architect of Intermedia, a networked, multiuser hypermedia environment that was a forerunner to the Web. Using a cable system designed for dorm-room televisions, Meyrowitz and his IRIS colleagues helped make Brown one of the first “wired” campuses in the country.

After joining the GO Corporation in 1991, Meyrowitz continued to forge ahead—although a little too quickly at first. GO’s PenPoint system was built around the same idea as today’s PalmPilot, except that in those days processors were power-hungry and drained their batteries within a week.

When he left two years later to join Macromedia as director of strategic technology, he had learned his lesson. “It’s really easy to be too early with an invention,” he says, “even when it’s clear the invention’s going to work. A lot of products try to boil the ocean, but part of being successful is doing just what you need to do.”

Under Meyrowitz’s watch, Macromedia has been a key player in the evolution of the Web. Its products—Director, Dreamweaver, Flash, and Shockwave, to name a few—have become standard tools for Web viewing and authoring. As happy as he is with their technical success, Meyrowitz also judges Macromedia’s products like any good manager: by their ubiquity. (He reports that Flash, an animation design tool, was downloaded 81 million times in September.)

“Product development is not about inventing something in a lab on your own,” Meyrowitz says. “It’s more about observing the way people work, finding their inefficiencies, and then giving them something to correct them.”

Decades before the Internet was a household word, John W. Tukey coined two terms that lie at the heart of today’s computer age: software and bit. When he died in July (see Obituaries, “The Father of Software,” page 111), newspapers across the world hailed Tukey, an eminent statistician, for his contributions to the language of high technology.

In a January 1958 article in American Mathematical Monthly, Tukey became the first person to coin a lasting term for the programming that made the era’s electronic calculators work. “Today,” he wrote, “the ‘software’ comprising the carefully planned interpretive routines, compilers, and other aspects of automative programming are at least as important to the modern electronic calculator as its ‘hardware’ of tubes, transistors, wires, tapes, and the like.”

Twelve years earlier he’d first shown his knack for language by using “bit” as a substitute for the awkward “binary digit,” which describes the ones and zeros that are the basis of all computer programs.

According to John Seely Brown, there’s a big difference between information and knowledge. “Information lives on machines and can be easily transferred,” he says. “Knowledge lives in individuals and communities and can’t just be transmitted. It has to be internalized in a way people can act on it.” Such an emphasis on the personal, human quality of knowing might not be what you’d expect from a chief scientist at Xerox’s famed Palo Alto Research Center (PARC), but in fact Seely Brown has been trying to merge the digital with the human for more than three decades.

Until earlier this year Seely Brown was Xerox PARC’s director, overseeing the research that brought us such innovations as the mouse, the laser printer, and the Ethernet network. But the focus of PARC, he says, has never been just technology. When he first began working there, for example, he joined a group of cognitive scientists who were operating under the assumption that in ten years they would have the technical skill to build just about anything. The real challenge, they decided, was not developing the technology but discerning what it meant and how it might be used. Seely Brown and his colleagues wanted to identify systems that would be “approachable, learnable, and usable,” he says. Much of his approach is described in his common-sense book, The Social Life of Information, which was published in February.

IBM may have invented the PC, but John Crawford got it to work fast enough to stick around. Crawford, who is now the director of microprocessor architecture at Intel Corporation, was the chief architect of both the Intel 386 and 486 microprocessors and a manager of the development of the Pentium chip. He and his colleagues are living proof of Moore’s Law, named after Intel cofounder Gordon E. Moore, who predicted in 1965 that the number of transistors on a chip would keep doubling at a steady pace.

Originally trained as a software engineer, Crawford joined Intel in 1977, transferring five years later to a hardware group developing the 386 chip. When the group was moved to Oregon a few months later, Crawford stayed behind and assembled a team to design a backup chip.

“We were supposed to be a stopgap, quick-to-market product,” he says. “As time went on we pretty much surpassed everyone’s expectations—which is easy to do when you’re building up from zero.” By the time Crawford’s chip hit the market in 1987, the Oregon team had long been forgotten. Crawford’s processor quickly became the standard 386, a chip that is now widely acknowledged to have saved Intel—which last year employed 63,700 people and brought in more than $25 billion in revenue.

In 1992 Crawford was named an Intel Fellow, one of the industry’s most prestigious technical positions. He is developing a processor to make the leap from 32- to 64-bit addressing—what he calls “the next step on the Moore treadmill.”

Ingrid B. Carlbom ’80 Ph.D.

In 1980 Ingrid Carlbom collected Brown’s third computer-science Ph.D. Such degrees are always notable achievements, but Carlbom’s was doubly so: she was one of only twenty-one women in the United States to earn a computer-science Ph.D. that year, less than a tenth of the number earned by men. With women now earning only about 15 percent of those degrees, it’s no wonder that Carlbom has done most of her work surrounded by men.

The other consistent features of Carlbom’s career have been her adaptability and the breadth of her influence. During her first job, at the oil-exploration-research company Schlumberger-Doll, Carlbom helped design the first artificial intelligence system for interpreting seismic data collected by field engineers. She also developed visualization software to create three-dimensional models of underground oilfields.

In 1988 she joined Digital Equipment Corporation’s Cambridge Research Lab and turned her attention to medical imaging, working in the fields of embryology and neurology as well as helping to map growth in prostate cancers. Three years later Carlbom became director of visual communications research at Lucent Technologies, where her innovations continue to expand the frontiers of computer science. She has managed the design of third-generation wireless devices that can deliver video over cell phones, of a new fingerprint-authentication system, and even of a system that delivers real-time statistics and replays for professional tennis broadcasts.

Carlbom is relieved that the number of female computer scientists is on at least a slow rise, but she continues to push for a more equal distribution of labor. “It’s a problem,” she says. “At my level of seniority I think there are five women in this industry. We’re all asked to be on the same committees.”

High school students interested in studying computers often pin their college dreams on one of two schools: MIT and the University of Washington, which boast two of the finest computer-science departments in the world. Heading each of them is a Brown alum.

John Guttag has been chair of the department of electrical engineering and computer science at MIT since 1988. His West Coast counterpart, Ed Lazowska, has been chair of Washington’s department of computer science and engineering since 1993. During those years they have trained many of the computer industry’s finest minds.

Guttag and Lazowska have done more than crank out industry drones, however. Lazowska, for example, tries to give students a broader perspective by putting them to work assisting Seattle public schools with Internet connections and technology training for teachers; he and his students have also led the effort to build a network linking all the state’s schools and colleges.

With all the quick fortunes that have been made in the computer industry, one of the biggest challenges for computer-science professors is keeping students in school. Armed with only the sketchiest knowledge, students are sometimes lured away by the force of their dot-com fantasies. “I talk to students carefully about it,” says Guttag. “They read about all the successes, but sometimes they don’t know they aren’t reading about the failures.”

Lazowska and Guttag face a constant battle to keep their curricula up to date without revamping them for every industry fad that comes along. “What can students learn that will have lasting value over time?” Guttag asks. “It’s very easy to get caught up in the pressure to teach the tools that industry thinks you ought to know. The challenge is to figure out what’s of lasting value over time.”

Both professors point to the same inspiration for their careers, especially in learning how to connect with students: the visionary technology and education professor Andries van Dam. Lazowska still recalls the first time they met: “I was drifting around from major to major in 1969 when I was fingered as someone who might be good enough to work for Andy van Dam. I was told to show up at the Watson computer center at midnight. I was ushered into the ‘machine room’ and sent to a rear corner where there was a young guy working at a fancy graphics display. I sat down and watched him work for about five minutes. At that point, he turned around and said ‘Well, asshole, aren’t you going to ask any questions?’ That was Andy.”
