
Security in new operating system a matter of (less) trust

Sometimes it seems like the human default setting is to not trust. Nations have armies, corporations have lawyers, and doors have locks. Yet people often give too much trust to their software. Perhaps it is time for computers to have HiStar, a new Unix-like operating system that pares trust among software programs down to a bare minimum. Programs can still get legitimate jobs done, even without the presumption of trust. It’s just much harder to try any funny stuff, like leaking sensitive data over networks.

“Today there is a big problem with untrustworthy code running on our machines,” says Nickolai Zeldovich, a doctoral student in the research group of Stanford computer science Assistant Professor David Mazières. Zeldovich presented HiStar to the computer science community at the 7th USENIX Symposium on Operating Systems Design and Implementation in Seattle earlier this month. His collaborators are fellow Stanford student Silas Boyd-Wickizer, UCLA Assistant Professor Eddie Kohler, and Mazières. “To start with, people download arbitrary code from Web sites and run it with full privileges on their desktop computers, leading to a rise in malware, spyware and so on. But even code you’d hope would be trustworthy, such as antivirus software, sometimes has serious remote vulnerabilities.”

In other words, people routinely run programs with privileges to read, write, and transmit a lot of information. Security rests on the naïve hope that programs will not do things that violate that trust. HiStar, in contrast, allows one to impose strict limitations on software to ensure that programs do not access, modify or disclose information they are not supposed to.

Meanwhile, authority in HiStar is decentralized. There are no trusted users with special privileges to access all data, such as the “root” user in most Unixes. The only real power of the root user in HiStar is to kick users off the system wholesale, but that crude coercive power still does not give root access to anyone’s private data.

In fact, pretty much the only universally trusted code in HiStar is the operating system’s kernel, which has been pared down to roughly 17,000 lines of code so that it is practical to audit (by contrast, the Linux kernel as of spring 2006 was nearly 7 million lines). Where there must be trust, there is also transparency.

Tainted code

The nitty-gritty of HiStar is the idea of “taint,” an idea also applied in Asbestos, a previous operating system Mazières helped design. Taint allows the operating system to enforce restrictions on information flow by making explicit the lack of trust one part of the system has for another. Consider that a tanker truck can be labeled for transporting hazardous materials: it can carry industrial chemicals, but it cannot carry fruit juice. Similarly, software programs running on HiStar are labeled with varying degrees of taint and can therefore handle some kinds of data but not others.

One example of how this works is antivirus scanning. The job of a scanner is to read through all of a user’s data to look for the signature traces of code in viruses, worms and other cybernasties. But what if the antivirus scanner itself has been compromised? In most operating systems, it would be trusted and therefore have carte blanche to send data to a hacker. In HiStar, however, an antivirus scanner runs tainted. That designation means the kernel prevents it from communicating anything over the network. The network is regarded as untainted and tainted programs cannot talk to anything less tainted than themselves. Similarly, the helper applications the scanner spawns share its taint, keeping them from the network as well.
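The flow rule described above can be sketched as a toy model. This is illustrative only, with hypothetical names; HiStar's real kernel uses a richer label scheme with multiple taint levels per category. Here a label maps a category to a taint level, and information may flow only to something at least as tainted:

```python
# A toy model of HiStar-style taint checks (illustrative only; the real
# kernel's label rules are more involved). A label maps category names
# to taint levels; anything not listed defaults to untainted (level 0).

def can_send(sender_label, receiver_label):
    """A flow is allowed only if, in every category, the sender's taint
    does not exceed the receiver's: tainted -> less-tainted is blocked."""
    categories = set(sender_label) | set(receiver_label)
    return all(sender_label.get(c, 0) <= receiver_label.get(c, 0)
               for c in categories)

scanner = {"user_data": 1}   # the antivirus scanner runs tainted
network = {}                 # the network is untainted

print(can_send(scanner, network))  # False: the scanner cannot reach the network
print(can_send(network, scanner))  # True: data may still flow to the scanner
```

Note that the rule is one-way: the scanner can still read everything it needs, but nothing it touches can escape to a less-tainted destination, which is exactly what keeps a compromised scanner from leaking data.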

Mistrust of the scanner also precludes it from sending any information, such as its progress and results, to the system’s user interface for display on the screen. Giving the user feedback therefore requires the help of a simple, 110-line “wrap” program that escorts information from a tainted source for use by an untainted entity. In the abstract labeling scheme Zeldovich and his co-authors use to describe HiStar, this ability to essentially untaint selected information is represented with a star. This symbol is the source of the operating system’s name.

If holding a star sounds like winning absolute trust, it shouldn’t. It is only valid in a very specific situation, known as a category, such as antivirus scanning. The wrap program cannot take data tainted in a different category and liberate it from the stigma it holds. In other words, wrap can allow the virus scanner to talk to the interface, but it cannot help a word processor do the same thing without getting another star for that purpose. Thinking again of tanker trucks, the truck that delivers gasoline to the station for use in cars still cannot haul other liquids for which it is not rated.
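The per-category nature of a star can be added to the toy flow check from the scanner example. Again, this is a sketch with hypothetical names, not HiStar's actual semantics: holding a star in a category lets a program bypass the taint comparison in that one category when sending, and no other.

```python
# A sketch of how a "star" might work in a toy label model (hypothetical
# names; HiStar's actual declassification rules are more involved).

def can_send(sender_label, receiver_label, stars=frozenset()):
    """As before, but a star in a category exempts that category
    from the taint comparison -- selective declassification."""
    categories = set(sender_label) | set(receiver_label)
    return all(c in stars or
               sender_label.get(c, 0) <= receiver_label.get(c, 0)
               for c in categories)

wrap = {"av_scan": 1}         # wrap shares the scanner's taint category
ui = {}                       # the user interface is untainted
word_proc = {"documents": 1}  # tainted in a *different* category

print(can_send(wrap, ui, stars={"av_scan"}))       # True: the star covers it
print(can_send(word_proc, ui, stars={"av_scan"}))  # False: wrong category
```

The second call fails because the star is scoped to the scanner's category, mirroring the tanker-truck analogy: a rating for one cargo does not extend to another.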

Meanwhile, it is notable that no matter how big and complex the virus scanner program may become (the one Zeldovich and Mazières have tested is 40,000 lines of code), the wrap program will stay small and therefore easy to audit for any corrupt code. Like the kernel, wrap only gets the trust it has because it has unusual transparency.

Getting along without trust

What’s most important about the antivirus scanner example is that the scanner can do everything it is supposed to do even though it is doing it under a cloud of suspicion. Zeldovich and Mazières are keenly aware that the operating system must offer comparable functionality and performance to less secure ones.

Another example of an important application that can run without the system’s full faith is logging in. In this case the system has an obvious reason not to trust the person at the keyboard but it is equally true that the person at the keyboard should be guarded about the system leaking his password. At the point in the process where the password is checked, taint plays a vital role in ensuring that malicious code on the system can’t reveal the user’s password over the network.

Essentially, HiStar’s login process creates an ephemeral, tainted password checking program that can compare an encrypted version of the password the user enters with a version of the password provided by the system and encrypted in the same way. The tainted password checking program can only communicate whether there is a match to one other, star-holding program: the one responsible for granting a legitimate user her privileges. Once the password checker has told the granting program there is a match, it is deleted. Dead code tells no tales.
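The shape of that password check can be sketched as follows. This is a minimal model under stated assumptions, not HiStar's actual login machinery: the checker holds the stored digest, sees the candidate password, and is structured so that the only thing it ever reveals is a single match/no-match bit (here using SHA-256 as a stand-in for whatever one-way transform the system applies).

```python
# A sketch of an ephemeral password checker that exposes only a yes/no
# answer (illustrative; HiStar's real login process differs in detail).
import hashlib

def make_checker(stored_hash):
    """Return a throwaway checker closed over the stored digest."""
    def check(candidate):
        digest = hashlib.sha256(candidate.encode()).hexdigest()
        return digest == stored_hash  # the only bit that ever leaves
    return check

stored = hashlib.sha256(b"hunter2").hexdigest()
checker = make_checker(stored)
print(checker("hunter2"))  # True
print(checker("wrong"))    # False
```

In HiStar the analogous checker runs tainted, may report its one bit only to the star-holding program that grants privileges, and is deleted once it has answered.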

For all the overhead of labeling, and tainting, and creating, and deleting in the name of mistrust, HiStar still performs well. Zeldovich and Mazières performed several benchmarking tests. For most operations (but not all) HiStar ran about as well as Linux or OpenBSD Unix.

There are some technical limitations associated with the young operating system. One of them is that administration without an all-powerful “root” user may turn out to be substantially different from what system administrators are used to. Very little about adopting a new operating system, in fact, is a trivial matter for those who do it.

With so many data security threats looming out there, Mazières says he is motivated to find ways to help people use computers despite them. A hope he has for HiStar is that it will actually allow people to do more with their computers than before because it will allow them to safely work with untrustworthy but potentially valuable resources, such as the abundant, useful code people make available for download under free software licenses.

“There are a lot more untrustworthy resources out there than trustworthy ones,” he says. “If only a small part of a system needs to be trusted, we can have more options for how to structure our systems.”