I’ve been saying this a lot in the last few months, and I’ll be saying it again in the future: the security model of most current desktop operating systems is completely crazy, and only the need to stay compatible with legacy software can justify continuing to use it, when what it really needs is a major overhaul to adapt to the world we live in. This post puts together past thoughts on the subject, and adds some new ones too.
For those who are not yet convinced that there’s a problem, let’s first ask some simple questions:
- What prevents untrusted software from deleting all or part of the personal data stored in your home folder? How many lines of code does it take to write a script doing exactly that? Would someone running this script get, at any point, a clue about what is happening before it’s too late, without having to read and understand the script first?
- When a system dialog does show up to tell you that an application is trying to do something dangerous, do you have a clue about what exactly that “something” is, or do you only have the choice between trusting the application totally and not trusting it at all? Can any application show a dialog looking like a system dialog and use it to get your password?
- Does installing new applications from the Internet (i.e. not provided by your operating system’s distributor) generally require giving a random “installer” program or script access to the whole system? In that case, is there an easy way to differentiate legit installers from malicious programs that look like installers but silently do bad things in the background?
Current versions of Windows fail at all three. Mac OS X only handles the last issue well. Other unices’ performance varies from one distribution to another, but none I’ve ever met passes all of these simple tests. Therefore, as far as security is concerned, all of these OSs can be considered weak and easily exploited. The only ways they can defend themselves are by having a market share low enough not to be an interesting target, or by relying on various anti-malware programs which detect and block as many known malware strategies as possible.
The last option is not something you should crave. As more and more kinds of malware appear, anti-malware software grows more and more complex, and therefore ends up slowing your computer down like hell. Moreover, this way of doing things is not even effective: each time new malware is released, it takes some time before it is detected and blocked by the anti-malware vendors, and all those who get infected in the meantime are… well, poor lab rats. Trying to avoid this with adaptive anti-malware is deceptive at best: it makes the software even heavier, and all those horror stories about antiviruses killing computers after a false positive on system software should quickly convince you that a blind AI is not good at everything. And finally, if people come to think that anti-malware is the mandatory road to security, we get a common attack pattern: malware pretending to be anti-malware itself.
So in short, desktop security is currently in very bad shape. OS manufacturers thought that a user/admin security model which worked fairly well in the tightly controlled server ecosystem would scale to the desktop computers of the Internet age, and obviously, judging by the size of current botnets, it is a total and absolute failure. Something better is needed; finding that something is a goal for next-generation desktop operating systems, and as such one of my main goals.
A way to solve it
In an ideal world, computers would know, through some mental connection with their user, what’s good for him, and would independently do their best to achieve it. The user would play no part in the system’s security, as the system would take care of it for him, according to his will. Sadly, this is not the world we live in. Computers are dumb machines which only do what we tell them to, and they don’t have enough information about the user to make such complex decisions on their own.
So if we don’t want to impose our own vision of what’s good for the user, like some other new operating systems do, we have to give the user some level of responsibility for what he does with his computer. This does not mean cowardly putting all the blame on him, though: the computer must do its best to help him decide, and help him detect and address potential threats.
To put it another way, we want to detect dangerous behavior from untrusted programs, give the user concise yet complete information about what the potential problem is, and then let him decide whether he’s ready to take the risk or not.
This works on the assumption that applications rarely have to do dangerous things. Otherwise, the user will just get an endless stream of annoying warnings and will end up not reading them at all, clicking yes whenever he sees one. Let’s make this assumption for the moment; I’ll show later that it is indeed valid in the context of a usual desktop computer.
How to do that in practice
Well, so we have two tasks: detecting risks, and clearly exposing them to an unskilled user. As you can probably already tell, doing this will not be easy. So if I want to convince anyone, I must at least suggest a way it could be done, before someone claims that I just played with words without proposing any real, implementable solution to the problem I’ve shown. Okay, let’s do it now.
Issue #1: What is a dangerous behavior? What is a safe behavior?
Let’s enumerate some system resources our programs may have access to, in order to show that answering this question is relatively easy in practice.
CPU: Even if we’re really paranoid, we may consider that a program we explicitly ran has the right to fully use its CPU time, as long as it shares the CPU fairly with other software. Therefore, programs running at normal process priority are safe; processes able to alter their own priority, or wishing to run at above-average priority, are dangerous; real-time (RT) processes are highly dangerous; and processes able to run other programs at RT priority are extremely dangerous.
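The grading above can be sketched as a small lookup. This is only an illustration of the classification, assuming hypothetical scheduler attributes; the category names are mine, not those of any real API:

```python
# Hypothetical risk grading for CPU-scheduling requests, following the
# rules above: normal priority is safe, altering one's own priority or
# running above average is dangerous, real-time (RT) execution is highly
# dangerous, and granting RT priority to *other* programs is extremely
# dangerous.

def cpu_risk(priority="normal", can_change_priority=False,
             realtime=False, can_grant_realtime=False):
    if can_grant_realtime:
        return "extremely dangerous"
    if realtime:
        return "highly dangerous"
    if can_change_priority or priority == "above-normal":
        return "dangerous"
    return "safe"
```

Checks are ordered from worst to mildest, so a process holding several of these abilities is graded by the most dangerous one.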
RAM: Memory allocation is just too useful for us to make it a privileged action, so we have to consider it safe. On the other hand, as I said before, it is prone to abuse too. What should we do when all allocatable memory has been allocated and a process asks for more? Old UNIX wisdom tells us that in that case, we should kill the process using the most memory. We should avoid killing vital system processes when doing so, though. And there should be protection against malicious applications spawning a lot of processes, each allocating only a small amount of memory. Therefore, being flagged as a non-killable system process is dangerous, and spawning a very large number of processes (say, more than 10, though this should be user-adjustable) is dangerous too. Normal use of memory allocation is considered safe.
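The out-of-memory policy just described can be sketched in a few lines. The process records here are plain dictionaries and the field names are my own; a real kernel would of course use its own structures:

```python
# Sketch of the out-of-memory policy above: when an allocation fails,
# kill the process using the most memory, but never one flagged as a
# vital system process. Also flag suspiciously prolific spawners.

def pick_oom_victim(processes):
    """Return the killable process using the most memory, or None."""
    candidates = [p for p in processes if not p["vital"]]
    if not candidates:
        return None
    return max(candidates, key=lambda p: p["memory"])

SPAWN_LIMIT = 10  # user-adjustable, as suggested above

def spawn_is_suspicious(process_count):
    """Many small processes can hide one big allocation."""
    return process_count > SPAWN_LIMIT
```

Note that the spawn check is what keeps the victim-selection rule honest: without it, an attacker could stay below the radar by spreading its allocations over many tiny processes.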
Private application data: Each application is entitled to some private folders where it can store data that other normal applications don’t have access to. As disk space is a finite resource, HDDs and SSDs may fill up someday as applications store more and more data there. But running out of disk space happens much more often with legit applications than running out of RAM does on modern computers, so we can’t afford to just kill apps immediately here. The usual solution, namely telling the user that the system is running out of disk space and perhaps adding some information on who uses how much of it, works fairly well, so I don’t think it’s necessary to change a thing. Private data storage is hence considered safe, although the most paranoid among us will be able to put quotas on it, or on RAM usage, if they want to.
Application folder: Now, can the application modify its own program files? I see two kinds of software which might want to do that: malware, and applications which include their own updater instead of using the system-provided one. Those who have installed Java on Windows know that the difference between the two in terms of annoyance is very slim ;) So it seems fair to me that self-modifying applications get a security warning. If they want to download binary plugins, they’ll have to put them in their private folder and have the user agree to all the security permissions the plugins may require, even if the application holds those permissions already. As for self-modifying code at run time, there are just too many interpreters doing JIT compilation in this world for me to outlaw the practice, though it does make some exploits easier.
User data: The user has a personal folder somewhere on the file system, where he can put and manage his own data safely, without any fear of it being touched by other users or by untrusted applications without his permission (a part of this folder, which he normally doesn’t have to care about, being dedicated to per-user private application data). What form does this permission take? Well, for other users, it works just like on other desktop OSs: we set up sharing settings in the file properties. For applications, we consider that the user gives an application permission to open a file when he selects it in the application’s open dialog (or on the command line in a CLI interface), or when he explicitly asks the system to open the file with said application (as in Windows’ right click → Open with…). What remains dangerous is applications which want to touch user files without getting the user’s permission each time (e.g. backup software).
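The “permission through the open dialog” idea can be sketched as follows. The class and method names are mine; the point is only that the file picker, being trusted system code, is what records the grant, outside the application’s reach:

```python
# Minimal sketch of "selecting a file in the system open dialog grants
# access to that file": the dialog runs as trusted system code, so the
# grant table is updated out of the application's reach.

class FileGrants:
    def __init__(self):
        self._granted = {}  # app name -> set of granted file paths

    def open_dialog(self, app, chosen_path):
        """Called by the *system* file picker when the user picks a file."""
        self._granted.setdefault(app, set()).add(chosen_path)
        return chosen_path

    def may_access(self, app, path):
        """An app may only touch user files it was explicitly handed."""
        return path in self._granted.get(app, set())
```

A “right click → Open with…” action would simply call the same grant path before launching the application, so both gestures converge on one mechanism.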
System data and settings: Generally speaking, applications should not mess with system settings, nor with the files of any other application; only the user and dedicated applications should do that. Now, some exceptions may later turn out to make sense. As an example, most modern 3D games can be played at a lower screen resolution, so that they run smoothly on older hardware. There’s no reason not to allow this for full-screen applications, as long as the original resolution comes back when the application closes. In the same way, games and emulators/VMs often need to lock the mouse when running in windowed mode, and this is not a bad thing as long as the user knows how to unlock it (read: as long as it is done through a system API with a standardized UI).
File associations: This is a special case. We cannot give any application the right to set itself as the default way to open a file without the user knowing, but a security warning would be too much, given how mundane and relatively safe setting up file associations is. So we choose to create a new system dialog specifically for this task.
When an application is run for the first time (and thus “installed”), the file associations it requests are checked. Those not currently associated with any other application are silently associated with it. For those currently associated with other applications, a special “file association” system dialog shows up and asks the user whether he wants the newly installed application to become the default way to manage those file types. For each file type where the answer is negative, the application is merely put at the bottom of the list of applications associated with that file type.
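The first-run logic above can be sketched directly. The data layout and the `ask_user` callback standing in for the system dialog are my own illustration:

```python
# First-run file-association logic, as described above: unclaimed types
# are taken silently; already-claimed types trigger a (simulated) system
# dialog, and a refusal puts the newcomer at the bottom of the list.

def register_associations(assoc, app, wanted_types, ask_user):
    """assoc maps a file type to an ordered list of handler apps
    (first item = default). ask_user(app, ftype) stands in for the
    "file association" system dialog and returns the user's answer."""
    for ftype in wanted_types:
        handlers = assoc.setdefault(ftype, [])
        if not handlers:
            handlers.append(app)       # no conflict: silent association
        elif ask_user(app, ftype):
            handlers.insert(0, app)    # user agreed: new default handler
        else:
            handlers.append(app)       # refused: bottom of the list
```

Keeping the refused application on the list at all matters: it stays reachable through an “Open with…” menu without ever becoming the default behind the user’s back.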
Run at boot time: Another topic we must handle with care. Looking at Windows, it’s fair to say that applications get to run at system startup far too easily. But there are legit applications which the user would actually want to run at every boot (P2P software, mail clients…). A security warning would again be too much, since running them is not literally risky, so I see two more mundane ways of doing this, without knowing which is better:
- Have a special “Run at boot time” folder. The user drags and drops an application there to create a shortcut, and removes the shortcut when he wants it to stop running at boot time. It could hardly be made simpler, and the user could hardly be made more aware of what’s happening.
- As with file associations, have a dialog show up the first time an application is run (preferably the same dialog, in fact) asking the user whether he wants the application to run at startup, through a checkbox that is cleared by default.
Graphics: Normal applications should not be given direct access to the graphics chip or the screen by default, for obvious security reasons. On the other hand, there’s no reason why applications shouldn’t be able to draw things through safe system abstractions, as long as they don’t prevent other applications from doing their own job properly. In practice, this means that an application can safely draw inside its own window, but not on the rest of the screen. To prevent system-window spoofing, system windows should have a characteristic, easy-to-remember visual aspect which normal applications cannot imitate, because they can only affect what’s inside their windows and not the window decoration itself. For the same reason, full-screen applications should not be allowed to have a translucent background by default.
Sound, other peripherals: Same idea as before: applications don’t get access to the raw hardware by default, but can use system abstractions as long as there’s no reason to prevent them from doing so; where there is, restrictions may be applied. As an example, with default security settings, printing could imply going through a print dialog first. These mandatory system dialogs have two advantages: not only do they force applications to implicitly ask the user’s permission for some things, they also enforce the system-wide usability conventions, by preventing alien applications from bringing their own widget and dialog sets with them when system dialogs already exist for the most common jobs.
Communication with other applications: A bit earlier, I said that applications should not mess with the files of other applications. A cautious reader would now say: “Hey, isn’t that a bit restrictive? Suppose I made an application suite, like Adobe CS or Microsoft Office; why couldn’t its applications interact with each other?” Without even going that far, applications spend a lot of time interacting with drivers or with the operating system. For this reason, perfect sandboxing cannot work. In practice, this is not a problem, though: as long as applications can specify to the system which other applications have the right to interact with them, and in which way, everything should be fine.
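That last rule, each application declaring who may talk to it and how, can be sketched as a tiny allowlist kept by the system. Names and the notion of a “channel” are my own illustration of “in which way”:

```python
# Sketch of the inter-application communication rule above: a target app
# declares which other apps may interact with it, and over which
# channels, and the system refuses anything not on that list.

class IpcPolicy:
    def __init__(self):
        # target app -> {caller app -> set of allowed channels}
        self._allowed = {}

    def allow(self, target, caller, channel):
        """Recorded when `target` declares that `caller` may use `channel`."""
        self._allowed.setdefault(target, {}).setdefault(caller, set()).add(channel)

    def may_talk(self, target, caller, channel):
        """Checked by the system before delivering any IPC message."""
        return channel in self._allowed.get(target, {}).get(caller, set())
```

The default is denial: an application that declares nothing simply cannot be reached by its neighbors, which is exactly the sandboxing behavior we want.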
Issue #2: How do we detect dangerous behaviors?
This is an interesting question. Analyzing binaries to detect these behaviors would be as slow and power-hungry as an antivirus, exactly the kind of thing we want to avoid. There’s also the option of asking the user at run time, when the software attempts a risky move, but that would interrupt him very bluntly, stopping him in the middle of his activity, and chances are he would go into a berserk “yes yes ok ok” rage. Not very effective either.
The solution I propose is simply to have applications themselves ask the OS for the security rights they need. As I currently envision using something like OS X’s .app bundles for application distribution, here’s how it would work with that system: a file in the bundle’s folder hierarchy would declare which risky behaviors the application wants to engage in. Somewhere deep in the OS’s protected files, there would be a file for each application, recording which security rights the user has granted it. If the application asks for more than it has been allowed, the system issues a warning to the user. By default, this warning does not show up again once the user has chosen to take the risk of running the application: no need to be annoying once the decision has been made.
And what if the application doesn’t ask for the right to do something dangerous but tries to do it anyway, you may ask? Well, at the moment it tries, it is automatically killed by default, or the operation simply fails, at the user’s option. Just like in the event of a segfault.
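The whole mechanism of this section fits in a few lines of set logic: warn about whatever the manifest requests beyond what has been granted, and punish any attempt at an undeclared dangerous operation. Function and exception names are mine, for illustration only:

```python
# Sketch of the permission mechanism above: the bundle's manifest lists
# the rights the application asks for, the OS records the rights the
# user has granted, and anything requested beyond the grant triggers a
# warning, while anything *attempted* without being requested kills the
# process (or merely fails, at the user's option).

class ProcessKilled(Exception):
    """Stands in for killing the offending process, segfault-style."""

def rights_to_warn_about(requested, granted):
    """Rights the manifest asks for that the user has not yet granted."""
    return set(requested) - set(granted)

def check_operation(right, requested, granted, kill_on_violation=True):
    """Called by the OS when a process attempts a dangerous operation."""
    if right not in requested:           # undeclared dangerous behavior
        if kill_on_violation:
            raise ProcessKilled(f"undeclared right: {right!r}")
        return False                     # user's option: just fail
    return right in granted
```

Because `rights_to_warn_about` is a plain set difference, the warning naturally disappears once the user has granted everything the manifest requests, which is the “ask once, then stay quiet” behavior described above.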
Issue #3: How do we describe the situation to the user?
That’s another interesting issue, and one which shows why a security system made for servers owned by competent sysadmins can’t just be ported to desktop computers without some work being done first. In a way, our task is similar to that of antivirus vendors designing infection warnings.
(I’ll try to include sample work in italics to show where this is leading; of course it is not nearly release-ready, it’s just to picture the general tone and look of the thing.)
The first thing to keep in mind is that such a warning is a relatively exceptional event for a normal application. It is thus important to warn unskilled users about the gravity of the situation. The beginning of the warning should be about exactly that: being frightening. Shocking colors, a darkened background, and pictures can be used, though not to the point where it becomes ridiculous ;)
“This application is requesting the following powerful rights, which can be used to compromise your computer’s security.”
Then would follow a list of the security privileges the app wants granted, with a word of warning against malware which voluntarily obfuscates that list, and a way to get an explanation of each privilege.
“Make sure it comes from a trusted source, and be wary if this list is exceedingly long or complex, as it might be a sign that you’re dealing with malicious software.
- Access files in your home folder (What?)
- Be the driver of the “USB Mass Storage” peripheral (What?)”
When the user clicks a “What?”, a small popup appears, whose goal is to go into more detail, explaining why the right is dangerous and which kinds of legit programs need it to work. As an example, “Be the driver of the xxx peripheral” could be expanded this way:
“Most applications access your computer’s hardware in an indirect and safe way. This application wants to access it directly, so there is no protection ensuring that it won’t break the hardware or render it unusable in some other way. The only kind of application which should ever need this is a hardware driver, a program which ensures communication between the hardware and other applications. Drivers should be acquired directly from your hardware’s manufacturer.”
At the end of the warning window, a little conclusion states that the user has to decide whether to let this application run or not. Of course, the button selected by default is “No”.
“Do you want to take the risk of running this application?
[ Yes ] [ No ]”
The final result could look a bit like this :
“This application is requesting the following powerful rights, which can be used to compromise your computer’s security. Make sure it comes from a trusted source, and be wary if this list is exceedingly long or complex, as it might be a sign that you’re dealing with malicious software.
- Access files in your home folder (What?)
- Directly access the “USB Mass Storage” peripheral (What?)
Do you want to take the risk of running “Virus”?
[ Yes ] [ No ]”
It’s starting to vaguely take shape, though of course additional work would be needed to shorten the sentences, so that this dialog becomes punchier and easier on the eyes…
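For concreteness, assembling that dialog text from a list of requested rights is trivial once the permission manifest exists. The function name and layout are my own sketch; the wording follows the sample above:

```python
# Building the warning dialog's text from the list of rights the
# application's manifest requests; the wording follows the sample
# dialog above, and the layout is purely illustrative.

def build_warning(app_name, rights):
    lines = [
        "This application is requesting the following powerful rights, "
        "which can be used to compromise your computer's security. "
        "Make sure it comes from a trusted source, and be wary if this "
        "list is exceedingly long or complex, as it might be a sign "
        "that you're dealing with malicious software.",
        "",
    ]
    lines += [f"- {right} (What?)" for right in rights]
    lines += [
        "",
        f'Do you want to take the risk of running "{app_name}"?',
        "[ Yes ] [ No ]",
    ]
    return "\n".join(lines)
```

Since the text is generated by the system rather than by the application, the application cannot obfuscate or soften it; only the rights list itself varies.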
But all applications would start spawning warnings, and the user wouldn’t even read them…
I have one thing to say: prove it! On my side, I’m going to list below all the applications I’ve installed myself on my computer, and see how many need special security permissions to work under the rules mentioned above, provided they are tweaked a little to run on this OS.
- µTorrent: BitTorrent client. Files are saved in a user-specified directory, and in a special “downloads” directory by default. No access to other user directories is actually required. Network access goes through the system TCP/IP stack. NO SECURITY WARNING
- Adobe AIR: Bytecode interpreter. Works on user-specified AIR files. NO WARNING
- Adobe Flash plugin: Well-known browser plugin. WARNING: Wants to modify Firefox/Opera/…’s folders (I must admit that I am quite satisfied labeling Flash a security threat…)
- Aspell: Dictionary. May give everybody access to its features. NO WARNING
- Audacity: Audio editor. Edits user-specified files, plays and records audio using standard system libraries. NO WARNING
- AVG 2011: An antivirus. The kind of thing we want to get rid of. SHOULDN’T BE NEEDED
- Bamboo: Driver for a pen tablet. WARNING: Driver for peripheral xxx
- Blender: Popular 3D modeling software. Edits user-specified files, draws graphics using the system libraries (incl. OpenGL). NO WARNING
- Bochs: Emulator. Runs user-specified images, accesses real hardware through the system library. NO WARNING
- CDBurnerXP: CD/DVD authoring tool. Random programs shouldn’t be allowed to burn CDs whenever they want, whether there is a system API for that or not. WARNING: Wants the ability to burn CDs/DVDs
- CPUkiller: Slows down the system, useful when running old games. But a WARNING: May degrade system performance is due, obviously, as it will have to bypass some system protections to do its job properly.
- DFend Reloaded: DOS emulator. It does need to run at a slow pace, but unlike CPUkiller it can do this via timers, without slowing down the rest of the system. See Bochs for the rest. NO WARNING
- Foxit Reader: PDF reader. Shows user-specified PDFs using system drawing libraries. Does not include a browser plugin. NO WARNING
- GIMP: Image editor. Creates and opens files on the user’s command, uses system drawing libraries, prints using system print dialogs. NO WARNING
- Ghostscript and friends: I don’t even know what it’s used for, but Scribus wants it. May give other software access to its functionality. NO WARNING
- Inkscape: Vector drawing software. Same as GIMP: NO WARNING
- IZArc: Archive compressor/decompressor. Operates on user-specified files and private files only: NO WARNING
- Lazarus: Programming tool. Operates in a user-specified folder and on user-specified files. NO WARNING
- LyX: Something between a programming tool and a word processor ;) NO WARNING
- Maxima: Computational program. NO WARNING
- Various system components (.NET, VC++ redistributables…): Obviously, these should be handled with extreme care, so WARNINGS are to be expected (which exactly you get depends on the nature of the package you’re dealing with).
- MiKTeX: See LyX and Aspell. NO WARNING
- Minefield (Firefox nightly): Web browser. Fetches and sends data on the Internet. As long as it does not send user data behind the user’s back, and downloads files into a private or user-specified folder, I’m okay with that. (Note: there should be a way in the security framework to keep access to that folder even when the application is closed and re-opened, though, for those who want downloads to happen automagically without a single “save” window…)
It might have plugins, but they are stored in the application’s private folders and are made of interpreted code, so they can’t go beyond the permissions of the mother application. NO WARNING
- MinGW: Command-line programming tool. In terms of security permissions, it should be the same as Lazarus. The user will have to specify the output folder by hand, compared to current versions, though. NO WARNING
- mIRC: Chat client. A priori, nothing different from a web browser here. NO WARNING
- Mozilla Thunderbird: E-mail client. Again, it shouldn’t require more than a web browser in terms of security permissions; the sole extra thing it does is store e-mails in a private folder. If the user gives a random program his e-mail account’s data, he knowingly faces the potential consequences in terms of privacy violation; it’s not our role to get in the way there. NO WARNING
- nMars: Redcode interpreter (used to play the geeky game Core War). A basic interpreter which really doesn’t require anything out of the ordinary in order to work. NO WARNING
- Various Nokia software: Well, there are some things going on there. First, there are drivers for my phone as a Bluetooth and USB peripheral. Then it wants to access all of my phone’s data for backup, firmware update, and sync purposes. It may also establish an Internet connection through my phone. I’d say the result would look like this: WARNINGS: Driver for peripheral xxx, Wants full access to storage media xxx, Wants to add an Internet connection
- Notepad++: Text editor. Nothing special, if the author agrees to give up his own update system and use the system one. NO WARNING
- OpenOffice.org: Office suite. Again, nothing special. NO WARNING
- Opera: Web browser. See Minefield. NO WARNING
- VirtualBox: Virtual machine. Same as Bochs, plus it uses hardware virtualization features. I must learn more about those before I know whether letting software use them through the system API still requires a security warning.
- PlasmaPong: A physics game. Nothing special. NO WARNING
- Project64: A console emulator. Nothing special. NO WARNING
- Scribus: Desktop publishing software. Nothing special. NO WARNING
- Sins of a Solar Empire: RTS game. Nothing special. NO WARNING
- Skype: Communication software. Should be tweaked: currently, it does not close when one clicks the close button, and this should not be allowed. Otherwise, NO WARNING
- Starcraft: RTS game. Nothing special. NO WARNING
- VLC media player: Media player. One thing which tickles my nose is that it gets full access to the user’s “My Music” and “My Videos” folders without clearly going through an open dialog for that: this should be removed. The “Library” feature again shows the need to allow an application to keep access to a file/folder even when closed and re-opened later, under some circumstances. Otherwise, NO WARNING
- Worms Armageddon: Ballistics game. Nothing special. NO WARNING
As you can see, most applications would adapt to the sandboxing rules above relatively well, provided that they receive some small tweaks and use standard system APIs instead of hand-crafted ones. The remaining sources of warnings are drivers, system components, and apps which want fairly intrusive access to the hardware, the operating system, or other applications’ files. And that’s the point: one should be very careful about counterfeit versions of those, as they can badly affect the system’s stability.
The goal of keeping security warnings exceptional, through implicit permissions granted by the user, is thus achieved. So unlike with things like UAC, users shouldn’t see these prompts every week, and would pay more attention to them when they do see one. Moreover, the dialogs are more informative, allowing users to check that the application requires only the permissions needed to do what it claims to do, and stays trapped in its sandbox otherwise.
And there ends this rather lengthy article. I hope you found it interesting ;) Again, please don’t hesitate to mention any problem you see with this approach right now: it’s very difficult to fix a broken security model once millions of copies of an operating system are running, as Microsoft can attest. And though I don’t expect to ever see millions of copies of this OS running around the world, the same problem may occur on a smaller scale ^^