Hi,
axiagame wrote:I also do not want to start a project I will never complete, so I am wondering this: can I assume that a personalized OS can give higher security against different forms of hacking, starting with the fact that it can not be tested for leaks that would help hack it?
Most current OSs are monolithic. This means that a lot of code (with a lot of bugs) is running at the highest privilege level, which creates a significant security risk (people exploiting those bugs). In addition, it means that the kernel has to have a way to start/load code that will run at the highest privilege level, which creates another significant security risk (e.g. people writing "trojan drivers"). For a micro-kernel, all the device drivers, etc. typically run at the lowest privilege level. This mitigates a lot of the risk - if a bug in a driver is exploited or there's a "trojan driver", then the attacker won't be able to access anywhere near as much. In addition, you can more effectively limit what a driver can access (for example, you could say "keyboard drivers don't have permission to use networking" and make it much harder for someone to write a key-logger).
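To illustrate, here's a minimal sketch (in C) of how a micro-kernel might enforce per-driver permissions. The names here (driver_t, PERM_*, driver_may) are invented for illustration only and don't correspond to any real kernel's API:

Code:
#include <stdint.h>
#include <stdbool.h>

#define PERM_KEYBOARD_IO  (1u << 0)
#define PERM_NETWORKING   (1u << 1)
#define PERM_DISK_IO      (1u << 2)

typedef struct {
    int pid;              /* the driver is just an ordinary user-space process */
    uint32_t permissions; /* bitmask granted when the driver was installed */
} driver_t;

/* The kernel checks this before honouring any request from a driver */
static bool driver_may(const driver_t *drv, uint32_t needed)
{
    return (drv->permissions & needed) == needed;
}

int main(void)
{
    driver_t keyboard = { .pid = 42, .permissions = PERM_KEYBOARD_IO };

    /* The keyboard driver asks to open a socket: denied, because it was
       never granted PERM_NETWORKING - a key-logger hiding in this driver
       has no way to send keystrokes anywhere. */
    return driver_may(&keyboard, PERM_NETWORKING) ? 1 : 0;
}

Because the kernel sits between the driver and everything else, even a fully compromised keyboard driver simply has no path to the network stack.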
The next step is libraries - most OSs are infested with them. A security hole in one library means that any software that uses that library has a security hole. It's possible to use "services" instead; where a service runs as a process (in its own separate virtual address space) and software communicates with the service via IPC. For an example, imagine you've got code to compress/decompress data with an accidental security hole, and several applications (web browser, word processor, etc.) that use the compression/decompression code. If the compression/decompression code is implemented as a library then it can access any/all data (and any/all other resources - files, sockets, etc.) that any of the applications have access to (e.g. possibly including the web browser's built-in "password manager" data). If the compression/decompression code is implemented as a service then it might only be able to access itself and any data that applications have explicitly sent to it (e.g. not including the web browser's built-in "password manager" data).
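As a sketch of the difference, here's roughly what the "service" approach might look like from the application's side. Note that send_message(), receive_message() and COMPRESS_SERVICE are hypothetical placeholders for whatever IPC primitives the kernel actually provides:

Code:
#include <stddef.h>
#include <stdint.h>
#include <string.h>

#define COMPRESS_SERVICE 7   /* hypothetical well-known service ID */

typedef struct {
    size_t  length;
    uint8_t payload[4096];
} message_t;

/* Placeholder kernel IPC primitives - in a real micro-kernel these
   would be system calls. */
int send_message(int service_id, const message_t *msg);
int receive_message(int service_id, message_t *reply);

/* Only the bytes explicitly copied into the message ever reach the
   service; it runs in its own address space and cannot see the
   application's files, sockets or saved passwords. */
size_t compress_via_service(const uint8_t *in, size_t in_len,
                            uint8_t *out, size_t out_max)
{
    message_t request, reply;

    if (in_len > sizeof(request.payload))
        return 0;
    request.length = in_len;
    memcpy(request.payload, in, in_len);

    if (send_message(COMPRESS_SERVICE, &request) != 0)
        return 0;
    if (receive_message(COMPRESS_SERVICE, &reply) != 0)
        return 0;
    if (reply.length > out_max)
        return 0;
    memcpy(out, reply.payload, reply.length);
    return reply.length;
}

With a library, the equivalent call is just a jump into code sharing the application's address space - there's no boundary for the kernel to enforce.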
Then there's programming languages. Most OSs allow applications and drivers to be written in languages like assembly, C and C++. Programmers need more skill to avoid bugs in these languages, and these languages also make it easier for bugs to go unnoticed. Making people use different languages and/or some sort of safe/managed code will improve security.
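As a concrete example of how easily bugs go unnoticed in C, the following typically compiles without complaint at default compiler settings, yet contains a textbook buffer overflow:

Code:
#include <stdio.h>
#include <string.h>

/* Classic C mistake: no bounds check. Any name longer than 15 bytes
   overwrites adjacent stack memory, and nothing forces the programmer
   to notice. */
static void greet(const char *name)
{
    char buffer[16];
    strcpy(buffer, name);   /* overflows when name needs >= 16 bytes */
    printf("Hello, %s\n", buffer);
}

int main(void)
{
    /* A safe/managed language would reject this at compile time or
       trap it at run time; in C it's undefined behaviour that may
       silently corrupt memory - exactly what attackers exploit. */
    greet("a string that is far longer than sixteen bytes");
    return 0;
}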
There's also the delivery method. For example, almost all open source software is much more vulnerable to a type of "man in the middle" attack; where someone downloads the source code, inserts their own malicious code, then provides malicious binaries to the general public. For a simple example, imagine if you went to your local computer shop and told them you'd like to supply lots of "Ubuntu Install CDs" for free (to promote open source and/or Linux and/or Ubuntu). Everyone that uses your CD is vulnerable to whatever changes you felt like making, and if these people bother to look at the source code (99.999% of people won't) then they'd be looking at the original (unmodified) source code and aren't going to see any of your malicious changes. If you attempt to do something like this with Windows or OS X, everyone is going to know that something is dodgy (e.g. piracy) before they even see the CD. Basically in both cases you have to trust the supplier of the software, but for open source you also need to trust any people in between the original supplier and the end user.
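The usual mitigation is to verify what you received against something published by the original supplier. As a rough sketch (assuming OpenSSL is available), this computes a file's SHA-256 digest and compares it to a known-good value. Note that this only helps if the digest itself came from a trusted channel; verifying a digital signature against the supplier's public key is the stronger version of the same idea:

Code:
#include <stdio.h>
#include <string.h>
#include <openssl/evp.h>

/* Compute the SHA-256 digest of a file using OpenSSL's EVP API. */
static int sha256_file(const char *path, unsigned char digest[32])
{
    FILE *f = fopen(path, "rb");
    if (!f)
        return -1;

    EVP_MD_CTX *ctx = EVP_MD_CTX_new();
    EVP_DigestInit_ex(ctx, EVP_sha256(), NULL);

    unsigned char buf[4096];
    size_t n;
    while ((n = fread(buf, 1, sizeof(buf), f)) > 0)
        EVP_DigestUpdate(ctx, buf, n);

    unsigned int len = 0;
    EVP_DigestFinal_ex(ctx, digest, &len);
    EVP_MD_CTX_free(ctx);
    fclose(f);
    return (len == 32) ? 0 : -1;
}

/* Usage: verify <file> <expected-sha256-in-hex> */
int main(int argc, char **argv)
{
    unsigned char digest[32];
    char hex[65];

    if (argc != 3 || sha256_file(argv[1], digest) != 0) {
        fprintf(stderr, "usage: verify <file> <expected-sha256-hex>\n");
        return 2;
    }
    for (int i = 0; i < 32; i++)
        sprintf(hex + 2 * i, "%02x", digest[i]);

    if (strcmp(hex, argv[2]) != 0) {
        fprintf(stderr, "DIGEST MISMATCH - do not trust this file\n");
        return 1;
    }
    printf("digest matches\n");
    return 0;
}

Of course, the person handing out CDs at the computer shop is counting on nobody bothering to do this.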
Finally; most existing OSs were designed for "large boxes" (desktop, workstation and server systems) where getting physical access is harder; and because of this they aren't designed to prevent access by people with physical access (for example, someone booting a "live" CD and mounting existing file systems to by-pass the installed OS's file system permission checks). The modern trend is towards mobile systems, where it's much easier to (e.g.) steal someone's laptop or smartphone; and the old "getting physical access is harder" mentality is no longer reasonable. This means that you can/should consider doing things like encrypting file systems; so that physical access alone is not enough to gain access to sensitive data.
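As a sketch of the underlying mechanism (again assuming OpenSSL for the crypto), here's how one disk sector might be encrypted with AES-256-XTS, the cipher mode most full-disk-encryption schemes use. Key management (deriving the key from a passphrase, never storing it on the disk it protects) is the hard part and is omitted here:

Code:
#include <stdint.h>
#include <string.h>
#include <openssl/evp.h>

/* Encrypt one 512-byte disk sector with AES-256-XTS. The sector number
   becomes the XTS "tweak", so identical plaintext stored in two
   different sectors still produces different ciphertext. */
static int encrypt_sector(const unsigned char key[64], /* XTS = 2 x 256-bit keys */
                          uint64_t sector_number,
                          const unsigned char plain[512],
                          unsigned char cipher[512])
{
    unsigned char tweak[16] = {0};
    int len = 0;
    int ok;

    memcpy(tweak, &sector_number, sizeof(sector_number));

    EVP_CIPHER_CTX *ctx = EVP_CIPHER_CTX_new();
    ok = EVP_EncryptInit_ex(ctx, EVP_aes_256_xts(), NULL, key, tweak) == 1
      && EVP_EncryptUpdate(ctx, cipher, &len, plain, 512) == 1
      && EVP_EncryptFinal_ex(ctx, cipher + len, &len) == 1;
    EVP_CIPHER_CTX_free(ctx);
    return ok ? 0 : -1;
}

With something like this in the disk driver (or file system layer), someone who boots a "live" CD and mounts your partition gets ciphertext, not your data.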
For the reasons above, it is possible for an OS to be much more secure than existing mainstream OSs.
For your own personal OS (that you and nobody else uses), the problem is that a full OS takes a massive amount of work. Such an OS would only be "secure" because there are no applications and no device drivers for it (and therefore very little worth attacking), not because nobody else has seen the software.
Cheers,
Brendan