
Re: Dawn

Posted: Mon Mar 27, 2017 7:54 am
by Geri
MollenOS wrote:
x86, where handling usb is described by a 10000 page long book, 2 billion transistors of imperialism.
Aside from the fact that your entire post made close to zero sense, you do know USB has very little to do with x86, right? The usb controller and the usb protocol work entirely the same on other architectures.
that's correct, but the whole protocol inherits x86's point of view, created by the same corporations who ruined the industry (intel, hp, microsoft, nec...)

Re: Dawn

Posted: Mon Mar 27, 2017 8:27 am
by Brendan
Hi,
Geri wrote:
MollenOS wrote:
x86, where handling usb is described by a 10000 page long book, 2 billion transistors of imperialism.
Aside from the fact that your entire post made close to zero sense, you do know USB has very little to do with x86, right? The usb controller and the usb protocol work entirely the same on other architectures.
that's correct, but the whole protocol inherits x86's point of view, created by the same corporations who ruined the industry (intel, hp, microsoft, nec...)
Yes; after 30+ years the 80x86 PC has some "scar tissue" where things that never existed in the 1980s got added. This happens with any hardware that is successful enough to exist for many decades (it's also why BIOS is deprecated and everything is shifting to UEFI).

The only reason your silly fantasy has no "scar tissue" is that it has not been successful for 30+ years.


Cheers,

Brendan

Re: Dawn

Posted: Mon Mar 27, 2017 9:03 am
by Geri
brendan, if it's up to me, the IO specifications i made will never be changed at all, ever.

maybe a lot of hardware and software implementations will be buggy, maybe in a lot of places its throat will be bleeding, maybe a lot of hardware will not work in X or Y situations, but that does not depend on the specifications. it will not have 4*6 different methods just to access a byte on the disk/disc. (OR if it does, the original implementation will stay compatible basically forever (as long as we don't run out of 64 bit address space), and supporting just that will be enough to have everything up and running.)

Re: Dawn

Posted: Mon Mar 27, 2017 10:11 am
by Schol-R-LEA
Geri wrote:to take back the control from the hardware-imperialists.
This statement makes no damn sense at all. As I've already mentioned, there's no 'taking back' what people never had to begin with; all small computers, from the early MSI TTL designs such as the PDP-8, through the Altair, the Sol, the PET, and the Apple I and II, up to the present day, have been built from, and dependent on, hardware produced in massive IC foundries which could produce thousands of chips of various sorts every day. Even the TTL and PLA implementations of SUBLEQ would be impossible without the companies you are denouncing.

I am aware that your namesake (I assume you aren't the same Jeri, given the different spelling and your disdain of hardware) has demonstrated that she can build SSI integrated circuits by hand in a home lab - just barely - but it is a long step from putting a half-dozen NMOS transistors onto a chip, to being able to use that same technique to build even something as simple as a 4-bit half-adder (which IIUC takes a minimum of 9 NMOS transistors per bit on the half-adder - I gather that you only need 7 in a CMOS IC, but her approach doesn't seem to allow for both N and P transistors on one chip). I am also aware that techniques for additive-process fabrication ("3D printing") of transistors are being developed, and great strides in the process have been made in the ten years since the first experiments with it began, but it will be quite a while before those are practical, as well.
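
For reference, the half-adder being used as a yardstick here is the simplest possible adder stage: two one-bit inputs producing a sum bit and a carry bit, i.e. one XOR gate and one AND gate. A tiny C model of that logic (illustrative only - the transistor counts under discussion depend entirely on the logic family it is built in):

Code:
#include <stdio.h>

/* Half-adder logic: sum = a XOR b, carry = a AND b.
   This models the gates only; whether that costs 7, 9, or more
   transistors depends on the process (NMOS, CMOS, etc.). */
void half_adder(unsigned a, unsigned b, unsigned *sum, unsigned *carry)
{
    *sum   = a ^ b;  /* XOR gate */
    *carry = a & b;  /* AND gate */
}

int main(void)
{
    unsigned s, c;
    half_adder(1, 1, &s, &c);
    printf("1 + 1 -> sum=%u carry=%u\n", s, c);  /* sum=0 carry=1 */
    return 0;
}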

As for 'imperialism'... I am at a loss as to what that's supposed to mean in this context. While the Intel/AMD x86 duopoly in desktop systems has been pretty unshakable up to now (there's some hope for fluidity now that there is a version of SM Blinds running (in emulation) on ARM processors, but since they seem to be planning on licensing it only for wholly locked-down tablets, that may not be much of a much), that has more to do with the historical development of home computers than with either of the companies, or the x86 design itself.

Hell, Intel themselves have been trying to get rid of x86 pretty much since they came out with it (as I have already said), but at the same time they don't want to risk their dominant market position (they are a for-profit corporation, after all), so they end up making half-steps such as Itanium and then pulling back in panic the moment there is any sign of it faltering.

If that is 'imperialism', well, they are downright timid compared to the likes of the oil companies, or Oracle, or IBM back in their heyday. As imperialists go, Intel is far from the worst.

And who would replace them? Imagination haven't exactly lit the world on fire even though they hold the IP for one of the most widely used CPU designs on the planet; they are doing more to hold back the MIPS design through their weak handling of it than any of their competitors are. Loongson is a more likely alternative, but frankly, I doubt that a MIPS implementation manufactured by the Chinese government is going to go over well in the Western market, and they haven't exactly been aggressive in marketing it. AMD is unlikely to rock the boat with a new design; they will stick to x86 and ARM, or maybe go with RISC-V if they are feeling daring, but they aren't in a position to actually do much that is new, or even make a SUBLEQ chip of their own, unless they think that a market exists for it (hint: it doesn't). TI, Nat Semi, NEC, Transmeta, Hynix, ON Semi (oops, forgot that they bought out Fairchild), and just about every other IC manufacturer have dropped out of the CPU game for the most part (several of them make microcontrollers and FPGAs and the like, but none are working in the consumer PC/phone/tablet space to any notable degree), and are unlikely to try to jump back in.

I hate to say it, but unless and until printable ICs which can be downloaded and made by casual consumers become a reality, Intel is the least unpalatable of a lot of bad options, and they are likely to remain on top even if they weren't.

Besides, to the average electronics user, none of this matters. The number of people in the world who even know what an ISA is is probably smaller than the population of Singapore (5.6 million, by the way, according to Wicked-Pedo). The average consumer doesn't even understand what an operating system is, and as long as they can run a browser, a word processor, and Facebook on the one they have, they don't really care - nor should they, any more than they should care how the heating element of their toaster or the fuel injection system of their car works. Life is too short for anyone to know how all the things used by modern society work; what matters is whether they work well enough not to anger too many of the people stuck with them.

Seriously, the whole Ivan Illich/Lee Felsenstein trip is a fine ideal, and one I wish were more realistic, but most people don't really want or need conviviality in their tools. They just want to use them.

Re: Dawn

Posted: Mon Mar 27, 2017 12:00 pm
by Korona
Wait, we are not really comparing subleq hardware implementations to real CPUs, are we? Because of their low instruction density (and thus bad caching behavior), CPU manufacturers are struggling to produce fast RISC CPUs; even the CISC-y ARM, with lots of vector extensions, cannot compete with x86 in high-performance markets. There is no way something like subleq is ever going to work at an acceptable speed.
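
To make the density point concrete: a subleq machine has no add instruction, so even a plain "b = b + a" has to be synthesized from three subleq operations (nine memory words), where a conventional ISA spends one instruction. A small C model of the idea (a sketch only; the addresses, the helper function, and the fall-through branch handling are made up for illustration):

Code:
#include <stdio.h>

long mem[16];

/* The single subleq primitive: mem[b] -= mem[a]; a real machine
   would also branch to a third operand when the result is <= 0. */
void subleq(int a, int b) { mem[b] -= mem[a]; }

enum { A = 0, B = 1, T = 2 };  /* addresses of a, b, and a scratch word */

int main(void)
{
    mem[A] = 5; mem[B] = 7; mem[T] = 0;

    /* "B += A" costs three subleq instructions: */
    subleq(A, T);  /* T = -A                 */
    subleq(T, B);  /* B = B - (-A) = B + A   */
    subleq(T, T);  /* clear the scratch word */

    printf("B = %ld\n", mem[B]);  /* prints B = 12 */
    return 0;
}

The trick is that subtracting a negated value gives addition; the third instruction just zeroes the scratch word so the sequence can be reused, and on a real subleq machine every one of those steps also carries a branch-target word that an x86 or ARM add simply does not need.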

Re: Dawn

Posted: Mon Mar 27, 2017 12:49 pm
by Geri
Schol-R-LEA:

yeah, that jeri is a different person (the name is just a coincidence).

don't misunderstand me, i am NOT against corporate chip production, i am a capitalist.

however, by capitalism, i mean capitalism, and not monopolistic-imperialistic oppression: i am strongly against these ISAs, chipsets, hardware and software technologies, and that's why i made Dawn OS.

now i have a bit of time, so i will tell why i decided to create dawn and why i did it with subleq.

at first i generally preferred x86/pc over commodore, because it was a more open architecture where tons of manufacturers made cpus, hardware, and main boards, while most architectures at that time were closed. the z80 was also a good competitor (it's actually an 8080 clone with an extra register bank) but it failed to enter the high-level system era and stayed stuck at 8 bits. and x86 had a large and good community and support. i am not that old; these things only arrived in my country in the 90s, because before that the country was a communist dictatorship and had no general computerization.

in the 2000s, when x86 manufacturers (cpu, video card, etc) started to disappear and only 1-2 were left, i realized that something was seriously wrong with x86. at that time we still mainly used DOS, and i didn't really understand, for example, how x86, or even 3d or opengl, worked. before that time i had put together some primitive 3d renderers in dos qbasic, and i used a wide variety of basic dialects to make computer games. i realized that all of them were crap, and real programming cannot be done in them.

and at that time, knowledge was still hidden from the masses in the country. i had to wait 4 years just to talk to a person who had ever SEEN c code, and i had to sacrifice years of my life just to get access to a course where i could learn c and computer architectures deeply. (at that time the internet was only just being introduced to the masses and it was not possible to learn the language by searching. and hardware and software magazines contained nothing technical; they looked like paid articles and advertisements for x86-related stuff.)

i was wrong, however; instead of teaching real knowledge, they taught fakery everywhere: recite lists of x86 cpus in pentameter, put a hdd into the computer, how to install windows... (which is funny because, despite this, a person who comes out of university here with a degree in computing still does not know what a transistor is, what a 486 cpu or x86 is, or what the BIOS is).

then at some point c# suddenly appeared, and suddenly, from nothing, everybody started to teach it and learn it, and every corporation suddenly adopted it. after a long investigation i realized that microsoft had bribed public servants and basically all the school leaderships to chain the country to this shitty jit programming language put together from unreproducible libraries, and microsoft also paid activists to promote the language to people on forums, and they made up fake benchmarks and stats about the language (like it's 2x faster than c, etc). the myths are slowly dying, because bribed corporate people will cover each other's @$$ if somebody questions how and why it is called programming when they click windows together and call a library 600x slower than a few lines of c code, or why we need 80000-clock-cycle-long SQL messages to store some piece of data, or why moving the camera with wasd in a unity tutorial gets called "his game", etc etc.

i slowly realized that this is what the whole mainstream computer industry is about nowadays: making extremely complex architectures that only 1 or 2 corporations can manufacture, annihilating the knowledge, and chaining people to technologies that millions of people can fake expertise in while grunting over them like pigs.

actually, we now basically lag behind the 60s in computerization. there is no control of or knowledge about computer technologies; imperialists have hijacked the whole industry. the whole thing depends on 1-2 factories, and if they stop for some reason, 50 years of data, knowledge, and software will suddenly disappear, now including even the telephone, TV, and film industries too. there will be no corporation able to recreate anything related to x86, windows, linux, avi players; not even the original creators.

i wondered about this a lot, but since there have been no technological advancements in the industry in the last 15 years, only backward steps, i felt i had to create something that could bring the computer industry back to the level of the people; bring the control and the knowledge back to individuals.

of course i had/have a lot of software that depends on windows and linux, because i have to live off something.

BUT i felt that i must make a 180 degree turn and start going backwards, because i will not march into the abyss with the flock of sheep. this step had to be a constructive step, since you can't protest against computers, so i had to make some product or solution.

there was no day when i woke up and said: i will make a new system because computer technology is corrupt and oppressive and designed to earn 1 million usd even where it only creates $1 of value.

it was just a slow process. at first i didn't know how complex x86 is and how it is designed on purpose to never be recreated in any form again, and i didn't think that arm was so shitty and just so bad either.

at first i thought i would make a virtualized operating system / like android / that hides the original cancer of the platforms, and that i could release on any platform. however, linux people were very angry about it, because they mainly think the purpose of linux is that they can type unix commands into a console, even if it produces no value.

my first thought was not urisc; vliw was my first goal, because it worked so nicely in theory. i talked with a lot of people who were fantasizing about simd, vliw, quantum computers, and i put together an architecture with 32 64-bit registers and a fixed-length set of 255 instructions. it worked so great in theory! well yeah, in practice it's useless, because no matter how well your instruction set is designed, you can only rarely join most of the operations together. (the same problem that killed itanium and is killing elbrus too.) a lot of people spend a lot of time meditating on their vliw/quantum conceptions, but it's useless, so i just threw the concept in the garbage. some of them even took it personally and refuse to talk to me after i decided not to make a quantum magic processor architecture. but that's their problem.

then i was like: okay, at this point the whole industry is basically the enemy of freedom.

the mainstream os, mainstream computer platforms like x86 or arm, the mainstream shitty concepts, and the pigs grunting and clicking in the corporations without creating any value.

WHAT TO DO?

i remembered that somebody had mentioned to me an architecture that can only subtract and branch, yet is still a whole computer. i found the urisc concept and did a lot of investigation. there are variants of subleq (there is even addleq and friends), bitbitjump, and other bit- and byte-moving architectures.

i chose subleq because it is the most humanly understandable urisc architecture, one that can be explained to anyone instantly, and it is still simple enough to be implemented in computers. maybe it's not the best choice from a technical standpoint, and maybe other solutions could give 30-40% more performance, fewer transistors, whatever; i decided to choose this.
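
(for anyone who has not met subleq before: the entire machine is one instruction of three words A, B, C, meaning "mem[B] -= mem[A]; if the result is <= 0, jump to C". a minimal interpreter sketch in C - illustrative only, since Dawn's actual word size, memory layout, and I/O conventions are not given here, and the halt-on-negative-jump convention is an assumption:)

Code:
#include <stdio.h>

#define MEM_WORDS 4096
long mem[MEM_WORDS];

/* Run subleq code starting at address pc.  Each instruction is three
   words A, B, C: mem[B] -= mem[A]; if the result is <= 0, jump to C.
   Jumping to a negative address halts (a common, assumed convention). */
void run(long pc)
{
    while (pc >= 0 && pc + 2 < MEM_WORDS) {
        long a = mem[pc], b = mem[pc + 1], c = mem[pc + 2];
        mem[b] -= mem[a];
        pc = (mem[b] <= 0) ? c : pc + 3;
    }
}

int main(void)
{
    /* One hand-assembled instruction: subleq X, Z, -1
       With X = 7 and Z = 0 this leaves Z = -7 and then halts. */
    mem[0] = 9; mem[1] = 10; mem[2] = -1;
    mem[9] = 7;   /* X */
    mem[10] = 0;  /* Z */
    run(0);
    printf("Z = %ld\n", mem[10]);  /* prints Z = -7 */
    return 0;
}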

then i realized that i must write a compiler, because the only already-existing compiler basically operates as a virtual machine for illustration purposes: no relocation, no data formats, and no floating point of any kind.

so i did the compiler first. it was a very hard and long task, because i had no knowledge of how to write a compiler. later, i started to develop the OS.

and now it is done.

and for the first time in my life, i feel like i achieved something; something that nobody has achieved for a very long time. i feel like i have actually discovered and conquered computers. the real computers.

Re: Dawn

Posted: Mon Mar 27, 2017 10:30 pm
by Schol-R-LEA
I have to be honest, your assessment - to the extent that I was able to follow it - is not so far from my own. We differ in two major respects: you think the situation was different in the past, and that it is curable today.

I suppose one could argue that it was different, in some ways, in the very earliest days of home computing, as the scene was a mix of hyper-enthused engineers and passionate social reformers. That part is true. It is also true that most people - even in places like Silicon Valley and Tech Highway - didn't hear about this movement until after it was gone, and that the ones who started the whole thing were quickly shoved to the side by people like Gates and Jobs, or absorbed into their already growing empires, such that a company like Cromemco or Processor Tech, which flourished in 1975, couldn't have gotten started in 1979. And it is true that by the time these companies became media darlings, the technology was taking a back seat to the marketing, even in technology-driven companies.

No surprise there, really, as the exact same pattern had played out in the 1950s, when mainframes first appeared, and again in the 1960s with minicomputers. If microcomputers had gone differently, that would have been a shocking surprise.

But that's really a side point, because where you see conspiracy and greed, I see Hanlon's Razor ("Never attribute to malice what can be adequately explained by stupidity") at work. Things aren't terrible because people planned for it to be terrible; things are terrible because the people doing things had no idea what they were up to, and made a mess out of it all. Intel, Microsoft, etc. are just as trapped in that mess as everyone else is.

And in case you haven't noticed, that ought to be a lot scarier than malice would be. If the Emperor has no clothes, then what are the rest of us wearing?

Remember that most of the systems of the time were only powerful in comparison to not having a computer at all, or having to use a timesharing system through a teletype - a rather low bar to hurdle. I don't know if timesharing services were ever available to the general public in Hungary; I doubt it, because they really weren't even in the US at the time (while they were expected to become a sort of public utility, they were too expensive to access and too unreliable to be useful for anyone who wasn't using them professionally, and by the time that started to change home and office PCs were making them unnecessary). Trust me, you didn't miss much.

The programmers then didn't write better code because they were better programmers; they wrote better code because the systems were so limited that they could only do really simple things. Oh, and because most schools only taught programming in post-graduate engineering courses, though technical schools did pop up by the early 1970s to crank out semi-competent coders to write COBOL and RPG III packages for IBM mainframe shops. Still, the programs which most people saw - if they saw any at all - were written by the very few people who knew how to, and all of them were already well trained and as knowledgeable as possible before they began; no one who wasn't would have even tried, because there wasn't much room for anything else.

Let me put it to you this way: even in 2007, most people in the US - which is pretty much the definitive early adopter of computer technology, though it was never as pervasive here as it was in, say, Japan - had never actually owned their own computer, and maybe a quarter had never used one at all. And you know what? The percentage who own a desktop or laptop PC has dropped since then. Why? Because all the things they want to do with them, they can do better and easier on a smartphone. They don't know or care about the technology, and if they needed to know anything in order to use it, they simply wouldn't bother. You could say that this is a sign of being hypnotized sheep, but that misses the point: they only adopted these things in the first place because they didn't have to stop being hypnotized sheep in order to do so.

To anyone who isn't a techno-crank like us here on this board, the last thing they want is to have to learn anything new, to have to deal with change, or to face the fact that they have no idea what is going on around them or how the tools they use work. And they will resist anything or anyone who tells them they should. The sane path is the path of least resistance; only the mad, such as ourselves, think otherwise. The world needs our madness in order to adapt to unpredictable changes, but most of the time, the world will ignore us until it can no longer afford to do so - and rightly, as most of the time, our new ideas turn out to be wrong.

Re: Dawn

Posted: Tue Mar 28, 2017 1:47 am
by SpyderTL
I, too, agree with most of what Geri has said, although I would have worded it differently. :)

I've said as much before on this forum, that I wanted to "start over" at the OS level, using modern tools to re-solve problems that were encountered 50 years ago, just to see how things may have unfolded differently.

I see the Subleq approach as the same idea, only at the hardware level -- a simple "what-if" scenario where instead of trying to pack more and more functionality into a mechanical device, you just find the simplest, most elegant universal solution possible, and build everything else off of that.

As an analogy, this reminds me of the metric system vs. the imperial system. I'm guessing that over half of the people on this site use the metric system and take it for granted, but the idea of basing all measurements on factors of ten, and defining all measurement systems in terms of properties of a universally constant and readily available reference material, such as water, was an important achievement for this planet in terms of improving communication and efficiency.

Another breakthrough in extreme simplification was the iPhone. Compared to the approach that other companies were taking, which was to try to somehow bring the computer desktop paradigm to the phone, Apple decided to go a different route, and designed a device with a touch screen and almost nothing else.

The advantage to this simplified approach is that now you have a human interface that is so simple that, very literally, a 9-month old can effectively use your product. I know this for a fact, because I saw it happen with my own eyes.

Now, imagine the same paradigm shift happening with computer hardware, where device drivers and task schedulers can be written by 5 year olds, in class, in a single day, because the tools that they use are so simple to understand.

Or, if you prefer, when I was a kid, I could build virtually anything using a box full of Legos, of all different shapes and sizes. And there were thousands of different Lego blocks to choose from. But kids today build entire worlds out of blocks that are all exactly the same shape and size in Minecraft.

I guess, as an engineer, it's always easier to add complexity than it is to imagine simplicity. That reminds me of an infomercial that I saw for the first time last night for a product that would keep your folded clothes organized in your drawers or in your suitcase. It was essentially a thin plastic sheet that you put between your clothes that would allow you to easily pull out one item without messing up all of the others. Of course my first thought was "Duh, why didn't I think of that." That sort of "simple" solution takes imagination more than engineering.

Re: Dawn

Posted: Tue Mar 28, 2017 6:52 am
by mikegonta
Geri wrote:i started to create a bootable emulator to have a bootable dawn running on regular pcs.
but i will possibly not develop the bootable emulator.
Before you give up on PCs entirely, you might consider releasing the DAWN C source code so that those interested could give it a proper source code evaluation and/or a native x86 source code implementation.

Re: Dawn

Posted: Tue Mar 28, 2017 8:09 am
by Schol-R-LEA
SpyderTL wrote:I guess, as an engineer, it's always easier to add complexity than it is to imagine simplicity.
I quite agree. There is a definite tendency to add things, usually one small piece at a time, like decorations on a Christmas tree. It always seems reasonable at the time - after all, it is just a small addition, it adds value and doesn't negatively affect the cost or usability too much, so why not? Unfortunately, this can only work for so long before the 'ornaments' outweigh the 'tree', leaving the whole unbalanced and ugly even if each addition would, on its own, be an improvement.

So eventually, you don't have a lot of choice but to start over, or at least remove things rather than add more.

This is hardly unique to programming, of course. It shows up even in things like theology (simplicity was a major factor in the adoption of Buddhism, Christianity, and Islam in their early development - when they first appeared, they were a lot less heavy on ritual and dogma than the religions they supplanted - and efforts to remove accretions were part of the Protestant Reformation, as well as of the origins of movements as diverse as Zen Buddhism, Sufism, Unitarianism, and Quakerism).

There are several aphorisms in engineering, design, and theoretical science warning about the risks of over-complicating things; indeed, one of them, Occam's Razor (usually rendered as "the simplest theory that explains the whole phenomenon is usually best", though it originated as a principle for philosophical debate and is really closer to "do not add anything new without having a reason to do so"), is one of the primary means of judging the usefulness of a model in the scientific method. One of the most famous is a quote from Antoine de Saint-Exupéry (back when he was known as an aircraft designer rather than a children's author), often attributed to Einstein, who was fond of quoting it: "perfection is finally attained not when there is no longer anything to add, but when there is no longer anything to take away". Einstein had a similar one himself: "the supreme goal of all theory is to make the irreducible basic elements as simple and as few as possible without having to surrender the adequate representation of a single datum of experience" - often paraphrased as "everything should be made as simple as possible, but no simpler."

However, finding a path between Scylla and Charybdis - between 'too simple' and 'too complicated' - is hard. Most of the really, really good simplifications - Newton's mechanics, the calculus (not getting into the arguments over priority for that, no sir), Einstein's general relativity, the Turing Machine, Darwin's theory of natural selection, McCarthy's universal evaluation function, plate tectonics - tend to be surprising, even offensive, to the people of their day, but are so powerful and unifying that within a few generations people feel they are entirely obvious and forget how complicated things were before them. They almost always tie together things which no one before - even the person who created them - had ever suspected of having anything to do with one another. They are what Stephen Jay Gould and Niles Eldredge called 'saltations' - drastic jumps away from the existing course things had been on. You can't really predict them, anticipate them, or... count on one arising, sadly.

These sorts of blindingly brilliant, powerful, and revolutionary ideas tend to be rare, and it is easy - oh so easy! - to think you have found one when you haven't. I've thought I'd found one on several occasions, only to realize that it either was something already thought of before, or had a fatal flaw I didn't notice at first.

(Footnote: Before anyone complains, I won't try to defend Gould. While he is one of my favorite authors, most accounts indicate that in person he was as much of a condescending prick as Richard Dawkins, he was way too defensive about his own work and tended to over-apply it, and the way he tried - and mostly managed - to screw Eldredge out of the credit for punctuated equilibrium is simply shameful. Feet of clay, etc.)

Re: Dawn

Posted: Tue Mar 28, 2017 8:32 am
by Schol-R-LEA
Note that I am not agreeing or disagreeing with Geri - I am deliberately playing the skeptic role, because it is important to poke and prod at new ideas to make sure they aren't going to tip over like a Jenga tower. I am trying to help, not by being a cheerleader, but by asking the hard questions and seeing what answers you can come up with. It's a necessary part of the process, though one all too often ignored, because it is both difficult and painful.

So, perhaps you should think of me as a conceptual pentester, trying to find exploits in the ideas so you can then fix any leaks.

Trust me, nothing would please me more than for a new and powerful idea to prove sound and useful. But just thinking it is sound isn't enough; you need to test it, to argue against it, to see what critics might say about it, to find any flaws in it before proceeding.

I'll admit I've gotten personal at points. I'll admit I haven't played an honest hand at times. I'll admit I have been brutal and nasty when I didn't need to be. I am human like anyone else, and if it is any consolation, I do feel guilty in retrospect over some of the things I have said.

Re: Dawn

Posted: Tue Mar 28, 2017 8:38 pm
by Schol-R-LEA
From the "Shameless Self-Promotion" Dept.: I was wondering if you had read my essay Historical Notes on CISC and RISC yet. I don't know if what I said there is really relevant to what you are thinking of, but it may help you understand what I have been saying a little.

(And yes, the footnote at the end is more than a bit dismissive of OISC. Now, you will note that this was written in the singular. Why? Because in 2006 there were perhaps three such emulators around at all, that I knew of at least, the best known of which was one actually named OISC, and it was that specific language I was referring to. I had no idea that they had gained so much interest since then until you started posting here about Dawn OS. Before I first posted in this thread, I checked the Esolangs Wiki and found that there has been a lot more work on them since I last looked at the subject. I do find it interesting, and I think this interest has the potential to further the theoretical understanding of ISA design, but as I have said, interesting doesn't mean practical.)

Re: Dawn

Posted: Wed Mar 29, 2017 8:23 am
by mikegonta
Geri wrote:i started to create a bootable emulator to have a bootable dawn running on regular pcs.
but i will possibly not develop the bootable emulator.

Re: Dawn

Posted: Wed Mar 29, 2017 10:16 am
by Korona
Schol-R-LEA wrote:From the "Shameless Self-Promotion" Dept.: I was wondering if you had read my essay Historical Notes on CISC and RISC yet. I don't know if what I said there is really relevant to what you are thinking of, but it may help you understand what I have been saying a little.
Let me hijack this thread and comment on this: one thing that bugs me about your RISC/CISC article is that it disregards the advantages of Intel's modern CISC implementations: register renaming (modern RISC implementations need it too!), microcode (likewise), macro-fusion, micro-fusion, a smaller icache footprint, and so on. The article provides a good and detailed historical overview, but the conclusions "RISC is architecturally superior" and "we would be better off if the industry invested in RISC" are pretty shallow.

Today RISC is not inherently better (or worse) than CISC: The trade-off is between silicon size (and thus lower production cost and absolute energy consumption) for RISC and performance (and relative energy consumption) for CISC.

Re: Dawn

Posted: Wed Mar 29, 2017 1:23 pm
by Geri
mikegonta, how will it be RIP for my OS if i don't create a bootable x86 emulator myself, when the os is not even x86-related?