mlooney

NP Thur March 19 2020


Nice, but Planck units are often either too small or too large for measuring things at human scales of size, time, and energy. We could use them as a base, and then define everyday units that are round multiples of Planck units, e.g. a length unit of 10^35 Planck lengths (about 1.6 meters), a mass unit of 10^5 Planck masses (about 2.2 grams), or a temperature unit of 10^-32 of the Planck temperature (about 1.4 kelvin).
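
For concreteness, here's a quick C sketch of that scaling, using the CODATA values for the Planck units (the round multiples are just the suggestions above, not any standard system):

    #include <stdio.h>

    int main(void) {
        /* CODATA values for the base Planck units */
        const double planck_length_m = 1.616255e-35; /* meters    */
        const double planck_mass_kg  = 2.176434e-8;  /* kilograms */
        const double planck_temp_k   = 1.416784e32;  /* kelvin    */

        /* Round powers of ten land near human scales: */
        printf("1e35 Planck lengths = %.2f m\n", 1e35 * planck_length_m);
        printf("1e5 Planck masses   = %.2f g\n", 1e5 * planck_mass_kg * 1e3);
        printf("1e-32 Planck temps  = %.2f K\n", 1e-32 * planck_temp_k);
        return 0;
    }

That lands at about 1.6 m, 2.2 g, and 1.4 K, close enough to everyday scales to be workable.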

1 hour ago, hkmaly said:

Actually, the speed of light IS the Planck unit for speed.

Well, OK, then.

 

1 hour ago, hkmaly said:

I don't think Jesus would like to see gambling in church, even bingo.

I don't think the Jesus portrayed in the Gospels would like most churches. How many times did he say, "Don't judge"; what are churches known for ... ?

 

1 hour ago, hkmaly said:

Of course they do, it's part of Mass.

They do a bit more than that.

 

1 hour ago, hkmaly said:

Not interchangeably, but not with clear borders either - it is possible I call the same thing by different names depending on context and/or mood, if it "fits" both.

Here, what the businesses call themselves is often a matter of branding. A lot of nicer restaurants have liquor licenses and serve drinks, with a small bar that is not the focus of the establishment.

 

1 hour ago, hkmaly said:

Although if I focused on it, I would agree with your "google" paragraph and with the point about a bar possibly being just a section. Also, you can have a minibar, which is basically a fridge or other piece of furniture with alcohol; you can't have a minipub.

That's a different usage of the word bar: bar as a tavern-like business vs bar as the place where you store your liquor to serve guests or make your evening drinks, for those of you who are that organized and particular.

 

1 hour ago, hkmaly said:

So what? Babbage was already working on his computer in 1820, and the first programs appeared no later than 1843.

... Which he never finished; and Ada Lovelace wrote programs that never ran in her lifetime. A good idea, too far ahead of its time to implement; it could have been done had there been a compelling cultural focus. As it was, he pushed the state of the art of machining technology in Britain, but he could not retain the people he trained; they were in too much demand.

 

1 hour ago, hkmaly said:

I would argue with that, but I also don't consider Windows useful software, so ...

I use it all the time, it pays the bills, but I don't like it.

 

1 hour ago, hkmaly said:

You do realize that, for example, all games are event-driven? Also, it's hard to avoid event-driven programming when doing anything with a GUI.

I did mention device drivers, that covers a lot of ground.
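
For anyone following along, the event-driven pattern under discussion boils down to a loop like this C sketch; Event, next_event(), and the handlers are made-up stand-ins for whatever the OS, game engine, or GUI toolkit actually provides.

    #include <stdbool.h>

    /* Hypothetical stand-ins for what the platform provides. */
    typedef enum { EVT_KEY, EVT_MOUSE, EVT_QUIT } EventType;
    typedef struct { EventType type; int data; } Event;

    extern Event next_event(void);   /* blocks until an event is queued */
    extern void handle_key(int key);
    extern void handle_mouse(int pos);

    void event_loop(void) {
        bool running = true;
        while (running) {
            Event e = next_event();  /* nothing happens except in response */
            switch (e.type) {
                case EVT_KEY:   handle_key(e.data);   break;
                case EVT_MOUSE: handle_mouse(e.data); break;
                case EVT_QUIT:  running = false;      break;
            }
        }
    }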

 

1 hour ago, hkmaly said:

When I was in my formative years, I got something they called pizza that used ketchup. It made me avoid pizza for more than ten years, until I found out that real pizza uses tomato sauce.

Ugh. I've had that. Tastes like, well, ketchup on pizza bread. I've had good pizza that was not covered in tomato-based pizza sauce; if the person making it knows what they're doing, it can be good, though arguably it isn't really pizza. But ketchup does not work.

 

 

 

1 hour ago, Darth Fluffy said:
3 hours ago, hkmaly said:

I don't think Jesus would like to see gambling in church, even bingo.

I don't think the Jesus portrayed in the Gospels would like most churches. How many times did he say, "Don't judge"; what are churches known for ... ?

Good point.

1 hour ago, Darth Fluffy said:
3 hours ago, hkmaly said:

Of course they do, it's part of Mass.

They do a bit more than that.

As I said, if they drank only as part of Mass and STILL got drunk, they would be doing it very incorrectly.

Although it reminds me of a joke about a novice priest who was recommended vodka for stage fright ...

1 hour ago, Darth Fluffy said:
3 hours ago, hkmaly said:

Not interchangeably, but not with clear borders either - it is possible I call the same thing by different names depending on context and/or mood, if it "fits" both.

Here, what the businesses call themselves is often a matter of branding. A lot of nicer restaurants have liquor licenses and serve drinks, with a small bar that is not the focus of the establishment.

What the businesses call themselves ALSO has little influence on what I call them :)

1 hour ago, Darth Fluffy said:
3 hours ago, hkmaly said:

You do realize that, for example, all games are event-driven? Also, it's hard to avoid event-driven programming when doing anything with a GUI.

I did mention device drivers, that covers a lot of ground.

... no, it doesn't. Device drivers are relatively small pieces of software, usually part of the kernel.

1 hour ago, Darth Fluffy said:
3 hours ago, hkmaly said:

So what? Babbage was already working on his computer in 1820, and the first programs appeared no later than 1843.

... Which he never finished; and Ada Lovelace wrote programs that never ran in her lifetime. A good idea, too far ahead of its time to implement; it could have been done had there been a compelling cultural focus. As it was, he pushed the state of the art of machining technology in Britain, but he could not retain the people he trained; they were in too much demand.

Yeah. Pity.

4 hours ago, hkmaly said:

What the businesses call themselves ALSO has little influence on what I call them :)

It's good to have the name correct if you are meeting friends, who may be using an app to find it.

 

4 hours ago, hkmaly said:

... no, it doesn't. Device drivers are relatively small pieces of software, usually part of the kernel.

Device drivers are small because they are focused on a specific task. That task bundles up the messiness of interfacing with the external real world. Unlike games, where you can generally fudge it if you skip a frame, interfacing with hardware has to catch everything while competing for attention with other hardware. It handles all of the hard issues. Your device drivers are essentially competing with each other, and the goal is that everyone wins; if one dominates, they all lose.

The kernel might have device drivers, but the manufacturer of the hardware should provide more tailored ones. They would work at a low level, plugging into the kernel, and might be considered part of it.

Re: your dislike for Fenestrations: Fenestrations is device-driver hell.

 

4 hours ago, hkmaly said:

Yeah. Pity.

<wistfully considers, "How much difference would it have really made?"> Odds are we would have had more accurate math tables published; that was the goal.

Would more than one have been built? From our perspective, we'd think so, but they would not have appreciated what they had, early on, and the time to build each would have been daunting.

I've worked with IBM punch cards long ago, before terminals were common. They were used as input, and in some cases output and long-term storage, for computers, but they had their own history; the various devices used to manipulate the cards included sorting machines. Since the cards as storage and the machinery to manipulate them could branch, making decisions, it seems the assemblage was Turing complete and could be used to compute. I don't know the history of this in detail, but the Hollerith cards were developed in the late 1800s for census processing, so I'm thinking that as computing power, it was more than hypothetical. In the first part of the twentieth century, these cards and the associated machinery were used in database-style tabulations, enough so that IBM was already a thriving concern when relay and electronic computation became available.

I think, at best, the Analytical Engine would have introduced a similar card-based information culture earlier. It would have done a similar job as the card punches and sorters, in a fairly messy, steam-powered fashion. It would be interesting to see; I hope someone makes the movie version.

Great impact? Eh, well, the telegraph, railroads, machine guns, balloons, and cartridge rounds were available during the American Civil War. Did they matter? Yes, to varying degrees. The intelligence assets were quickly exploited; the telegraph changed the pace at which a response could be deployed by orders of magnitude. The balloon allowed battle updates within minutes when coupled with the telegraph. Railroads moved men and supplies much more efficiently, likely saving the North. Cartridge rounds were much faster to load and probably decided the ultimate outcome, in that the North was much more able to replace archaic weapons and field modern ones; they reached partial deployment vs the South barely deploying. Many Southerners carried family heirlooms to battle. The (hand-cranked) machine gun was poorly understood and relegated to artillery, where it was not competitive with traditional designs.

The point is, it's hard to say what the impact of a specific technology would have been in the short term. In 20/20 hindsight, Ada Lovelace's accomplishments look amazing; in the 1840s, had the machine been built, <shrug>, I have doubts she would have been appreciated in her time. It's like finding a $1000 bill, but you've never seen a bank note before. "Hey, look at the fancy toilet paper."

 

 

 

3 hours ago, Darth Fluffy said:

Device drivers are small because they are focused on a specific task.

Back in the late '80s/early '90s I wrote device drivers for a weird fax modem. Unlike most modems, it was memory-mapped, which makes for a strange driver.

2 hours ago, mlooney said:

Back in the late '80s/early '90s I wrote device drivers for a weird fax modem. Unlike most modems, it was memory-mapped, which makes for a strange driver.

That sounds like the host would have had memory-mapped I/O as well, not just the peripheral. Was that still current then? I recall a lot of that kind of funk from the primordial micro days in the late '70s/early '80s, but not from as recently as a decade later.

5 minutes ago, Darth Fluffy said:

That sounds like the host would have had memory-mapped I/O as well, not just the peripheral. Was that still current then? I recall a lot of that kind of funk from the primordial micro days in the late '70s/early '80s, but not from as recently as a decade later.

It was an ISA card that was made for IBM-type machines, so yeah, sorta like video cards, it was memory-mapped. You had to set its location in memory with jumpers on the board. Weird-ass thing.
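
Driving it looked roughly like this C sketch. The base address, register offsets, and command values below are made up, and a real DOS-era driver would use a segment:offset far pointer rather than the flat pointer shown; the point is that every access is a memory read or write instead of an IN/OUT port instruction.

    #include <stdint.h>

    /* Hypothetical jumper-selected window in the upper memory area. */
    #define CARD_BASE    ((volatile uint8_t *)0xD0000)

    #define REG_STATUS   0x00   /* offsets into the card's memory window */
    #define REG_COMMAND  0x01
    #define REG_DATA     0x02

    #define STATUS_READY 0x01
    #define CMD_TRANSMIT 0x01

    void card_send_byte(uint8_t b) {
        while (!(CARD_BASE[REG_STATUS] & STATUS_READY))
            ;                               /* volatile: re-read each pass */
        CARD_BASE[REG_DATA]    = b;         /* plain store, not an OUT */
        CARD_BASE[REG_COMMAND] = CMD_TRANSMIT;
    }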

 

9 hours ago, Darth Fluffy said:

<wistfully considers, "How much difference would it have really made?"> Odds are we would have had more accurate math tables published; that was the goal.

[snip]

The point is, it's hard to say what the impact of a specific technology would have been in the short term. In 20/20 hindsight, Ada Lovelace's accomplishments look amazing; in the 1840s, had the machine been built, <shrug>, I have doubts she would have been appreciated in her time. It's like finding a $1000 bill, but you've never seen a bank note before. "Hey, look at the fancy toilet paper."

While far from a serious take on the subject, I feel like I must mention Lovelace and Babbage.

10 hours ago, Darth Fluffy said:
15 hours ago, hkmaly said:

What the businesses call themselves ALSO has little influence on what I call them :)

It's good to have the name correct if you are meeting friends, who may be using an app to find it.

For that case, the proper way to refer to them is something like "the pub named Indian Restaurant".

(Also, all my friends know the pubs around my home better than I do.)

10 hours ago, Darth Fluffy said:
15 hours ago, hkmaly said:

... no, it doesn't. Device drivers are relatively small pieces of software, usually part of the kernel.

Device drivers are small because they are focused on a specific task. That task bundles up the messiness of interfacing with the external real world. Unlike games, where you can generally fudge it if you skip a frame, interfacing with hardware has to catch everything while competing for attention with other hardware. It handles all of the hard issues. Your device drivers are essentially competing with each other, and the goal is that everyone wins; if one dominates, they all lose.

Sure, you can skip a frame, but users will be angry if you miss a keypress ... even if they are already holding five other keys.

Yeah, in the good old times™ device drivers were struggling to catch everything and competing with each other ... nowadays, most hardware is capable of buffering events, and the rest is so slow compared to the CPU that it doesn't matter. And then you have DMA: you don't need to worry that your sound card runs out of music to play, as you can just give it the start of a several-megabyte buffer and it will fetch it from memory itself; similarly the disk and the network card, not to speak of the graphics card, which usually has its own memory.
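
Roughly, that handoff looks like this C sketch: the driver fills in one descriptor and pokes two registers, and the card streams the whole buffer by itself. The register names and descriptor layout are invented for illustration, not any real sound card's interface.

    #include <stddef.h>
    #include <stdint.h>

    /* Invented descriptor layout and card registers. */
    typedef struct {
        uint32_t buf_phys;   /* physical address of the sample buffer */
        uint32_t len;        /* buffer length in bytes */
    } DmaDescriptor;

    extern volatile uint32_t *DMA_DESC_REG;
    extern volatile uint32_t *DMA_START_REG;

    void start_playback(uint32_t buffer_phys, size_t bytes) {
        static DmaDescriptor desc;
        desc.buf_phys = buffer_phys;
        desc.len      = (uint32_t)bytes;

        *DMA_DESC_REG  = (uint32_t)(uintptr_t)&desc; /* point card at it */
        *DMA_START_REG = 1;  /* card now fetches samples on its own; the  */
                             /* CPU is free until the completion interrupt */
    }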

10 hours ago, Darth Fluffy said:

The kernel might have device drivers, but the manufacturer of the hardware should provide more tailored ones. They would work at a low level, plugging into the kernel, and might be considered part of it.

Yes, even manufacturer-provided drivers are loaded into the kernel and run inside it.

Also, no, manufacturers of common hardware are not expected to provide their own drivers; they are expected to follow the standards. ALL USB mass-storage devices should operate the same way, with no need for a specific driver for each one. Of course, manufacturers of something complicated like a graphics card should provide better drivers ... and sometimes they even do ... but that's more the exception.

10 hours ago, Darth Fluffy said:
15 hours ago, hkmaly said:

Yeah. Pity.

<wistfully considers, "How much difference would it have really made?"> Odds are we would have had more accurate math tables published; that was the goal.

Would more than one have been built? From our perspective, we'd think so, but they would not have appreciated what they had, early on, and the time to build each would have been daunting.

Right, true. After all, computers were underestimated even much later.

On 8/8/1999 at 0:00 AM, Danny Hillis said:

I went to my first computer conference at the New York Hilton about 20 years ago. When somebody there predicted the market for microprocessors would eventually be in the millions, someone else said, "Where are they all going to go? It's not like you need a computer in every doorknob!"

Years later, I went back to the same hotel. I noticed the room keys had been replaced by electronic cards you slide into slots in the doors.

There was a computer in every doorknob.
        -- Danny Hillis, ca. 1999

10 hours ago, Darth Fluffy said:

I think, at best, the Analytical Engine would have introduced a similar card-based information culture earlier. It would have done a similar job as the card punches and sorters, in a fairly messy, steam-powered fashion. It would be interesting to see; I hope someone makes the movie version.

I'm surprised no one has yet. There definitely were books about it, from William Gibson (The Difference Engine, with Bruce Sterling).

It IS possible that this would have made true computers arrive a little faster ... but maybe not.

10 hours ago, Darth Fluffy said:

The point is, it's hard to say what the impact of a specific technology would have been in the short term. In 20/20 hindsight, Ada Lovelace's accomplishments look amazing; in the 1840s, had the machine been built, <shrug>, I have doubts she would have been appreciated in her time. It's like finding a $1000 bill, but you've never seen a bank note before. "Hey, look at the fancy toilet paper."

Yes, it's definitely hard to say ... but that goes both ways. There are inventions where no one would have predicted how much and how quickly they would change the world.

 

3 hours ago, ChronosCat said:

While far from a serious take on the subject, I feel like I must mention Lovelace and Babbage.

"Suddenly there is a gaping hole in my life that I was hitherto unaware." - she is obviously unaware of banal Presidential Tweets; that would be the bucket of ice water she needed.

There is a factual error: Babbage did complete a working portion of his non-programmable Difference Engine.

 

1 hour ago, hkmaly said:

Yeah, in the good old times™ device drivers were struggling to catch everything and competing with each other ... nowadays, most hardware is capable of buffering events, and the rest is so slow compared to the CPU that it doesn't matter. And then you have DMA: you don't need to worry that your sound card runs out of music to play, as you can just give it the start of a several-megabyte buffer and it will fetch it from memory itself; similarly the disk and the network card, not to speak of the graphics card, which usually has its own memory.

Yes, even manufacturer-provided drivers are loaded into the kernel and run inside it.

Also, no, manufacturers of common hardware are not expected to provide their own drivers; they are expected to follow the standards. ALL USB mass-storage devices should operate the same way, with no need for a specific driver for each one. Of course, manufacturers of something complicated like a graphics card should provide better drivers ... and sometimes they even do ... but that's more the exception.

USB is a good example of what you are talking about, so, yeah, I can see what you are saying. My impression is that manufacturers still try to distinguish their products with extra bells and whistles which only their drivers can activate, but I could be wrong.

 

1 hour ago, hkmaly said:

I'm surprised no one has yet. There definitely were books about it, from William Gibson (The Difference Engine, with Bruce Sterling).

It IS possible that this would have made true computers arrive a little faster ... but maybe not.

Yes, it's definitely hard to say ... but that goes both ways. There are inventions where no one would have predicted how much and how quickly they would change the world.

I am aware of Gibson's work, but have never read any of it.

Back in the primordial days, one of the "how computers work" columns in some magazine was cast in the framework of Sherlock Holmes solving his latest crime of interest using the Analytical Engine. The lessons of the article took the format of Holmes explaining to Watson what he was doing. It was a clever presentation; I think it may have been bundled as a loose volume. No great lessons, and apparently forgotten.

 

1 hour ago, hkmaly said:

Yes, it's definitely hard to say ... but that goes both ways. There are inventions where no one would have predicted how much and how quickly they would change the world.

I doubt if anyone foresaw how much we'd be living our lives online. Now ++, courtesy of COVID-19.

 

8 hours ago, Darth Fluffy said:

That sounds like the host would have had memory-mapped I/O as well, not just the peripheral. Was that still current then? I recall a lot of that kind of funk from the primordial micro days in the late '70s/early '80s, but not from as recently as a decade later.

Oh good grief, 1970s desktops had like 8-16 kilobytes of RAM. The whole set of OS and drivers had to run mostly from ROM, because they could only be allowed to use about 4 kilobytes of the working memory.

2 hours ago, Darth Fluffy said:
4 hours ago, hkmaly said:

Yes, even manufacturer-provided drivers are loaded into the kernel and run inside it.

Also, no, manufacturers of common hardware are not expected to provide their own drivers; they are expected to follow the standards. ALL USB mass-storage devices should operate the same way, with no need for a specific driver for each one. Of course, manufacturers of something complicated like a graphics card should provide better drivers ... and sometimes they even do ... but that's more the exception.

USB is a good example of what you are talking about, so, yeah, I can see what you are saying. My impression is that manufacturers still try to distinguish their products with extra bells and whistles which only their drivers can activate, but I could be wrong.

They claim to, but remember that it's in their interest, as it can raise the price of the device.

Note that claiming so costs significantly less than actually building something useful into the product.

2 hours ago, Darth Fluffy said:
4 hours ago, hkmaly said:

Yes, it's definitely hard to say ... but that goes both ways. There are inventions where no one would have predicted how much and how quickly they would change the world.

I doubt if anyone foresaw how much we'd be living our lives online. Now ++, courtesy of COVID-19.

That would be a good example. And you only need to look at sci-fi to see how surprising some stuff was.

8 hours ago, hkmaly said:

They claim to, but remember that it's in their interest, as it can raise the price of the device.

Note that claiming so costs significantly less than actually building something useful into the product.

... almost sounds like a Microsoft rant.

 

9 hours ago, ijuin said:

Oh good grief, 1970s desktops had like 8-16 kilobytes of RAM. The whole set of OS and drivers had to run mostly from ROM, because they could only be allowed to use about 4 kilobytes of the working memory.

I believe the original bottom-end TRS-80 had 4K of memory; 16K was the top end, if I recall correctly. I never owned one. It was tempting, in the day: you could have a working computer for $500. I had friends who had spent far more on kits that they barely got working.

My first computer was an Exidy Sorcerer. It was fairly well designed for its day, but corners were cut on the hardware, and Exidy had little commitment to it. 48K of memory, although that was a bump; maybe 16K or 32K was standard? Why not 64K? Well, the upper 16K was used for several machine functions and ROM packs. I bought chips to make it 64K, then realized it would be counterproductive to install them.

They all had weird graphics at the time, if they had graphics at all. My Exidy only had character graphics, but I could define the upper set of characters, down to the pixel. The TRS-80 did something similar with its upper set: not definable, but predefined blocky graphic elements, blocks of white or gray on a 2x3 grid. I think memory-mapped video was standard at the time.
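
From memory, the TRS-80 scheme packed that 2x3 grid into the six low bits of a character code, something like this C sketch; the base code and bit order are as I recall them and may be off, so treat it as illustrative only.

    #include <stdint.h>

    /* pixels[row][col]: row 0 = top, col 0 = left; nonzero = lit block */
    uint8_t block_char(const int pixels[3][2]) {
        uint8_t code = 0x80;   /* graphics characters started at 128 */
        int bit = 0;
        for (int row = 0; row < 3; row++)
            for (int col = 0; col < 2; col++, bit++)
                if (pixels[row][col])
                    code |= (uint8_t)(1 << bit);
        return code;           /* 128..191 */
    }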

 

40 minutes ago, Darth Fluffy said:

They all had weird graphics at the time, if they had graphics at all. My Exidy only had character graphics, but I could define the upper set of characters, down to the pixel. The TRS-80 did something similar with its upper set: not definable, but predefined blocky graphic elements, blocks of white or gray on a 2x3 grid. I think memory-mapped video was standard at the time.

Memory-mapped video was standard on the PC until expanded VGA cards became standard. I suspect that if, for some reason, you drop back to real character-based mode, which is tricky to do on a modern computer, it's still memory-mapped.
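
The character-mode case is the classic example: the 80x25 color text screen lives at physical address 0xB8000, two bytes per cell (character, then color attribute), so writing to that memory IS drawing on the screen. A minimal C sketch, assuming a flat mapping as in a toy kernel; under a modern OS you would have to map that physical range first.

    #include <stdint.h>

    #define TEXT_VRAM ((volatile uint16_t *)0xB8000)
    #define COLS 80

    void put_char_at(int row, int col, char c, uint8_t attr) {
        /* high byte = attribute, low byte = ASCII character */
        TEXT_VRAM[row * COLS + col] = ((uint16_t)attr << 8) | (uint8_t)c;
    }

    void hello(void) {
        const char *msg = "Hello, VGA";
        for (int i = 0; msg[i]; i++)
            put_char_at(0, i, msg[i], 0x1F);  /* white on blue */
    }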

 

1 hour ago, mlooney said:

Memory-mapped video was standard on the PC until expanded VGA cards became standard. I suspect that if, for some reason, you drop back to real character-based mode, which is tricky to do on a modern computer, it's still memory-mapped.

The early PC was pathetic, and did not deserve the success it had. IBM had no understanding of their market (I'm looking at you, CGA), and picked the worst 16-bit chip available, largely because it was the earliest, and it wasn't even 16-bit in the address space. Fortunately, we seem to have evolved beyond that early legacy.

I don't know that "real mode" is a thing any more, in the hardware. There are many good reasons to have abandoned it, and little call to run the old software as such. I suspect it died when XP went away.

Emulators like DOSBox still have to deal with all the crappy decisions, like the various memory-extension schemes.

5 minutes ago, Darth Fluffy said:

I don't know that "real mode" is a thing any more, in the hardware. There are many good reasons to have abandoned it, and little call to run the old software as such. I suspect it died when XP went away.

Try Win9x for when "real DOS mode" went away. XP's DOS mode was as much an emulation as DOSBox is, and had the same issues with not being able to run certain types of software.

8 hours ago, mlooney said:

Try Win9x for when "real DOS mode" went away. XP's DOS mode was as much an emulation as DOSBox is, and had the same issues with not being able to run certain types of software.

In some sense, I miss '98. It ran everything I had at the time, mostly. XP was not heinous; it still ran most stuff; backward compatibility was still a thing. Win 10 reached new depths of breaking existing software.

The death of real mode is not a bad thing. Frankly, it needs a stake through its heart, just to be sure.

DOSBox does a good job, but not as good as XP. I, among many, am still a fan of MOO2 and have played it within the last year. All of the bells and whistles worked under XP; a couple of them are broken under DOSBox. It is getting better. I can attack Antares now.

On 4/1/2020 at 5:40 PM, mlooney said:
On 4/1/2020 at 4:56 PM, Darth Fluffy said:

They all had weird graphics at the time, if they had graphics at all. My Exidy only had character graphics, but I could define the upper set of characters, down to the pixel. The TRS-80 did something similar with its upper set: not definable, but predefined blocky graphic elements, blocks of white or gray on a 2x3 grid. I think memory-mapped video was standard at the time.

Memory-mapped video was standard on the PC until expanded VGA cards became standard. I suspect that if, for some reason, you drop back to real character-based mode, which is tricky to do on a modern computer, it's still memory-mapped.

I'm sure modern cards can do the legacy memory-mapped graphics as well. You know, the one where you could fit 320x200 256-color pixels in the memory ... or have more pixels, but with some weird, hard-to-understand tricks for switching banks.
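
That 256-color mode works the same way: one byte per pixel starting at physical address 0xA0000, so plotting is a single array store. A minimal sketch, again assuming a flat mapping:

    #include <stdint.h>

    #define MODE13_VRAM ((volatile uint8_t *)0xA0000)

    void put_pixel(int x, int y, uint8_t color) {
        MODE13_VRAM[y * 320 + x] = color;   /* 320 bytes per scanline */
    }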

On 4/1/2020 at 7:32 PM, Darth Fluffy said:

I don't know that "real mode" is a thing any more, in the hardware. There are many good reasons to have abandoned it, and little call to run the old software as such. I suspect it died when XP went away.

Oh yes it is. Intel is talking about dropping support for legacy BIOS on motherboards, but the CPU itself still boots in real mode.
("Legacy" BIOS basically works by starting the CPU in real mode, switching to "protected" mode to do hardware/memory initialization, then switching back to real mode to boot in it; the OS then switches to "protected" mode again. UEFI keeps the CPU in "protected" mode.)
 

On 4/1/2020 at 7:40 PM, mlooney said:

Try Win9x for when "real DOS mode" went away. XP's DOS mode was as much an emulation as DOSBox is, and had the same issues with not being able to run certain types of software.

Actually WinMe, but that's sort of 9x. However, if you had EMM386 or any other memory extender, you were already running in emulation (V86 mode) under DOS ... it just had wider compatibility, because there was no multitasking and no protection of I/O, so more assumptions from real mode still held. AND it had VCPI, which is a way to give all access to an application (think a game), hoping that the game won't overwrite your memory, and to get all the access back when it finishes running. You can't do THAT from any OS which either supports multitasking or has basic security.

On 4/2/2020 at 4:46 AM, Darth Fluffy said:

In some sense, I miss '98. It ran everything I had at the time, mostly. XP was not heinous; it still ran most stuff; backward compatibility was still a thing. Win 10 reached new depths of breaking existing software.

The funny thing is that in Win 10, the reason for those new depths isn't hardware. From a hardware standpoint, it runs the same as XP. They are only changing software, possibly including throwing out again some old legacy compatibility layers.

On 4/2/2020 at 4:46 AM, Darth Fluffy said:

The death of real mode is not a bad thing. Frankly, it needs a stake through its heart, just to be sure.

Intel is working on that. However, I don't think they will ever remove V86 mode. The CPUs are now so complicated it wouldn't really help, and keeping JUST the most modern 64-bit part would be too much.

Also, they tried multiple times to market alternative CPUs not compatible with the x86 mess, and it was always a disaster.

(For reference, the CPU has real mode, 16-bit protected mode, V86 mode, 32-bit protected mode, 64-bit protected mode, hypervisor mode, and a system management mode which is supposed to be used for power management. Oh, and the take-control-over-the-computer-remotely AMT mode, if it's a real Intel and not AMD.)

On 4/2/2020 at 4:46 AM, Darth Fluffy said:

DOSBox does a good job, but not as good as XP. I, among many, am still a fan of MOO2 and have played it within the last year. All of the bells and whistles worked under XP; a couple of them are broken under DOSBox. It is getting better. I can attack Antares now.

I suspect those missing things are not related to the CPU but to the GPU.

1 hour ago, hkmaly said:

Also, they tried multiple times to market alternative CPUs not compatible with the x86 mess, and it was always a disaster.

Uh, ANY other non x86 of the era? The Amiga? The Mac?

On 4/3/2020 at 7:59 AM, Darth Fluffy said:
On 4/3/2020 at 6:21 AM, hkmaly said:

Also, they tried multiple times to market alternative CPUs not compatible with the x86 mess, and it was always a disaster.

Uh, ANY other non x86 of the era? The Amiga? The Mac?

To be more specific, those alternatives were such big disasters that most people don't know they exist. OK, they were also mostly aimed at the professional/server market, not personal PCs.

iAPX432. 80960. 80860. XScale. IA-64 alias Itanium alias Merced.

4 hours ago, hkmaly said:

To be more specific, those alternatives were such big disasters that most people don't know they exist. OK, they were also mostly aimed at the professional/server market, not personal PCs.

iAPX432. 80960. 80860. XScale. IA-64 alias Itanium alias Merced.

IT is not largely driven by technical excellence; it is often moved by marketing. Otherwise, would we even have Microsoft today?

Zilog had a good thing going in the 8-bit era: a clean design, a superset of the original mainstay of the serious micro, and it kept them going for a few years. The Z8000 was a good design, much better than the 8086, but Zilog was not a semiconductor powerhouse.

The 68000 from Motorola was similar, and Motorola had the chops to sell it; the Mac of the era and the Amiga were no slouches. Both were years ahead of the 8086 and Microsoft operating systems. The failure of both to dominate the field is a failure of marketing.

Intel would be dead as well, except they spent (and presumably still spend) an enormous amount on research, and eventually caught up and surpassed. And they did some things well; I like their graceful overheating mitigation through throttling and shutdown vs AMD's "we'll do it cheap" pop-the-IC approach.

That all said, it has to be admitted that one of the reasons, maybe the only one, that the 8086 was chosen was that it was first and it was available. Intel has always been a leader.

I love the iAPX432. I don't know what to call that; not so much a failure of marketing, rather maybe an abandonment. The damn chip set was decades ahead of its time. And that hints at the problems: a chip set; I don't recall how many chips, but I'm thinking around four, and you needed a separate boot and configuration processor. A totally non-standard paradigm, too. Nonstandard architectures have come, had their day, and gone (database machines like Britton Lee and Teradata, and Lisp machines). Maybe it will never be a thing.

Same time frame, Intel also had a beautiful single-chip signal-processing microcomputer. I don't remember the number. Everything, including analog I/O, on one chip. I suspect, given the clock speeds of the day, that it was too slow for anything we'd consider useful. I know I went looking for it a few years later, and they had already dropped it.

 

 

On 3/31/2020 at 5:22 AM, hkmaly said:
On 3/31/2020 at 3:47 AM, Darth Fluffy said:
On 3/31/2020 at 2:07 AM, hkmaly said:

I don't think Jesus would like to see gambling in church, even bingo.

I don't think the Jesus portrayed in the Gospels would like most churches. How many times did he say, "Don't judge"; what are churches known for ... ?

Good point.

If I recall correctly, you have yourself to blame that I now know about this webcomic.
