mlooney

NP Thur April 23 2020


1 hour ago, Darth Fluffy said:
2 hours ago, hkmaly said:

... hmmm  ... although, reading the details, probably the authors of CP/M are to blame ...

Gary Kildall. I don't think it is a fair criticism. What made sense years earlier for nascent 8-bit systems with limited memory was not intended to be ported to larger systems. It didn't make sense when it happened, other than from a marketing point of view. CP/M was perceived to be the professional microcomputer system at the time, so to market micros to businesses, IBM wanted something close to being CP/M. They basically did not understand their market. There is no way that their system should have won out. The Atari ST was a much better architecture, but it was built like crap. The Amiga was years ahead of the competition. So was the Mac. "Snatched defeat from the jaws of victory" comes to mind.

It was actually the first in a long line of victories of compatibility over quality. Paterson decided to implement a CP/M clone instead of a clean design, including compatibility at an unhealthy level. The market proved it was a good decision. Some of that compatibility remains in Windows even today; Windows Me was so compatible with DOS that it was seriously impacted by it ... and every decision to keep compatibility was good for the market, while every case where some part of compatibility was dropped was bad for the product that dropped it.

Similarly, Intel tried to develop a new architecture, IA-64, because they believed that the i386 architecture, which at that point was a superscalar emulation of a 32-bit extension of a 16-bit extension of an 8-bit processor, was too messy to extend. AMD developed their own 64-bit extension to that mess and the market rejoiced; Intel was forced to adopt that extension and build compatible processors ... and IA-64 remained obscure. Although, based on what I read about the architecture (I mean the actual documentation), it was a disaster anyway: it was impossible to program efficiently without a complicated compiler.

The inertia of software is enormous. It shouldn't be: a properly written program should be easy to port to a new system. But between the fact that a lot of programs are not written properly and the fact that a lot of companies keep their source code closed and are neither willing to do the port nor willing to let anyone else do it, any new system has a huge advantage if it can run already existing programs.

The price we pay for it is a much messier design of basically everything - except, of course, the niche projects that opted for a clean design ... and remain obscure. It's not just Microsoft: the GNU kernel is still not in shape to compete with Linux BECAUSE it was built on theoretically sound design principles, while Linux is mostly monolithic.

 


CP/M was designed originally to run using no more than 8 KB of RAM, which made sense when it was being used on systems that could have as little as 16 KB. Thus, it took a number of shortcuts in order to conserve every byte possible (the two-digit year representation that led to the Y2K problem being one example). It did not make sense to use such limitations on systems that could have a megabyte or more of memory, much less the 16-32 GB that is typical of current systems, yet we have all sorts of workarounds set up in order to allow backwards compatibility. Such workarounds more or less amount to the equivalent of requiring a Boeing 787 to be capable of using parts from a WWI biplane.
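
To make that two-digit shortcut concrete: a minimal C sketch (illustrative only, not actual CP/M code) of how a two-digit year field goes wrong at the century boundary.

    #include <stdio.h>

    /* Illustrative only, not actual CP/M code: a record in the spirit
     * of the two-digit-year shortcut. Storing 65 instead of 1965 saves
     * bytes per record, which mattered on a 16 KB machine. */
    struct record {
        unsigned char yy;   /* years since 1900 ... until it isn't */
    };

    static int age_in_years(unsigned char birth_yy, unsigned char now_yy)
    {
        return now_yy - birth_yy;   /* fine while both dates share a century */
    }

    int main(void)
    {
        struct record born = { 65 };                            /* 1965 */
        printf("age in 1999: %d\n", age_in_years(born.yy, 99)); /* 34, correct */
        printf("age in 2000: %d\n", age_in_years(born.yy, 0));  /* -65, the Y2K bug */
        return 0;
    }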

4 hours ago, hkmaly said:

It was actually the first in a long line of victories of compatibility over quality.

SCO Xenix was available at the time for the IBM compatibles, although it was not cheap.

 

4 hours ago, hkmaly said:

The inertia of software is enormous. It shouldn't be.

Remember "COBOL programmers needed for Y2K"?

 

4 hours ago, hkmaly said:

The GNU kernel is still not in shape to compete with Linux BECAUSE it was built on theoretically sound design principles, while Linux is mostly monolithic.

I'm not sure I follow this. GNU tends to be slow to roll out, and I'd blame delays and half-baked releases before I'd blame 'superior design' as a flaw. OTOH, if you're right, and 'superior design' is truly a cause of GNU non-acceptance, is the design actually superior? Maybe the theory is flawed.

 

 

11 hours ago, hkmaly said:

Exactly. If Mysticism is not true, then you can't quit the game, because there wouldn't be any "you" after you quit.

I suppose in normal circumstances when one quits something it is assumed that one will move on and do something else instead (or if circumstances permit and one is stubborn, one might try again). However, I don't see how the definition requires this.

If that's not what you're getting at, I'm even more confused. Just because someone is dead does not mean we can't use verbs when talking about what they did in the past, and we can also use verbs to refer to the actions of someone who is about to die, so why would this be any different?

16 hours ago, ijuin said:

CP/M was designed originally to run using no more than 8 KB of RAM, which made sense when it was being used on systems that could have as little as 16 KB. Thus, it took a number of shortcuts in order to conserve every byte possible (the two-digit year representation that led to the Y2K problem being one example). It did not make sense to use such limitations on systems that could have a megabyte or more of memory, much less the 16-32 GB that is typical of current systems, yet we have all sorts of workarounds set up in order to allow backwards compatibility. Such workarounds more or less amount to the equivalent of requiring a Boeing 787 to be capable of using parts from a WWI biplane.

Yes. However, note that average software can easily have more parts than a Boeing 787, if you don't count the software used ON the Boeing.

12 hours ago, mlooney said:
12 hours ago, Darth Fluffy said:

Remember "COBOL programmers needed for Y2K"?

Hell, some states needed COBOL programmers this year to fix their unemployment systems when the systems got stressed.

Yup. It's hard to find them, as the use of COBOL cripples the mind. Also, remember that a computer without COBOL and FORTRAN is like a chocolate cake without ketchup and mustard.

12 hours ago, Darth Fluffy said:
16 hours ago, hkmaly said:

The GNU kernel is still not in shape to compete with Linux BECAUSE it was built on theoretically sound design principles, while Linux is mostly monolithic.

I'm not sure I follow this. GNU tends to be slow to roll out, and I'd blame delays and half-baked releases before I'd blame 'superior design' as a flaw. OTOH, if you're right, and 'superior design' is truly a cause of GNU non-acceptance, is the design actually superior? Maybe the theory is flawed.

I'm specifically talking about the kernel. The GNU kernel is a microkernel: a theoretically superior concept, but not really practical when it was first proposed, as it brings overhead. Of course, we are getting close to the point where the overhead might be worth it ... that's why microkernels are appearing in real (as opposed to "hobby") systems now (iOS, for example, is said to use a microkernel, although one modified for better performance in a way which may make it not really a microkernel anymore).
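
To show where that overhead comes from, a toy C sketch (my caricature, not the Hurd's actual interfaces): in a monolithic kernel a service is a function call, while in a microkernel it is a message to a server process plus a reply back.

    #include <stdio.h>
    #include <unistd.h>

    /* Monolithic style: the "service" is just a function call. */
    static int service_direct(int x) { return x * 2; }

    int main(void)
    {
        /* Microkernel style, caricatured: the service lives in another
         * process and every request costs an IPC round trip. */
        int req[2], rep[2];
        pipe(req); pipe(rep);

        if (fork() == 0) {                       /* the "server" */
            int x;
            while (read(req[0], &x, sizeof x) == sizeof x) {
                x *= 2;                          /* same work as service_direct */
                write(rep[1], &x, sizeof x);
            }
            _exit(0);
        }

        int x = 21, y;
        write(req[1], &x, sizeof x);             /* send request */
        read(rep[0], &y, sizeof y);              /* block until the reply */
        printf("direct: %d, via IPC: %d\n", service_direct(21), y);

        close(req[1]);                           /* server sees EOF and exits */
        return 0;
    }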

7 hours ago, ChronosCat said:
19 hours ago, hkmaly said:

Exactly. If Mysticism is not true, then you can't quit the game, because there wouldn't be any "you" after you quit.

I suppose in normal circumstances when one quits something it is assumed that one will move on and do something else instead (or if circumstances permit and one is stubborn, one might try again). However, I don't see how the definition requires this.

If that's not what you're getting at, I'm even more confused. Just because someone is dead does not mean we can't use verbs when talking about what they did in the past, and we can also use verbs to refer to the actions of someone who is about to die, so why would this be any different?

"Using verbs" when talking about what a dead person is doing AFTER death is mysticism.

But, yes, the "quit" in the original statement MIGHT be understood in a way which makes it possible, if undesirable.

 

 
6 minutes ago, hkmaly said:

Yup. It's hard to find them, as the use of COBOL cripples the mind.

There are $ome good counterargument$ to that.

 

6 minutes ago, hkmaly said:

Also, remember that a computer without COBOL and FORTRAN is like a chocolate cake without ketchup and mustard.

... and a day without sunshine isn't 'day'.

 

6 minutes ago, hkmaly said:

I'm specifically talking about the kernel. The GNU kernel is a microkernel: a theoretically superior concept, but not really practical when it was first proposed, as it brings overhead. Of course, we are getting close to the point where the overhead might be worth it ... that's why microkernels are appearing in real (as opposed to "hobby") systems now (iOS, for example, is said to use a microkernel, although one modified for better performance in a way which may make it not really a microkernel anymore).

Well, you know what they say, no GNUs is good GNUs.

 

6 minutes ago, hkmaly said:

"Using verbs" when talking about what a dead person is doing AFTER death is mysticism.

... or a recording.

 

1 hour ago, hkmaly said:

Yup. It's hard to find them, as the use of COBOL cripples the mind. Also, remember that a computer without COBOL and FORTRAN is like a chocolate cake without ketchup and mustard.

RATFOR, or Rational Fortran, isn't that bad. Of course, it's not normal Fortran; it is like the early versions of C++, which "compiled" to ordinary C code that was then compiled to machine code. RATFOR compilers generate FORTRAN, which is then compiled to machine code.

2 hours ago, Darth Fluffy said:
2 hours ago, hkmaly said:

Yup. It's hard to find them, as the use of COBOL cripples the mind.

There are $ome good counterargument$ to that.

Well, sure. It's just that whores only sell their body.

2 hours ago, Darth Fluffy said:
2 hours ago, hkmaly said:

Also, remember that a computer without COBOL and FORTRAN is like a chocolate cake without ketchup and mustard.

... and a day without sunshine isn't 'day'.

... is this about freedom of information?

42 minutes ago, mlooney said:
2 hours ago, hkmaly said:

Yup. It's hard to find them, as the use of COBOL cripples the mind. Also, remember that a computer without COBOL and FORTRAN is like a chocolate cake without ketchup and mustard.

RATFOR, or Rational Fortran, isn't that bad. Of course, it's not normal Fortran; it is like the early versions of C++, which "compiled" to ordinary C code that was then compiled to machine code. RATFOR compilers generate FORTRAN, which is then compiled to machine code.

I suppose that Fortran 2003+ is not that bad either; it's just a niche language. The quote was presumably about Fortran 66.

Even COBOL has a newer version, actually ... the problem is that the majority of COBOL code is still written either in the old version, or AS IF it were written in the old version.

 

2 hours ago, hkmaly said:

Well, sure. It's just that whores only sell their body.

Have you ever actually used COBOL? It's archaic, but it is a procedural language, hence does pretty much what any other procedural language does. The main problem with it is that it is built around stuff that no one does any more. It can handle punchcard-centric I/O like nobody's business.

I've actually met Grace Hopper.

 

2 hours ago, hkmaly said:

... is this about freedom of information?

That was a thing, wasn't it? No, I had entirely forgotten that. Politics is more about 'stick it where the sun don't shine' than about sunshine.

 

2 hours ago, hkmaly said:

I suppose that Fortran 2003+ is not that bad either; it's just a niche language. The quote was presumably about Fortran 66.

Even COBOL has a newer version, actually ... the problem is that the majority of COBOL code is still written either in the old version, or AS IF it were written in the old version.

Yes, we need Visual COBOL ++.

 

 

 

 

1 hour ago, Darth Fluffy said:
4 hours ago, hkmaly said:

Well, sure. It's just that whores only sell their body.

Have you ever actually used COBOL? It's archaic, but it is a procedural language, hence does pretty much what any other procedural language does. The main problem with it is that it is built around stuff that no one does any more. It can handle punchcard-centric I/O like nobody's business.

No, I need my mind working. Although, considering Dijkstra was mainly complaining about GO TO and I'm fine despite using assembler ...

I did read the Wikipedia page, though. It mentions that while COBOL had something which could be used as subroutines, it lacked parameter passing, and due to how perversely the return was implemented, it didn't allow recursion. Sounds horrible enough.
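
To illustrate the recursion part in C (assumed mechanics, not actual COBOL semantics): give a "subroutine" a single static activation record instead of a stack frame, and a nested call overwrites the caller's state.

    #include <stdio.h>

    /* Illustrative sketch, not COBOL: emulate a language where a
     * subroutine's locals live in static storage (one copy for the
     * whole program) rather than on a call stack. */
    static int broken_factorial(int n)
    {
        static int arg, result;              /* ONE shared activation record */
        arg = n;
        if (arg <= 1)
            return 1;
        result = broken_factorial(arg - 1);  /* nested call clobbers arg */
        return arg * result;                 /* arg is now 1, not the original n */
    }

    int main(void)
    {
        printf("broken_factorial(5) = %d (should be 120)\n", broken_factorial(5));
        return 0;
    }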

1 hour ago, Darth Fluffy said:
4 hours ago, hkmaly said:

I suppose that Fortran 2003+ is not that bad either; it's just a niche language. The quote was presumably about Fortran 66.

Even COBOL has a newer version, actually ... the problem is that the majority of COBOL code is still written either in the old version, or AS IF it were written in the old version.

Yes, we need Visual COBOL ++.

No. We also don't need Borland COBOL ++ Builder.

8 hours ago, hkmaly said:

Yup. It's hard to find them, as the use of COBOL cripples the mind. Also, remember that a computer without COBOL and FORTRAN is like a chocolate cake without ketchup and mustard.

I'm specifically talking about the kernel. The GNU kernel is a microkernel: a theoretically superior concept, but not really practical when it was first proposed, as it brings overhead. Of course, we are getting close to the point where the overhead might be worth it ... that's why microkernels are appearing in real (as opposed to "hobby") systems now (iOS, for example, is said to use a microkernel, although one modified for better performance in a way which may make it not really a microkernel anymore).

The COmmon Business-Oriented Language was created to be legible to people who don't actually understand programming (e.g. middle managers and government bureaucrats), which resulted in it being heavy on verbiage and light on computational efficiency.

From Wikipedia:

" COBOL was designed in 1959 by CODASYL and was partly based on the programming language FLOW-MATIC designed by Grace Hopper. It was created as part of a US Department of Defense effort to create a portable programming language for data processing. It was originally seen as a stopgap, but the Department of Defense promptly forced computer manufacturers to provide it, resulting in its widespread adoption. "

FORmula TRANslation was created not for general programming purposes, but rather to aid in setting up heavy-duty computations for mathematical research purposes. A subset of FORTRAN was cleaned up and made into the basis of most early BASIC implementations.

As far as microkernels go, most microkernel implementations off-load a lot of micromanagement to the application programmer or end-user which most people would rather not bother with most of the time. It's great if you want hardware-level control of everything, but it's not great if you just want to "plug and play".

6 hours ago, hkmaly said:

No, I need my mind working.

The mind is a terrible thing, so let's get wasted.

 

Although, considering Dijkstra was mainly complaining about GO TO and I'm fine despite using assembler ...

Lulz

 

I did read the Wikipedia page, though. It mentions that while COBOL had something which could be used as subroutines, it lacked parameter passing, and due to how perversely the return was implemented, it didn't allow recursion.

In COBOL's heyday, subroutine handling was not at its best, and nothing much supported recursion.

LISP, one of the oldest computer languages, is an exception. LISP was not designed as a computer language; it was designed as an algorithm description language, and then, "Hey, we could implement this on a computer." I read a good case for "All languages aspire to be LISP".

 

No. We also don't need Borland COBOL ++ Builder.

Web enabled COBOLScript Query Language.

 

 Sounds horrible enough.

... As you open the door, you hear a guttural chatter. You can vaguely make out hideous figures, for the most part thankfully obscured by their cloaks. The COBOLDs cease their discussion, and turn to confront you. Several pairs of beady black eyes stare back at you. Roll for initiative. ...

... Your wizards find it difficult to make sense of the COBOLDs' verbose incantations. Counterspells are surprisingly ineffective. If only one of your number could follow this archaic tongue, and make sense of it! ...

...As the last of the COBOLDs is dispatched, you are grateful for the knowledge that this battle will not recur. ...

 

 

1 hour ago, Darth Fluffy said:

BAT DOUCH!!!

ADO BUTCH!!!

A DUTCH BO!!!

... the mind is a terrible thing ...

Sir, put down the dictionary and step away from the keyboard and nobody has to get hurt here.

1 hour ago, mlooney said:

Sir, put down the dictionary and step away from the keyboard and nobody has to get hurt here.

"You are nothing but a collection of facts about words! I will use your pages when the toilet paper runs out! And to start the grill! Your mother contained the words 'smelt' and 'elderberries'!".

Now, time to make lunch. I'm glad I averted violence.

1 minute ago, Darth Fluffy said:

I will use your pages when the toilet paper runs out!

Most book paper is not good for that sort of thing.  Among other issues, like being very rough on the butt,  it will jam up your pipes.  Don't ask why I know this.  

Just now, mlooney said:

Most book paper is not good for that sort of thing.  Among other issues, like being very rough on the butt,  it will jam up your pipes.  Don't ask why I know this.  

You need one of those old school unabridged dictionaries with the really thin soft pages that they had in libraries.

1 hour ago, Darth Fluffy said:

You need one of those old school unabridged dictionaries with the really thin soft pages that they had in libraries.

That kind of paper is thin but it's not soft.

14 hours ago, ijuin said:

The COmmon Business-Oriented Language was created to be legible to people who don't actually understand programming (e.g. middle managers and government bureaucrats), which resulted in it being heavy on verbiage and light on computational efficiency.

Being legible to people who don't understand programming is a nice but unobtainable goal. Computers are not able to understand natural language, so any similarity between COBOL and natural language was only superficial.

14 hours ago, ijuin said:

FORmula TRANslation was created not for general programming purposes, but rather to aid in setting up heavy-duty computations for mathematical research purposes.

Even today, the FORTRAN compiler has the most freedom in how aggressively it can optimize the code. No other compiler is allowed to switch the order of dimensions when going through a two-dimensional array ... and that's BY DEFAULT, with a standard "for" loop.
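
Roughly what that interchange buys, sketched in C for illustration (a C compiler generally may not do this on its own unless it can prove it changes nothing observable):

    #define N 1024

    /* Row-major traversal: consecutive iterations touch consecutive
     * memory, so caches are used well. */
    double sum_row_major(const double a[N][N])
    {
        double s = 0.0;
        for (int i = 0; i < N; i++)
            for (int j = 0; j < N; j++)
                s += a[i][j];
        return s;
    }

    /* Column-major traversal of the same C array: each access jumps
     * N * sizeof(double) bytes and thrashes the cache. Swapping the two
     * loops turns this into the fast version above -- the interchange a
     * Fortran compiler is free to perform on its own. */
    double sum_col_major(const double a[N][N])
    {
        double s = 0.0;
        for (int j = 0; j < N; j++)
            for (int i = 0; i < N; i++)
                s += a[i][j];
        return s;
    }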

14 hours ago, ijuin said:

As far as microkernels go, most microkernel implementations off-load a lot of micromanagement to the application programmer or end-user which most people would rather not bother with most of the time. It's great if you want hardware-level control of everything, but it's not great if you just want to "plug and play".

In a usable OS, the application programmer is not supposed to deal with the microkernel directly. Which is not really anything new: even on Linux, you don't call syscalls directly, you call library functions.

Of course, writing this main library for a microkernel IS a non-trivial task ...
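
A small Linux example of that layering (glibc's wrapper versus the raw kernel entry point; nothing microkernel-specific, just the same principle):

    #include <string.h>
    #include <sys/syscall.h>
    #include <unistd.h>

    int main(void)
    {
        const char msg[] = "hello via the libc wrapper\n";
        const char raw[] = "hello via the raw syscall\n";

        /* What programs normally do: call the library function. */
        write(STDOUT_FILENO, msg, strlen(msg));

        /* What the wrapper hides: the numbered kernel entry point.
         * (Linux-specific; syscall(2) is itself a small libc helper.) */
        syscall(SYS_write, STDOUT_FILENO, raw, strlen(raw));

        return 0;
    }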

9 hours ago, Darth Fluffy said:

In COBOL's heyday, subroutine handling was not at its best, and nothing much supported recursion.

That's not exactly a strong argument for continuing to use it NOW.

9 hours ago, Darth Fluffy said:

LISP, one of the oldest computer languages, is an exception. LISP was not designed as a computer language; it was designed as an algorithm description language, and then, "Hey, we could implement this on a computer." I read a good case for "All languages aspire to be LISP".

LISP was ahead of its time in a lot of areas. Its only real disadvantage is how many parentheses it uses. It's true that today every serious language implements a lot of LISP features and has the rest on its TODO list ...

6 hours ago, mlooney said:
6 hours ago, Darth Fluffy said:

I will use your pages when the toilet paper runs out!

Most book paper is not good for that sort of thing.  Among other issues, like being very rough on the butt,  it will jam up your pipes.  Don't ask why I know this.  

Also, remember that some sorts of ink are poisonous enough to affect you through the butt. (No, I don't know this from personal experience.)

 

1 hour ago, hkmaly said:

Being legible to people who don't understand programming is a nice but unobtainable goal. Computers are not able to understand natural language, so any similarity between COBOL and natural language was only superficial.

That is a non sequitur. Computers have difficulty understanding the ambiguities in natural language. They have no trouble with a well-structured declarative language that is also readable. There is no requirement to use obscure symbology. I'm looking at you, C.

If you don't think that's true, it's both because you are very used to C and its offspring, and because you drank the Kool-Aid.

 

1 hour ago, hkmaly said:

Even today, the FORTRAN compiler has the most freedom in how aggressively it can optimize the code. No other compiler is allowed to switch the order of dimensions when going through a two-dimensional array ... and that's BY DEFAULT, with a standard "for" loop.

My impression is that C and sons had the best optimization, because people put their efforts into it. You could be right though, I wouldn't know.

Most of the arguments for using Fortran that I've heard had to do with its mature library of scientific functions, and, similar to COBOL and business, it's what engineering departments have always used. As a matter of fact, though Fortran is the first language I had exposure to, the instructor was learning with us, so it was not in depth; the operations staff knew nothing. Then when I went to university, in my junior-year programming class for engineers, at a school that used Fortran exclusively in the engineering program, the instructor decided to teach Algol to his room full of engineering students, because he liked it better. Then, a couple of years later, when he came up for tenure, we raked him over the coals, and he was surprised. Pathetic.

Algol wasn't bad, just useless in context. You don't see it much because it more or less became Pascal. The concepts are widely copied.

 

1 hour ago, hkmaly said:

That's not exactly a strong argument for continuing to use it NOW.

That's not an argument at all for using it now. The argument for using it now is that it is already there. It's not like you're saying, "Hey, let's go write this new thing in COBOL"; COBOL tasks are all maintenance. It's like having an old clunker you'd love to replace, but it still runs and gets you where you're going, and you don't want to sink money into a new ride. ...

 

 

2 hours ago, Darth Fluffy said:
4 hours ago, hkmaly said:

Being legible to people who don't understand programming is a nice but unobtainable goal. Computers are not able to understand natural language, so any similarity between COBOL and natural language was only superficial.

That is a non sequitur. Computers have difficulty understanding the ambiguities in natural language. They have no trouble with a well-structured declarative language that is also readable. There is no requirement to use obscure symbology. I'm looking at you, C.

The "obscure symbology" drives home the point that if you don't understand the code you shouldn't try to write it.

Also, C doesn't use that much obscure symbology. Compared to C++. Or Perl.

2 hours ago, Darth Fluffy said:
4 hours ago, hkmaly said:

Even today, the FORTRAN compiler has the most freedom in how aggressively it can optimize the code. No other compiler is allowed to switch the order of dimensions when going through a two-dimensional array ... and that's BY DEFAULT, with a standard "for" loop.

My impression is that C and sons had the best optimization, because people put their efforts into it. You could be right though, I wouldn't know.

Doesn't help. Sure, people are putting a lot of effort into optimizing C, but some statements in C can't be optimized simply because the compiler can't prove the side effects won't matter.

In Fortran, it's explicitly stated that the compiler doesn't guarantee the side effects and is free to change them. Note that this is not actually a property of the language itself; it would be possible to just add a compiler flag changing that for C. It's just that in C, a lot of code relies on non-optimized behaviour and would break.
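
A concrete C instance of that proof problem, for illustration: possible pointer aliasing is exactly a side effect the compiler can't rule out, and C99's restrict is more or less that "compiler flag", applied per parameter.

    #include <stddef.h>

    /* Without restrict, the compiler must assume out may alias a or b,
     * so a[i] and b[i] have to be re-read after every store -- the
     * "can't prove the side effects won't matter" case. */
    void add_may_alias(size_t n, double *out, const double *a, const double *b)
    {
        for (size_t i = 0; i < n; i++)
            out[i] = a[i] + b[i];
    }

    /* restrict is the programmer promising "no aliasing here", which
     * frees the compiler to vectorize and reorder aggressively --
     * closer to what Fortran assumes about its arrays by default. */
    void add_no_alias(size_t n, double *restrict out,
                      const double *restrict a, const double *restrict b)
    {
        for (size_t i = 0; i < n; i++)
            out[i] = a[i] + b[i];
    }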

2 hours ago, Darth Fluffy said:

Most of the arguments for using Fortran that Ive heard had to do with it's mature library of scientific functions

Sure, that also helps.

2 hours ago, Darth Fluffy said:

Algol wasn't bad, just useless in context. You don't see it much because it more or less became Pascal. The concepts are widely copied.

Note that Pascal is the language best suited for teaching programming. Compared to Fortran, which is best suited for computing, and C, which is best suited for programming.

2 hours ago, Darth Fluffy said:
4 hours ago, hkmaly said:

That's not exactly a strong argument for continuing to use it NOW.

That's not an argument at all for using it now. The argument for using it now is that it is already there. It's not like you're saying, "Hey, let's go write this new thing in COBOL"; COBOL tasks are all maintenance. It's like having an old clunker you'd love to replace, but it still runs and gets you where you're going, and you don't want to sink money into a new ride. ...

The clunker is mostly rust, and you sink more money into keeping it running than it would cost to buy something new ... but hey, you've always driven it.

2 hours ago, mlooney said:
2 hours ago, Darth Fluffy said:

I'm looking at you, C.

I'll see your C and raise you Perl-style regular expressions.

Perl-style regular expressions are quite a logical extension of regular regular expressions toward context-free grammars. Actually using them as such is still a bad idea.
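
One illustration of that boundary (my example): a true regular expression can't match balanced parentheses, because that needs unbounded counting. Perl-style patterns get there through recursion; in C you'd just write the check directly.

    #include <stdbool.h>
    #include <stdio.h>

    /* Balanced parentheses need an unbounded counter, which puts the
     * language outside what classical regular expressions can match.
     * Perl reaches it with a recursive pattern such as
     * /^(\((?1)*\))*$/ -- powerful, and usually still a bad idea. */
    static bool balanced(const char *s)
    {
        long depth = 0;
        for (; *s != '\0'; s++) {
            if (*s == '(') depth++;
            else if (*s == ')' && --depth < 0) return false;
        }
        return depth == 0;
    }

    int main(void)
    {
        printf("%d %d\n", balanced("(()())"), balanced("(()"));  /* 1 0 */
        return 0;
    }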

2 hours ago, Darth Fluffy said:

True, but they are related.

How?

