hkmaly

Story, Wednesday, Oct 23, 2019


7 hours ago, hkmaly said:

I haven't done this personally, but I touched it in school: there are basically three methods of computing graphics: "manually", with specific equations for each thing you want to do; with matrices; and with quaternions.

Manual methods get hard to write, debug and modify. They might in some cases be the most efficient to compute, but with graphics cards specifically optimizing the general methods, not so often.

Compared to matrices, quaternions are smaller (4 numbers instead of 9), allow smooth rotations more easily and are more resistant to rounding errors.
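The quoted size comparison can be sketched in code. This is a hedged illustration, not anything from the thread: the function name `quat_rotate` and the `(w, x, y, z)` tuple layout are assumptions, but the rotation itself is the standard q·v·q⁻¹ product, using only the four stored numbers.

```python
import math

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z) via q * v * q^-1."""
    w, x, y, z = q
    vx, vy, vz = v
    # t = q * (0, v): quaternion product with a pure-vector quaternion
    tw = -x * vx - y * vy - z * vz
    tx = w * vx + y * vz - z * vy
    ty = w * vy + z * vx - x * vz
    tz = w * vz + x * vy - y * vx
    # result = t * conjugate(q); the scalar part cancels to zero
    rx = -tw * x + tx * w - ty * z + tz * y
    ry = -tw * y + ty * w - tz * x + tx * z
    rz = -tw * z + tz * w - tx * y + ty * x
    return (rx, ry, rz)

# 90-degree rotation about the z axis: four stored numbers, not nine
half = math.pi / 4
q = (math.cos(half), 0.0, 0.0, math.sin(half))
print(quat_rotate(q, (1.0, 0.0, 0.0)))  # approximately (0, 1, 0)
```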

That sounds right. Doing fewer computations is also generally faster, depending on what they are; several basic calculations can be cheaper than a single complex function. Integer math is much quicker than floating point, and many games are scaled to use integer math internally; every movement is an integral number of quantum movements.
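A hedged sketch of that integer-math scaling (the names and the 1/256-pixel quantum are invented for illustration): positions and velocities are stored as whole numbers of fixed-point quanta, so the per-frame update never touches floating point.

```python
# Fixed-point position updates: store positions and velocities as integers
# counting 1/256ths of a pixel, so per-frame motion is pure integer math.
FRAC_BITS = 8
ONE = 1 << FRAC_BITS  # 256 units per pixel

def to_fixed(x):
    return int(round(x * ONE))

def to_pixels(fx):
    return fx >> FRAC_BITS

pos = to_fixed(10.0)  # start at pixel 10
vel = to_fixed(1.5)   # 1.5 pixels per frame, stored as 384 integer units
for _ in range(4):    # four frames of integer-only updates
    pos += vel
print(to_pixels(pos))  # 10 + 4 * 1.5 = 16
```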

I used three separate x, y, z equations, then two display-x, display-y equations, five total. I tried combining them, but I couldn't get it to work; I was spending too much time on that one feature, so I dropped it.
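A hypothetical reconstruction of that five-equation pipeline (the choice of rotation axis, the viewer distance `d` and the `scale` factor are guesses, not the original code): three equations rotate the point, two project it to the display.

```python
import math

def transform_and_project(x, y, z, angle, d=5.0, scale=100.0):
    """Three rotation equations for x', y', z', then two projection equations."""
    c, s = math.cos(angle), math.sin(angle)
    # rotate about the vertical (y) axis
    xr = c * x + s * z
    yr = y
    zr = -s * x + c * z
    # perspective projection: scale by distance from the viewer
    sx = scale * xr / (zr + d)
    sy = scale * yr / (zr + d)
    return sx, sy

print(transform_and_project(1.0, 1.0, 0.0, 0.0))  # (20.0, 20.0)
```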

The machine this ran on ran at a blazing one MHz or so; may have been one half, may have been two.

 

49 minutes ago, Darth Fluffy said:

The machine this ran on ran at a blazing one MHz or so; may have been one half, may have been two.

Ah, the good old days when men were men and computers were hand cranked. Thank goodness the CMD was invented. :demonicduck:

(That must have been, what, the mid eighties? I remember my first PC had a cool 386SX 16MHz CPU and that was in 1991.)

4 hours ago, The Old Hack said:

(That must have been, what, the mid eighties? I remember my first PC had a cool 386SX 16MHz CPU and that was in 1991.)

'79. Tektronix 4051, running a Motorola 6800 processor. Fairly sweet for its day. Booted in BASIC.

 

4 hours ago, The Old Hack said:

Ah, the good old days when men were men and computers were hand cranked.

Lol, I've done that, too. Sort of. I took surveying in high school, and we had to calculate closures to verify accuracy. It can be done manually, but we were given access to a Monroe electric (electro-mechanical) scientific calculator. It was not hand cranked, it had an electric motor, though it could feasibly have been hand cranked, I think. It was not programmable at all. You could hear the equivalent of the clock cycles; there were maybe a few a second. It appeared to work in base 10.

Does a slide rule count? I've used a few. My first year of college, HP introduced their first handheld scientific calculator, the HP-35. It was several hundred dollars when it was introduced. (That was prior to, or in the early part of, a decade of high inflation, so bump that up to get an idea of the cost.) Even four-function calculators were in the low hundreds. I used my slide rule that whole year.

Hand powered tally machines to facilitate counting used to be a thing. These had a series of numbered wheels and either detents and a stylus or push buttons to increment the count. They would basically add your count and keep track of your total. I had one as a kid; never found it either useful or amusing.

I had a Digi-Comp, which was legitimately hand cranked. While it was indeed programmable, you'd be hard pressed to call it a computer; more of a very limited PLA, albeit with an integral display, such as it was. Cycle time was as fast as you could shove the tab back and forth without breaking the thing, typically a second or more per cycle, or as long as you wanted. It was a three bit machine, and could do exciting things like count up to seven, count down from seven, count up to seven, then recycle, and count down from seven and recycle.

I also acquired a compact abacus at some point. I recall it being well made and easy to use, but not particularly useful to me. I didn't see how the beads helped you do arithmetic faster. Maybe less error prone in that it was essentially a numeric recording device, but if that was the case, I never got the hang of it.

 

22 hours ago, Darth Fluffy said:

Doing fewer computations is also generally faster

Not necessarily when optimizations and parallelism are involved ...

21 hours ago, The Old Hack said:

Thank goodness the CMD was invented.

CMD? Commodore?

16 hours ago, Darth Fluffy said:

I also acquired a compact abacus at some point. I recall it being well made and easy to use, but not particularly useful to me. I didn't see how the beads helped you do arithmetic faster. Maybe less error prone in that it was essentially a numeric recording device, but if that was the case, I never got the hang of it.

Certainly less error prone, and probably even faster if you used it for computing the sum of a lot of numbers, like doing an inventory check in a shop or summing prices. The idea is to use different lines for different positions: the first line is 1, the second line 10, the third line 100 ... see Wikipedia.
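The line-per-position idea maps directly onto digit-wise addition with carries. A small sketch, with invented names (`abacus_add`, `abacus_value`), of keeping a running total the way the beads would:

```python
def abacus_add(rows, n):
    """Add n to a digit-per-row running total; rows[i] is the 10**i row."""
    i = 0
    carry = 0
    while n > 0 or carry:
        if i == len(rows):
            rows.append(0)
        rows[i] += n % 10 + carry
        carry, rows[i] = divmod(rows[i], 10)  # too many beads: carry up a row
        n //= 10
        i += 1

def abacus_value(rows):
    return sum(d * 10 ** i for i, d in enumerate(rows))

rows = [0]
for price in (199, 250, 1099):  # summing prices, abacus style
    abacus_add(rows, price)
print(abacus_value(rows))  # 1548
```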

1 hour ago, hkmaly said:

Not necessarily when optimizations and parallelism are involved ...

Hence "generally". Although I am curious what particular "optimization and parallelism" you have in mind that would not benefit, speed-wise, from fewer calculations? Seems like that would be part of the optimization.

 

1 hour ago, hkmaly said:

Certainly less error prone, and probably even faster if you used it for computing the sum of a lot of numbers, like doing an inventory check in a shop or summing prices. The idea is to use different lines for different positions: the first line is 1, the second line 10, the third line 100 ... see Wikipedia.

I got the mechanics of how it worked. In any case, I have no idea where it is; I probably gave it to one of my kids years ago. Props to you if you enjoy yours; it wasn't for me.

I am prone to add prices as the clerk checks me out at a store, and have caught them in errors (usually it's me misunderstanding the pricing or the way a sale worked, but not always). I haven't needed an abacus to keep up.

 

 

8 hours ago, Darth Fluffy said:

I am prone to add prices as the clerk checks me out at a store, and have caught them in errors (usually it's me misunderstanding the pricing or the way a sale worked, but not always). I haven't needed an abacus to keep up.

The advantages of having grown up in the grade school generation previous to pocket calculators becoming ubiquitous in school! ^_^

5 hours ago, The Old Hack said:

The advantages of having grown up in the grade school generation previous to pocket calculators becoming ubiquitous in school! ^_^

When I was in school calculators were readily available but usually weren't allowed in the classroom. I learned to do arithmetic on paper and in my head, but it was an ordeal (I don't remember what it was called, but I actually was diagnosed with a math-related disability). While I may know the methods, my results are still extremely unreliable (there have even been a few times as an adult I've gotten the wrong number adding single-digit numbers), and I pretty much need to rely on a calculator or computer for any important math.

8 minutes ago, ChronosCat said:

When I was in school calculators were readily available but usually weren't allowed in the classroom. I learned to do arithmetic on paper and in my head, but it was an ordeal (I don't remember what it was called, but I actually was diagnosed with a math-related disability). While I may know the methods, my results are still extremely unreliable (there have even been a few times as an adult I've gotten the wrong number adding single-digit numbers), and I pretty much need to rely on a calculator or computer for any important math.

Absolutely fair! And I am not putting you down. I merely consider being able to do simple arithmetic in my head very beneficial, and I have found that a startling number of younger people simply lack the training to do it, which I feel is sad.

1 minute ago, The Old Hack said:

I merely consider being able to do simple arithmetic in my head very beneficial, and I have found that a startling number of younger people simply lack the training to do it, which I feel is sad.

I remember one day at work when a new waitress proved to be utterly incapable of calculating a guest check on her own.
Later that week, I ran into my 6th grade math teacher and could only think, "Yes, you will need to know this in real life."

16 hours ago, The Old Hack said:
17 hours ago, hkmaly said:

CMD? Commodore?

Cosmetic Morphing Device.

Meant that afterwards, men were no longer men but women. :danshiftyeyes:

Oh. I thought this referred to the computer bit, not the men bit.

15 hours ago, Darth Fluffy said:
17 hours ago, hkmaly said:

Not necessarily when optimizations and parallelism are involved ...

Hence "generally". Although I am curious what particular "optimization and parallelism" you have in mind that would not benefit, speed-wise, from fewer calculations? Seems like that would be part of the optimization.

If you have hardware specifically made to do matrix operations on 4x4 matrices by doing all the operations in parallel, it wouldn't really help you that there is another representation with fewer calculations ...

... and I'm pretty sure GPUs are full of those.
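A concrete sketch (not from the thread) of what such fixed-function hardware evaluates: a 4x4 transform applied to a homogeneous vector. The four row dot products are independent of one another, which is exactly what lets hardware compute them all at once.

```python
def mat4_mul_vec4(m, v):
    """Apply a 4x4 matrix to a homogeneous 4-vector: four independent dot products."""
    return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

# translation by +3 in x, expressed as a 4x4 matrix on homogeneous coordinates
translate = [
    [1, 0, 0, 3],
    [0, 1, 0, 0],
    [0, 0, 1, 0],
    [0, 0, 0, 1],
]
print(mat4_mul_vec4(translate, [1, 2, 3, 1]))  # [4, 2, 3, 1]
```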

15 hours ago, Darth Fluffy said:
17 hours ago, hkmaly said:

Certainly less error prone and probably even faster if you used it for computing sum of lot of numbers, like doing inventory check in shop or summing price. The idea is to use different lines for different positions, like first line is 1, second line 10, third line 100 ... see wikipedia.

I got the mechanics of how it worked. In any case, I have no idea where it is; I probably gave it to one of my kids years ago. Props to you if you enjoy yours; it wasn't for me.

I am prone to add prices as the clerk checks me out at a store, and have caught them in errors (usually it's me misunderstanding the pricing or the way a sale worked, but not always). I haven't needed an abacus to keep up.

Well, I didn't say I enjoyed mine (or that I had any ... although I probably did as a kid). But yeah, I'm not so good at adding prices, so I can easily see how it would help.

1 hour ago, ChronosCat said:
6 hours ago, The Old Hack said:

The advantages of having grown up in the grade school generation previous to pocket calculators becoming ubiquitous in school! ^_^

When I was in school calculators were readily available but usually weren't allowed in the classroom. I learned to do arithmetic on paper and in my head

We were using paper in class, but that's SLOWER.

1 hour ago, Pharaoh RutinTutin said:

Math is not a Science
It is a Religion


That's provably false. And if you attend the right classes (UNIVERSITY classes, of course), they even TEACH you how to derive all of math from just a few axioms.

1 hour ago, The Old Hack said:

Absolutely fair! And I am not putting you down. I merely consider being able to do simple arithmetic in my head very beneficial, and I have found that a startling number of younger people simply lack the training to do it, which I feel is sad.

It's definitely useful, but there is even MORE important stuff math is supposed to teach you: a logical way of thinking. The fact that a lot of people fail at that endangers not just them but the whole world.

 

4 hours ago, ChronosCat said:

When I was in school calculators were readily available but usually weren't allowed in the classroom. I learned to do arithmetic on paper and in my head, but it was an ordeal (I don't remember what it was called, but I actually was diagnosed with a math-related disability). While I may know the methods, my results are still extremely unreliable (there have even been a few times as an adult I've gotten the wrong number adding single-digit numbers), and I pretty much need to rely on a calculator or computer for any important math.

The things you can do on a calculator today with symbolic math were an AI research topic not all that long ago, in the 1980s.

The one I miss is my basic business calculator, with interest functions like future value. There are coarse rules of thumb to approximate those, but no real substitute, and they are difficult and tedious to calculate by hand; inherently iterative.
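To illustrate the "inherently iterative" part (the figures and function names here are invented for the sketch): the future value of a stream of equal deposits is closed-form in the rate, but recovering the rate from a target value has no closed form, so calculators grind it out numerically, e.g. by bisection.

```python
def future_value(pmt, r, n):
    """Value after n periods of depositing pmt each period at per-period rate r."""
    return pmt * ((1 + r) ** n - 1) / r

def solve_rate(pmt, n, target, lo=1e-9, hi=1.0):
    """Find the per-period rate reaching target; bisection on the monotone FV curve."""
    for _ in range(100):
        mid = (lo + hi) / 2
        if future_value(pmt, mid, n) < target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# What monthly rate turns $100/month for 120 months into $20,000?
r = solve_rate(100, 120, 20_000)
print(round(r * 12, 4))  # the equivalent nominal annual rate
```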

2 hours ago, hkmaly said:

... and I'm pretty sure GPUs are full of those.

I agree, if you have a GPU, you write to the GPU. That's not really the same kind of optimization. The calculation optimization and the caching and fetch optimizations take place in the GPU, primarily at design time.

My project was before such were available. Commodore and Atari fielded their GPUs within the time frame of what I was working on; I believe those were the first commercial ones, and they were tied to specific products. The IBM PC had not yet been released.

 

2 hours ago, hkmaly said:

That's provably false. And if you attend the right classes (UNIVERSITY classes, of course), they even TEACH you how to derive all of math from just a few axioms.

If you mean the set theory approach to proving arithmetic, no, just, no. "Let's take the obvious and make it painfully obtuse." Gödel pretty much shot Hilbert down on that, anyway.

Bear in mind, much of nature counts. Those biological systems did not learn to do that by having theorems proven to them.

 

2 hours ago, hkmaly said:

It's definitely useful, but there is even MORE important stuff math is supposed to teach you: a logical way of thinking. The fact that a lot of people fail at that endangers not just them but the whole world.

I'd say that's more true of science as a culture and a method than of math, per se.

 

4 hours ago, Pharaoh RutinTutin said:

Remember
Math is not a Science
It is a Religion


While this is obviously making fun of a notion, I think there's some truth to it. Much of our approach to math is cultural; not every culture has approached math the same way. Even today, we have multiple number systems, all base ten, but differing in how we aggregate the groupings of decades (powers of ten); the Indian convention, for example, writes a hundred thousand as 1,00,000.

 

1 hour ago, Darth Fluffy said:
4 hours ago, hkmaly said:

... and I'm pretty sure GPUs are full of those.

I agree, if you have a GPU, you write to the GPU. That's not really the same kind of optimization. The calculation optimization and the caching and fetch optimizations take place in the GPU, primarily at design time.

My project was before such were available. Commodore and Atari fielded their GPUs within the time frame of what I was working on; I believe those were the first commercial ones, and they were tied to specific products. The IBM PC had not yet been released.

I remember reading that Atari specifically had some surprising kind of optimization in a chip, either for graphics or sound, which later proved hard to emulate because SW emulations were too slow and HW was not compatible.

However, in general, yes: there used to be less specifically optimized HW in that era. Then there was an era when GPUs were able to do some things really fast, but you never got the results back, just on screen. Only relatively recently has the common hardware for normal people had the functionality to use GPUs for computations and get the results back to the CPU.

1 hour ago, Darth Fluffy said:
4 hours ago, hkmaly said:

That's provably false. And if you attend the right classes (UNIVERSITY classes, of course), they even TEACH you how to derive all of math from just a few axioms.

If you mean the set theory approach to proving arithmetic, no, just, no. "Let's take the obvious and make it painfully obtuse." Gödel pretty much shot Hilbert down on that, anyway.

While Peano arithmetic isn't quite obvious, it's NOT set theory, and it's much easier to understand than the set-based definition.
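As a toy illustration (not from the thread) of building arithmetic from a successor operation: zero plus the two recursion equations for addition are enough to compute. The tuple encoding below is an invented convenience, not Peano's notation.

```python
# Peano-style naturals: ZERO, and succ(n) wraps one more layer per successor.
ZERO = None

def succ(n):
    return (n,)

def add(m, n):
    # the two defining equations: m + 0 = m ; m + succ(k) = succ(m + k)
    return m if n is ZERO else succ(add(m, n[0]))

def to_int(n):
    return 0 if n is ZERO else 1 + to_int(n[0])

two = succ(succ(ZERO))
three = succ(two)
print(to_int(add(two, three)))  # 5
```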

1 hour ago, Darth Fluffy said:

Bear in mind, much of nature counts. Those biological systems did not learn to do that by having theorems proven to them.

What exactly in nature counts precisely to numbers higher than 4?

Most computation in nature is done by neural-net-based black boxes capable of approximating trajectories.

1 hour ago, Darth Fluffy said:
4 hours ago, hkmaly said:

It's definitely useful, but there is even MORE important stuff math is supposed to teach you: a logical way of thinking. The fact that a lot of people fail at that endangers not just them but the whole world.

I'd say that's more true of science as a culture and a method than of math, per se.

From the general point of view, yes. But if you look specifically at education, math is the ONLY class before university that teaches logic and deduction. Every other subject, including physics (which is the closest to math of all the sciences taught at school), can be just memorized, and memorizing is a big part of what you need to do in them. At least, that was true in the schools I attended. I've heard that in the US there is much more emphasis on talking ... which, while likely more useful than memorizing, is again NOT logic and deduction.

Most people claiming to hate math in school actually hate to think.

1 hour ago, Darth Fluffy said:

While this is obviously making fun of a notion, I think there's some truth to it. Much of our approach to math is cultural; not every culture has approached math the same way. Even today, we have multiple number systems, all base ten, but differing in how we aggregate the groupings of decades (powers of ten).

That might be true, if you speak specifically about the approach. Still, math is NOT a religion until you get to concepts like the axiom of choice and the continuum hypothesis. Everything can be verified: while the approach determines what you learn and what is skipped, you can independently prove that what you learn is true. In religion, hardly anything can be proven, and authorities routinely disagree on important basic topics.

A completely different culture - say, an extraterrestrial one - is likely to approach math differently, but will still be able to understand our math. They will have a harder time with physics, because some of that is based on experiments they didn't do, and there will be even more things like that in the other sciences. Philosophy and religion? No chance.

On 11/2/2019 at 0:18 AM, hkmaly said:

I remember reading that Atari specifically had some surprising kind of optimization in a chip, either for graphics or sound, which later proved hard to emulate because SW emulations were too slow and HW was not compatible.

Both Atari and Commodore used a single chip for video with limited sprites and enhanced sound. Contrast this with the slightly older (by a couple of years) Apple II, where the processor did all the work (including running every function of the disk drive, if you had one). The disk thing was weird: some of the copy protection schemes wrote spiral tracks on Apple II disks, because they could.

The landscape was proprietary anarchy, with pockets of compatibility, like the 8080- and Z80-based CP/M-80 systems, which were at least somewhat software compatible, if you could transfer the files. There was no standard disk format, although at least there were only a few sizes, 8" and 5 1/4", and only the 5 1/4" soft-sectored disks were popular with home users.

The available machines were comparable in speed, so emulating each other in depth was not a thing, but spoofing specific functions, such as reading each other's disks, was. Your current machine, which runs approximately three orders of magnitude faster than machines of that era, would have no problem emulating them. I believe you can find a C64 emulator; if not now, I know you could have several years ago.

 

On 11/2/2019 at 0:18 AM, hkmaly said:

However, in general, yes: there used to be less specifically optimized HW in that era. Then there was an era when GPUs were able to do some things really fast, but you never got the results back, just on screen. Only relatively recently has the common hardware for normal people had the functionality to use GPUs for computations and get the results back to the CPU.

It's been a little while. I recall reading about a supercomputer built out of many linked PlayStations. I believe Bitcoin miners use machine arrays like that.

 

On 11/2/2019 at 0:18 AM, hkmaly said:

While Peano arithmetic isn't quite obvious, it's NOT set theory, and it's much easier to understand than the set-based definition.

I would highly recommend this book.

 

On 11/2/2019 at 0:18 AM, hkmaly said:

What exactly in nature counts precisely to numbers higher than 4?

Why is four a magic threshold? Also, I can't answer precisely, not even nearly; I am not that well versed on who can do what. I can tell you that infant chimps have been tested and can visually recognize quantities into the teens; they perform much better than their human peers at that age, and it is questionable whether they are counting. Crows (and other corvids) can keep track of numbers of objects into the decades, at least thirty.

 

On 11/2/2019 at 0:18 AM, hkmaly said:

Most computation in nature is done by neural-net-based black boxes capable of approximating trajectories.

"Most", or "most that you are aware of"? Our senses appear to be threshold-based; whether you see light or not appears to be based on how many sensors fire. Whether you taste a substance is based on how many taste buds fire. From an engineering standpoint, this makes sense: filter out false positives. But I'd dare say, if you were to investigate, I'm pretty sure this basic model holds for things that do not have nerves. Plants and their tropisms, for example.

At an even more fundamental level, the stuff we are made of is essentially numeric; examine a periodic table.

 

On 11/2/2019 at 0:18 AM, hkmaly said:

From the general point of view, yes. But if you look specifically at education, math is the ONLY class before university that teaches logic and deduction. Every other subject, including physics (which is the closest to math of all the sciences taught at school), can be just memorized, and memorizing is a big part of what you need to do in them. At least, that was true in the schools I attended. I've heard that in the US there is much more emphasis on talking ... which, while likely more useful than memorizing, is again NOT logic and deduction.

Math teaches a formalized system of reasoning and deduction that pertains to math. It does not teach rhetorical logic and deduction as, for instance, a lawyer might need. Not dissing math; it's just not complete.

(Although, adding ad hominem arguments to math might make the classes more exciting.)

 

On 11/2/2019 at 0:18 AM, hkmaly said:

Most people claiming to hate math in school actually hate to think.

That sounds like a rather omniscient proclamation. What do you base it on?

I know a lady who edits sci fi for Baen Books. She is a good conversationalist and tends to be well versed. She says she "can't do math". I know she can think; it seems more phobic than an unwillingness to put forth effort. My hypothesis is "bad experience in school".

I almost had this myself; it might be more toward your point. I had trouble remembering the times tables because I did not want to put effort into memorizing. This could have gone bad two ways: either I was left to my own devices and would have struggled, or my Dad, hearing about it in a conference with my teacher, would have done his usual and belted me, ruining math for a lifetime. Fortunately for me, my teacher must have guided him, because he became uncharacteristically involved, got flash cards, and worked with me to get me past this hump.

In any case, I would highly recommend this book.

 

On 11/2/2019 at 0:18 AM, hkmaly said:

That might be true, if you speak specifically about the approach. Still, math is NOT a religion until you get to concepts like the axiom of choice and the continuum hypothesis. Everything can be verified: while the approach determines what you learn and what is skipped, you can independently prove that what you learn is true. In religion, hardly anything can be proven, and authorities routinely disagree on important basic topics.

A completely different culture - say, an extraterrestrial one - is likely to approach math differently, but will still be able to understand our math. They will have a harder time with physics, because some of that is based on experiments they didn't do, and there will be even more things like that in the other sciences. Philosophy and religion? No chance.

Re: "Everything can be verified": that's true of neither math nor science, nor of fields derived from them. Hence the half-life of knowledge, which tends to be shorter than you would expect. Also, it takes a lot of effort to figure out what you don't know. For math, as we've already discussed, there are more numbers than every being in the universe (or multiple universes) could identify in the life of the universe(s). We experience very few. We see some needles and miss the haystack; the essentially imperceptible haystack.

Learning is evolutionary, both for individuals and for groups and cultures. Cherished ideas are disproven and replaced by more robust ideas. The truisms you learn in grade school are replaced by more detailed models as you mature in intellectual capacity. For example, I learned in school that there are three states of matter. Not even close. No cigar, teacher. Even the definitions are fuzzy.

I was initially flabbergasted listening to a Neil deGrasse Tyson video, where he stated that we know what (some small fraction that I do not recall exactly, around 10%-ish) of the universe is. Mostly we have placeholder words for stuff we have little clue about. And we have no idea how big it is, how much it goes beyond the visible horizon (which, while continually expanding as more distant light arrives, appears to be shrinking, as the objects at the horizon depart).

I am not saying we should not pursue that knowledge. Indeed, knowing there are things to uncover makes me hungry to find out more.

Re: "Completely different culture - say, extraterrestrial - is likely to approach math differently, but will still be able to understand our math."

That is a sci fi trope. I think there's some basic truth to it, that would be the way to proceed logically, but our track record of communicating with species in our own back yard does not bode well for SETI.

Re: "Religion" - I distinguish between personal belief structure and cultural belief systems. The latter tend to be self serving. (I say tend, but honestly, I can't think of an exception.) The former tends to get shoved aside by the latter. Still, I see your point, the supernatural is intrinsically unverifiable.

Much of math is the same way. Granted, the boundary is creeping outward, but we would not have the Fields Medal if everything were easily provable.

 


20 minutes ago, Darth Fluffy said:

That is a sci fi trope. I think there's some basic truth to it, that would be the way to proceed logically, but our track record of communicating with species in our own back yard does not bode well for SETI.

I did read one sci fi story where they found ruins of an alien civilisation and had no idea of how to decode their script. Then they found a Rosetta stone of sorts. Something that turned out to be a periodic table of the elements. By the end of the story they had managed to get some basic idea of how their script worked.

34 minutes ago, Darth Fluffy said:

I had trouble remembering the times tables because I did not want to put effort into memorizing.

They never figured out that I had memorized only a small portion of the times table. And I usually got top grades in math (until I got to calculus in college, taught by a professor who couldn't speak English). The portion of the times table that I didn't memorize, I could figure out in my head quickly enough to fool them.


I had an older teacher in third grade who openly lamented in class that she could not have us repeatedly recite rote facts, because that was not what education was supposed to be in the 1970s.

Today, I see the ability to recite lists like the Kings of England, the Presidents of the United States, or the leading exports of Australia that are not shipped by air or water largely as an exercise in trivia.

Of course, if we did not have instant access to this kind of information at our fingertips regardless of current location, the situation would be different.

On 11/2/2019 at 9:36 PM, Darth Fluffy said:
On 11/2/2019 at 5:18 AM, hkmaly said:

I remember reading that Atari specifically had some surprising kind of optimization in a chip, either for graphics or sound, which later proved hard to emulate because SW emulations were too slow and HW was not compatible.

Both Atari and Commodore used a single chip for video with limited sprites and enhanced sound. Contrast this with the slightly older (by a couple of years) Apple II, where the processor did all the work (including running every function of the disk drive, if you had one). The disk thing was weird: some of the copy protection schemes wrote spiral tracks on Apple II disks, because they could.

The landscape was proprietary anarchy, with pockets of compatibility, like the 8080- and Z80-based CP/M-80 systems, which were at least somewhat software compatible, if you could transfer the files. There was no standard disk format, although at least there were only a few sizes, 8" and 5 1/4", and only the 5 1/4" soft-sectored disks were popular with home users.

The available machines were comparable in speed, so emulating each other in depth was not a thing, but spoofing specific functions, such as reading each other's disks, was. Your current machine, which runs approximately three orders of magnitude faster than machines of that era, would have no problem emulating them. I believe you can find a C64 emulator; if not now, I know you could have several years ago.

I'm not sure WHEN I read about the problems with emulation. It's possible it was at a time when the machines were "just" two orders of magnitude faster and that wasn't enough. I do know that emulating an 8MHz Game Boy in real time on an 80486 (which had 66MHz or maybe 100MHz) was hard enough that the CPU didn't have power left to double the screen size. Of course, now it's not a problem: first, the computers are faster; second, they don't need the CPU to double the screen size :)

On 11/2/2019 at 9:36 PM, Darth Fluffy said:
On 11/2/2019 at 5:18 AM, hkmaly said:

Only relatively recently has the common hardware for normal people had the functionality to use GPUs for computations and get the results back to the CPU.

It's been a little while. I recall reading about a supercomputer built out of many linked PlayStations. I believe Bitcoin miners use machine arrays like that.

PlayStation or PlayStation 2? I think it was the PlayStation 2 that was popular like this ... however, it's true that even that was out in 2000. Maybe it has been a little while ... or maybe the time I had in mind was when the ability arrived in specifically the middle-range video cards for PCs, although even that is probably more than 5 years ago.

Bitcoin miners IMHO use relatively normal machines with "just" four video cards.

On 11/2/2019 at 9:36 PM, Darth Fluffy said:
On 11/2/2019 at 5:18 AM, hkmaly said:

What exactly in nature counts to numbers higher than 4 precisely?

Why is four a magic threshold? Also, I can't answer precisely, not even nearly; I am not that well versed in who can do what. I can tell you that infant chimps have been tested and can visually recognize quantities into the teens; they perform much better than their human peers at that age, and it is questionable that they are counting. Crows (and other corvids) can keep track of numbers of objects into the decades, at least thirty.

Four legs, duh :)

... unfortunately, there is a rock band called Counting Crows, so I wasn't able to find details on this ... I found this about counting, but without saying what the maximal number is.

Anyway, it seems I underestimated the animals with my "4" threshold, but I still think that the threshold is lower than what we aim to teach our children ...

On 11/2/2019 at 9:36 PM, Darth Fluffy said:
On 11/2/2019 at 5:18 AM, hkmaly said:

While Peano's arithmetic isn't quite obvious, it's NOT set theory and it's much easier to understand than the set-based definition.

I would highly recommend this book.

... although I think we SHOULD start with a simpler approach, at least until 4th grade or maybe (what's the US term?) middle/junior high school.

On 11/2/2019 at 9:36 PM, Darth Fluffy said:
On 11/2/2019 at 5:18 AM, hkmaly said:

Most of computation in nature are neural-net based blackboxes capable of approximating trajectories.

"Most", or "most that you are aware of"? Our senses appear to be threshold based; whether you see light or not appears to be based on how many sensors fire. Whether you taste a substance is based on how many taste buds fire. From an engineering standpoint, this makes sense; filter out false positives. But I'd dare say, if you were to investigate, this basic model holds even for things that do not have nerves. Plants and their tropisms, for example.

Most that fill engineers with envy, at least :). The way both people and animals are capable of predicting the trajectory of an object, say a falling ball, based on limited input data is not easy to replicate with modern computers.

Note that I was talking about calculations. Obviously, the threshold-based stuff can be understood in terms of calculations, but it has a different purpose.

On 11/2/2019 at 9:36 PM, Darth Fluffy said:

At an even more fundamental level, the stuff we are made of is essentially numeric; examine a periodic table.

The stuff we are made of is essentially probability waves. Also, the periodic table is not "exact" computing: even with light stuff like hydrogen, carbon or oxygen, you don't get pure arithmetic multiplication, because there are atoms of 14C among the 12C.

On 11/2/2019 at 9:36 PM, Darth Fluffy said:

Math teaches a formalized system of reasoning and deduction that pertains to math. It does not teach rhetorical logic and deduction as, for instance, a lawyer might need. Not dissing math; it's just not complete.

(Although, adding ad hominem arguments to math might make the classes more exciting.)

It's not complete ... however, which other SUBJECT in elementary/high school teaches logic? It's not that they couldn't. You could teach history by explaining the strategic decisions rulers made, you could explain Greek philosophy along with their poetry, I already mentioned physics ... It's that they don't.

On 11/2/2019 at 9:36 PM, Darth Fluffy said:
On 11/2/2019 at 5:18 AM, hkmaly said:

Most people claiming to hate math in school actually hate to think.

That sounds like a rather omniscient proclamation. What do you base it on?

Experience.

OK, I don't have actual statistical data on it.

On 11/2/2019 at 9:36 PM, Darth Fluffy said:

I know a lady who edits sci fi for Baen Books. She is a good conversationalist and tends to be well versed. She says she "can't do math". I know she can think; it seems more phobic than unwilling to put forth effort. My hypothesis is "bad experience in school".

I almost had this myself, which might be more toward your point. I had trouble remembering the times tables because I did not want to put effort into memorizing. This could have gone bad two ways: either I was left to my own devices and would have struggled, or my Dad, hearing about it in a conference with my teacher, would have done his usual and belted me, ruining math for a lifetime. Fortunately for me, my teacher must have guided him, because he became uncharacteristically involved, got flash cards, and worked with me to get me past this hump.

In any case, I would highly recommend this book.

There are definitely people like that lady too, but unless U.S. schools are worse than I thought, I don't believe they form a majority.

... although maybe if I read that book I would find that U.S. schools ARE worse than I thought.

On 11/2/2019 at 9:36 PM, Darth Fluffy said:

Re: "Everything can be verified:" That's neither true for math nor science, nor from fields derived from them. Hence the half life of knowledge, which tends to be shorter than you would expect.

It doesn't hold in physics and other sciences, that's true. And even in math there are things which, while true, are later declared "not as important as thought". However, you may notice that while most of what Aristotle said about physics is obsolete, the Pythagorean theorem was already known to the Babylonians and is still true.

Seriously: most of a university math education is taught in terms of statements and how to prove they're true. The fact that mathematicians are the only ones who actually care about the difference between what's true and what can be proved also speaks volumes.

On 11/2/2019 at 9:36 PM, Darth Fluffy said:

However, for math, we've already discussed, there are more numbers than what every being in the universe (or multiple universes) could identify in the life of the universe(s). We experience very few. We see some needles and miss the haystack; the essentially imperceptible haystack.

We don't NEED to experience all numbers. We KNOW how they will behave.

On 11/2/2019 at 9:36 PM, Darth Fluffy said:

Re: "Completely different culture - say, extraterrestrial - is likely to approach math differently, but will still be able to understand our math."

That is a sci fi trope. I think there's some basic truth to it, that would be the way to proceed logically, but our track record of communicating with species in our own back yard does not bode well for SETI.

Well, the species in our own back yard seem not to want to make the effort of creating a sum of their own knowledge surpassing the memory capacity of an individual. You can't really build a civilization without that, unless you have individuals with MUCH better memory than humans have.

On 11/2/2019 at 9:36 PM, Darth Fluffy said:

Re: "Religion" - I distinguish between personal belief structure and cultural belief systems. The latter tend to be self serving. (I say tend, but honestly, I can't think of an exception.) The former tends to get shoved aside by the latter. Still, I see your point, the supernatural is intrinsically unverifiable.

... I guess personal belief is technically religion, but yes, I was thinking more about organized religions ...

On 11/2/2019 at 9:36 PM, Darth Fluffy said:

Much of math is the same way. Granted, the boundary is creeping outward, but we would not have a Fields Medal if everything was easily provable.

Might depend on how you measure "much", obviously, but there is a big core where everything is provable. Sure, you don't get prizes for the core: those are for the boundaries, and often specifically for PROVING something which is already suspected to be true.

On 11/2/2019 at 10:08 PM, The Old Hack said:
On 11/2/2019 at 9:36 PM, Darth Fluffy said:

That is a sci fi trope. I think there's some basic truth to it, that would be the way to proceed logically, but our track record of communicating with species in our own back yard does not bode well for SETI.

I did read one sci fi story where they found ruins of an alien civilisation and had no idea of how to decode their script. Then they found a Rosetta stone of sorts. Something that turned out to be a periodic table of the elements. By the end of the story they had managed to get some basic idea of how their script worked.

I think what they needed (and got) was basically next step after math.

On 11/2/2019 at 11:39 PM, Pharaoh RutinTutin said:

I had an older teacher in third grade who openly lamented in class that she could not have us repeatedly recite rote facts because that was not what education was supposed to be in the 1970s

Today, I see the ability to recite lists like the Kings of England, the Presidents of the United States, or the leading exports of Australia that are not shipped by air or water largely as an exercise in trivia

More like uselessness.

... wait, WHAT is exported from Australia not by air or water and how? Did someone build a tunnel between Australia and U.S.?

On 11/2/2019 at 11:39 PM, Pharaoh RutinTutin said:

Of course, if we did not have instant access to this kind of information at our fingertips regardless of current location, the situation would be different

Not that much. There were encyclopedias before computers. And most of that information is very rarely useful in practical life, so it doesn't matter that hardly any student will remember it longer than he needs to for the exams.

 

2 minutes ago, hkmaly said:

I think what they needed (and got) was basically next step after math.

That makes sense to me. Math describes how the Universe works. The Periodic Table is part of the fundamentals of how matter as we understand it works.

I am trying to think of similar examples but so far I am not coming up with much. One sci fi author suggested calendars. In his case he presupposed two extremely similar species that both had a basic comprehension of astronomy and a diurnal way of life as well as home planets with seasonal weather. Then it was a question of knowing the planet's rotational speed on its own axis as well as the time it took to revolve once around its sun.

9 hours ago, hkmaly said:

... wait, WHAT is exported from Australia not by air or water and how? Did someone build a tunnel between Australia and U.S.?

Digital exports?

8 hours ago, hkmaly said:

Four legs, duh :)

... unfortunately, there is a rock band called Counting Crows, so I wasn't able to find details on this ... I found this about counting, but without saying what the maximal number is.

Anyway, it seems I underestimated the animals with my "4" threshold, but I still think that the threshold is lower than what we aim to teach our children ...

We've gone beyond the needs of hunting and gathering. Even as shepherds and farmers, even fabricating simple articles like clothing and binding stones to sticks for weapons, our need to count grew.

The name of the band is a reference to the notion. (Flip side: I passed a LARGE flock of crows or ravens while driving past the parking lot of the New Mexico Fair Grounds in Albuquerque; I did not notice them at first, as they blended in with the asphalt. They would have been difficult to count; they were just a sea of feathery blackness, and it was difficult to spot individual crows.)

In your link, the ability of spiders to count was astounding. They do not look like they have very much cranial horsepower, yet they keep turning up with respect to intellectual feats. I read an article in the (US) National Geographic about a species, Portia, that hunts other spiders. What was interesting in particular is that it used a different strategy for different targets. Even being able to link an image (it has good eyesight for its size) to the correct entry in a table of strategies is impressive; this is a very small animal.

Re: "Four legs, duh :)" - Yes, centipedes are the brain trust.

 

8 hours ago, hkmaly said:

... although I think we SHOULD start with a simpler approach, at least until 4th grade or maybe (what's the US term?) middle/junior high school.

That is the gist of what I recall from the book. The rigorous approach has value to the professional; those who just need the mechanics can find it daunting at an early age.

 

8 hours ago, hkmaly said:

Note that I was talking about calculations.

Note that I wasn't. I said "counting".

 

8 hours ago, hkmaly said:

The stuff we are made of is essentially probability waves.

At a more fundamental level, yes, but at the scale of discussion, the pieces are objects, Legos, so to speak.

 

8 hours ago, hkmaly said:

Also, the periodic table is not "exact" computing: even with light stuff like hydrogen, carbon or oxygen, you don't get pure arithmetic multiplication, because there are atoms of 14C among the 12C.

The differences in binding energy keep chemistry largely distinct from nuclear physics, but the emphasis on counting to distinguish between species is similar. Yes, there are other factors, like how tight the boxes are; your X may become Y + e at a certain rate. That doesn't invalidate the observation.
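To put a number on the 14C-among-12C point: the atomic weight in the periodic table is an abundance-weighted average over isotopes, not a pure count. A quick sketch, where the abundance figures are approximate textbook values, not anything from this thread:

```python
# Abundance-weighted atomic mass of carbon (approximate natural abundances).
# 14C is present only in trace amounts, so it barely moves the average;
# 13C is the isotope that actually shifts it away from a whole number.

carbon_isotopes = [
    (12.000, 0.9893),   # 12C: mass (u), natural abundance
    (13.003, 0.0107),   # 13C
    # 14C abundance is on the order of 1e-12 -- negligible for the average
]

atomic_mass = sum(mass * abundance for mass, abundance in carbon_isotopes)
print(round(atomic_mass, 3))  # close to the tabulated ~12.011 u
```

So the table's 12.011 for carbon is itself a reminder that nature doesn't hand us clean integers.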

 

8 hours ago, hkmaly said:

It's not complete ... however, which other SUBJECT in elementary/high school teaches logic? It's not that they couldn't. You could teach history by explaining the strategic decisions rulers made, you could explain Greek philosophy along with their poetry, I already mentioned physics ... It's that they don't.

There are definitely people like that lady too, but unless U.S. schools are worse than I thought, I don't believe they form a majority.

... although maybe if I read that book I would find that U.S. schools ARE worse than I thought.

I don't think you can teach history well without covering rationale, lest you end up making your students memorize the dry, pointless lists that have been alluded to. The list of US presidents, for example: dry and pointless, unless you incorporate what each man stood for, how it fit into the context of the day, what it led to, and such. Decades of compromise over slavery did not preclude eventual conflict; the same kind of elitist a-holes who caused the Great Depression may be doing it again ...

My (US) teachers gave generally sufficient rationale, with little being presented as just rote, although there was some of that. Also, I asked questions and did outside reading.

Are US schools worse? I've heard this, but I haven't seen a lot of evidence, although we are pathetic at languages; less driving need. I chalk it up to "Every card's a winner, and every card's a loser." (Those aren't the correct lyrics; the song goes, "Every hand's a winner, and every hand's a loser." That is simply not true; in any game there are good hands and bad hands. It is more true of the cards: the card you need is a winner, and it can be any other card.)

Some of our more notable efforts to improve our schools backfire. No Child Left Behind has schools narrowing their focus to teaching only material that will be on the evaluation test. It is an economic necessity to avoid a death spiral of becoming underfunded and having even more funds cut in the future as performance falls. Deming is spinning in his grave.

The aforementioned book was a response to The New Math, a more rigorous approach to elementary math.

 

8 hours ago, hkmaly said:

It doesn't hold in physics and other sciences, that's true. And even in math there are things which, while true, are later declared "not as important as thought". However, you may notice that while most of what Aristotle said about physics is obsolete, the Pythagorean theorem was already known to the Babylonians and is still true.

Or the other way around: how so much way-out-there abstract math tends to end up having real practical application.

 

8 hours ago, hkmaly said:

Seriously: most of a university math education is taught in terms of statements and how to prove they're true. The fact that mathematicians are the only ones who actually care about the difference between what's true and what can be proved also speaks volumes.

That has not exactly been my experience; as an engineer, I was sometimes introduced to a new math topic in the course of an engineering application. However, it was less satisfying; it helped to learn it again in a more rigorous setting.

Another surprising thing in that regard was that often the math department professors were out of touch with their topic beyond the abstract. I asked our diffy q professor, who taught his topic very well, about the applicability of something he was teaching, and it wasn't just that he drew a blank; it was more like my words were foreign.

I found that the best math courses that I took at a higher level were rigorous, but taught by an engineer.

 

8 hours ago, hkmaly said:

We don't NEED to experience all numbers. We KNOW how they will behave.

Fair enough in many cases; we will never see more than a tiny sliver of renderings of the Mandelbrot set. We know pretty much what it all looks like.

Sometimes that's misleading. We sample the low end of things and draw global conclusions. 2, 3, 5, 7, ... I guess all odd numbers are prime. ... 9, nope, 11, 13, ... with occasional exceptions ....
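That small-sample trap is easy to make concrete; a throwaway check, purely illustrative:

```python
# "Sample the low end and generalize": which small odd numbers are prime?

def is_prime(n):
    if n < 2:
        return False
    # trial division up to sqrt(n) is plenty for numbers this small
    return all(n % d for d in range(2, int(n ** 0.5) + 1))

odds = list(range(3, 16, 2))
primes = [n for n in odds if is_prime(n)]
composites = [n for n in odds if not is_prime(n)]
print(primes)      # [3, 5, 7, 11, 13]
print(composites)  # [9, 15] -- the "occasional exceptions"
```

Five primes to two "exceptions" in that range; sample a bigger range and the conjecture collapses, which is the whole point.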

 

8 hours ago, hkmaly said:

Well, the species in our own back yard seem not to want to make the effort of creating a sum of their own knowledge surpassing the memory capacity of an individual. You can't really build a civilization without that, unless you have individuals with MUCH better memory than humans have.

“For instance, on the planet Earth, man had always assumed that he was more intelligent than dolphins because he had achieved so much—the wheel, New York, wars and so on—whilst all the dolphins had ever done was muck about in the water having a good time. Curiously, the dolphins had always believed that they were far more intelligent than man—for precisely the same reasons.”

 

8 hours ago, hkmaly said:

... I guess personal belief is technically religion but yes I was thinking more about organized religions ...

I'm guessing that's an oxymoron. Maybe Scientology, they seem sufficiently Machiavellian and monolithic to qualify.

 

8 hours ago, hkmaly said:

Might depend on how you measure "much", obviously, but there is a big core where everything is provable. Sure, you don't get prizes for the core: those are for the boundaries, and often specifically for PROVING something which is already suspected to be true.

Calling out a similar concept to the hugeness of numbers; there's always way more that you don't know than that you do know. You know, sufficiently advanced, blah, blah, blah, magic.

I agree though, that we have a nice, functional core.

 

8 hours ago, hkmaly said:

... wait, WHAT is exported from Australia not by air or water and how? Did someone build a tunnel between Australia and U.S.?

Well, actually, Bugs Bunny does stuff like this; it generally involves routing through Albuquerque and missing his turn. Having lived there, I did not notice anything that would imply a shortcut to other parts of the globe. <shrug>

 

8 hours ago, hkmaly said:

Not that much. There were encyclopedia before computers. And most of that information is very rarely useful in practical life, so it doesn't matter that hardly any student will remember it longer than he needs for the exams.

Yes, but Oh, God, research is soooo much easier and quicker today.

 

