This post confuses me. Why would code be simpler than the math notation? Both involve symbolic abstraction of basically the same complexity.
It's got to be a relatively small group that knows enough to understand loops and is also afraid of math symbols.
Hi, I'm the problem. It's me.
Maybe not so small?
I never encountered these math symbols, but for loops are like step 3 in any programming language, after variables and conditionals.
lol, like 2.5% of the USA are programmers, and even if we say twice that number have experimented and taken programming classes, that's like 1 in 20 people who would ever have encountered a for loop. This NSF report says ~70% of high schoolers have taken Algebra 2 or a more advanced math course, which is when sum notation is usually introduced. I think 70% is a little greater than 5%!
That's interesting to hear; somehow my Algebra 2 skipped sum notation (and it wasn't remedially covered in subsequent math classes), but I've been writing code for decades now, and seeing it in code totally explains the sum notation for me.
I'm in that group I think. I do like a liiitle bit of coding in some tiny specific programming language in one piece of software that I use. I understand the basics but try to avoid having to do it. But while code is a little scary to me, math is much scarier lol
I believe this group could be bigger than some may think. I, and the team I work with, work with for loops similar to these on a regular basis. And only one of us has a bachelor's degree in math. The rest of us don't really understand the math unless it is applied.
I'm in this group and I don't like it
Not really sure if this answers your question (I agree with you, ultimately), but here’s my experience:
At the college I attended, these sigma/pi expressions weren’t taught until the end of Calculus 2, but I wanted to take an Algorithms class - which had calc 2 as a prerequisite.
I got an exception from my advisor which allowed me to take Algorithms before the pre-req. In my experience, these concepts were easily learned in the context of algorithmic complexity.
Some might be barred from learning important theory in computer science by “brutal” math classes at university. They might find solace in this post which translates sigma into ‘for’
i hate that we all got so frightened about math. it's genuinely fun to learn how it works when you're not being forced to in a school setting, which was just a fucking nightmare for no reason. i had this former navy DI lady teacher in gifted kid algebra [so already a year ahead] yell at me for asking questions; she wasn't going to 'hold my hand' thru the homework, which was quite literally her fucking job
Turning 35 in a month and I've just started learning maths again after being afraid of it because of a similar situation to yours.
When you study CompSci (depending on where, I guess) you tend to see them that way when trying to mathematically prove something about an algorithm. It's only really a good way of thinking if you're into coding, though; I don't think a teacher in a non-coding-related algebra class should show this, as it can be really confusing for some people.
I liked this so much I tried to find more. A few seconds googling turned up a lot, but this is the first hit: https://amitness.com/2019/08/math-for-programmers/
People who are arguing that one way of expressing these concepts is easier to learn/understand than the other are missing the whole point. Mathematical notation was not designed to teach students how to do math or explain how to design algorithms. It was invented to communicate precise, abstract ideas concisely between mathematicians who already understand what the symbols mean.
Mathematicians require a notation that has the flexibility to manipulate mathematical objects/symbols in a way that naturally emphasizes their properties and relationships. Often they don't even care whether the objects they're studying are even computable or have a numerical representation. They just need them to have certain properties so that they can be manipulated appropriately.
Discrete sums are a rare example of when the mathematical notation overlaps with the description of an algorithm for computing its value (and the overlap is not even complete; infinite sums are easily represented in math notation but are practically uncomputable when implemented naively). Every other advanced mathematical concept puts a premium on ease of symbol manipulation over computability: integrals, derivatives, matrix multiplication, abstract algebra, etc.
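To make that concrete (a quick sketch of my own, using k² as the summand, not anything from the original post):

```python
# Finite sum: sum_{k=1}^{n} k^2 maps directly onto a loop.
def finite_sum(n):
    total = 0
    for k in range(1, n + 1):  # range stops before n + 1, so k runs 1..n inclusive
        total += k * k
    return total

# Infinite sum: sum_{k=1}^{inf} 1/k^2 is easy to *write* in math notation,
# but the naive loop never terminates. A practical workaround is truncation:
# stop once the terms drop below a tolerance (reasonable here since the terms
# shrink toward 0).
def truncated_sum(tol=1e-12):
    total, k = 0.0, 1
    while True:
        term = 1.0 / (k * k)
        if term < tol:
            return total  # approximates pi^2 / 6 ≈ 1.6449
        total += term
        k += 1
```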
TL;DR math notation is complex because its intended audience is people who already understand it, want maximum flexibility of symbol manipulation, and historically didn't really care about practical computation.
You're right that the symbols weren't created so students could learn them, but students have to learn them at some point, and for me personally, as a student who knows how to program, figuring out that these symbols kind of represent for loops made them easier to understand.
These scary large math symbols aren't scary at all and are easily explained. The scary parts of maths lie elsewhere: they are discrete, nonlinear, or high-dimensional, and sometimes even the numbers are complex... or worse.
Quaternions are the closest you'll ever get to lovecraftian horror in real life.
Yea, that's not explained better than a math teacher would. They just swapped notation common in math for notation common in one specific programming language. It's only easier for an audience that happens to be familiar with programming in general, and that language in particular.
one specific programming language
I think you'd be hard pressed to find someone with any sort of programming background, even just as a hobbyist, who doesn't understand that for loop notation, whether or not they know the specific language it's from. (I couldn't even tell you what specific language that's from, because that notation matches so many different ones.)
I have a 15 year old son; he definitely has not seen summation in math classes yet, but he has far more than enough programming experience (even just from school) to understand the for loop.
I have no idea what these math things are but I understand the code perfectly lol
Maybe I'm crazy, but they did teach me this in school: "This means do this operation until conditions are met."
Just a notational difference, aside from the presence of mutation.
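A sketch of what I mean by mutation (my own example, not from the post): the loop updates an accumulator variable in place, while a fold or sum() expresses the same total with no visible mutation.

```python
from functools import reduce

# With mutation: an accumulator variable is updated in place on each pass
total = 0
for k in range(1, 6):
    total += k

# Without visible mutation: fold the same operation over the range
assert total == reduce(lambda acc, k: acc + k, range(1, 6), 0) == sum(range(1, 6))
```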
How is it harder to understand what 3 + 6 + 9 + ... + 3n means compared to the for loop? Is repeated addition hard to grasp?
No it's not harder to grasp, just less concise. Summation and Product notation exist for the same reason we don't say "a discernible but subtle level of humidity" and just use "moist" instead - it's more convenient. People can be taught to readily understand "moist" or the summation notation. It's much harder to teach people to read the longer notation more quickly.
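To put the "moist" point in code (my own sketch, using the 3 + 6 + ... + 3n sum from the comment above):

```python
n = 5

# Longhand: 3 + 6 + 9 + ... + 3n, one step at a time
total = 0
for k in range(1, n + 1):
    total += 3 * k

# Shorthand: the programming analogue of the sigma, sum_{k=1}^{n} 3k
assert total == sum(3 * k for k in range(1, n + 1))  # both give 45 for n = 5
```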
Wow, this is by far the clearest I've ever seen this explained.
Fuck! I'm 40 and this is the first time I understand the sigma sign!! Thank you!
Couldn't they just have shown this to me in 7th grade or something, when I'd already learned Pascal?
The hard part of math isn't understanding esoteric symbols; it's the theory behind it and its application. Number theory will mind-break almost all people.
While I acknowledge that I had some pretty awful math teachers, I would like to add that explaining math concepts in an edited video that you could spend a lot of time making has different demands than babysitting/teaching 30+ students at different levels multiple times a day with little prep time.
Also the viewers are actively looking for that content
Ok but this is a bit of an unfair comparison given that Freya is pretty god tier at actually explaining math things.
You can reduce this readable code into one line of confusing Python list comprehension that runs 100x slower!
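Something like this (my own example, not the code from the post, and I haven't benchmarked the speed claim):

```python
# The readable version: nested loops with a clear filter
pairs = []
for i in range(10):
    for j in range(10):
        if (i + j) % 2 == 0:
            pairs.append((i, j))

# The same logic squeezed into one line; terser, but harder to read at a glance
pairs_oneliner = [(i, j) for i in range(10) for j in range(10) if (i + j) % 2 == 0]

assert pairs == pairs_oneliner
```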
Than what?
I remember how confused I was when I first encountered i = i + 1... like, what 🤨? How can this be correct, this thing has to be wrong... and then you start seeing the logic behind it and you're like "oooh, yeah, that seems to work... but still, this is wrong on almost every level in math"... and then you grow a bit older and realize that coding has nothing to do with math, instead it's got everything to do with problem solving. If you like to name your variables peach, grape, c*nt, you can, and if that helps you solve the problem, even better, just make it work, i.e. solve the problem 🤷.
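For anyone else who hit that wall, here's the distinction in a tiny sketch of my own:

```python
i = 5      # '=' here is assignment, not a mathematical equation
i = i + 1  # read the current value of i (5), add 1, store the result back in i
print(i)   # 6 -- a state update, not a contradiction
```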
and then you grow a bit older and realize that coding has nothing to do with math, instead it's got everything to do with problem solving.
Wait until you realize what math is all about
In a way I always thought coding was more intuitive than maths writing norms. That is, if you speak English. If not, it's just as daunting as the weird Greek symbols.
The biggest difference (other than the existence of infinity) is that the upper limit is inclusive in summation notation and exclusive in for loops. Threw me for a loop (hah) for a while.
Nah, look at the implementation above: n <= 4 means it's inclusive. You're probably referring to some other implementation that doesn't involve such fine control, like Python, where range(4) means [0, 1, 2, 3].
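To make the inclusive/exclusive difference concrete (a small sketch of my own):

```python
n = 4

# C-style condition 'k <= n': the upper bound is inclusive, matching sigma notation
total = 0
k = 1
while k <= n:
    total += k
    k += 1

# Python's range() is exclusive at the top, so you write n + 1 to get the same bound
assert total == sum(range(1, n + 1))  # both give 1 + 2 + 3 + 4 = 10
```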