No, you stupid Redditor, what you said isn't even any different from what the first guy implied. You still haven't gotten anywhere near how averages work. An "average" is usually a mean or a median; there are other ways of calculating a statistical average, but these are the easiest and most common. For any given variable y of population x, the mean is the total value of that variable divided by the population size: y / x = the mean. Let's take a randomly generated class of ten students and their test scores, bounded between 50 and 100 - from random.org, our scores are 87, 67, 73, 51, 91, 70, 78, 51, 62, 63.
According to you, the average of 50 and 100 is 75 because it's in the center. Adding up our scores and dividing by the class size to get the average (mean), we get 693 / 10 = 69.3. The average is not 75, it's 69.3. On average, calculating for the mean, the class failed. 5 students, or 50% of the class, scored below the mean - which would not necessarily hold with a wider allowed spread of scores, or a larger sample where scores cluster more tightly - but an 'average' student scoring exactly the mean would literally be above the 50th percentile in this example, since the median here is 68.5. This proves that the average, when using the mean, =/= the halfway point.
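If you don't trust the arithmetic, check it yourself. Here's a minimal sketch using Python's standard statistics module (nothing here comes from the original post except the ten scores):

```python
# Verify the mean of the ten randomly generated scores
from statistics import mean

scores = [87, 67, 73, 51, 91, 70, 78, 51, 62, 63]

print(sum(scores))       # 693
print(mean(scores))      # 69.3 - nowhere near 75

# Count how many students scored below the mean
below = sum(1 for s in scores if s < mean(scores))
print(below)             # 5
```

Same result: the mean of scores bounded between 50 and 100 lands at 69.3, not at the midpoint of the bounds.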
Now let's work out a median, with a second set of ten randomly generated scores: 64, 86, 53, 83, 57, 60, 95, 69, 59, 80. The median is the middle value of an ordered set, so we sort the scores and take the middle - if the median falls between two data points, you split the difference (i.e., average the two middle values).
53, 57, 59, 60, 64, 69, 80, 83, 86, 95. Our middle two scores are 64 and 69, meaning that the median score is at:
69 - 64 = 5
5 / 2 = 2.5
64 + 2.5 = 66.5
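The same split-the-difference logic is built into Python's statistics module, if you'd rather not do it by hand (again, only the scores come from the original post):

```python
# Verify the median of the second set of ten scores
from statistics import median

scores = [64, 86, 53, 83, 57, 60, 95, 69, 59, 80]

print(sorted(scores))    # [53, 57, 59, 60, 64, 69, 80, 83, 86, 95]

# With an even count, median() averages the two middle values: (64 + 69) / 2
print(median(scores))    # 66.5
```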
This time, calculating for the median, our average of 66.5 is even further below the supposed center point of 75, again showing that the average does not mean 50% above and 50% below.