Friday, 8 September 2017

Her Final Word

We've seen the beginning and end of the universe.

We've seen what it means to have a scientific temperament.

We've tried to figure out the meaning of life.

We've done a lot, but it's less than what we haven't done. I can't even start to list it now.

Blogging was a journey. Apart from the obvious linguistic benefits, it was a journey for me inside my own mind. Introspection, if you will.

But per the title, this is my final word. A journey can't continue when the purpose of the journey is lost.

"And anyone could see that I'm lost, through the seas I've been tossed...."

I have questioned a lot of what I thought I knew about myself, enough to make me dizzy. My motivation has been crumbling, and what once gave me happiness now feels like a chore. I'll start up again, but who knows when that'll happen? When it stops feeling like a chore? It could be months before that happens.

A final word, then.

"There's more to see, than can ever be seen, more to do than can ever be done. There's far too much to take in here, more to find than can ever be found."

Happy exploring.

Tuesday, 15 August 2017

Intelligence and Accepting Fallibility

"Intelligence is not personal, is not the outcome of argument, belief, opinion, or reason. Intelligence comes into being when the brain discovers its fallibility, when it discovers what it is capable of and what it is not."

There's a lot to be said about this.  Intelligence – which is mostly viewed as the amount of "stuff you know" – has a new parameter to consider: fallibility.

Fallibility is the liability to err. For a brain to realize its fallibility is for it to accept that it may have been wrong about some of the beliefs it once held. Even that is not enough. Once the wrong beliefs are identified, they must be discarded, no matter how valuable they once were.

The recognition of fallibility is a huge and difficult step. It is akin to winning a war and then looking at the destruction and havoc you have wrought to get where you are.  It is the realization of how awful you must have been to those around you back when you had not been aware of your fallibility.

To be sure: being incognizant of one's fallibility is seldom a crime. And recognizing fallibility does not lead to freedom from it.

The truly "intelligent" thing about realizing fallibility is the weight it puts on your shoulders, so that you are constantly aware of it. It is a far cry from freedom from fallibility, but paradoxically, the two have a lot in common.

With this constant weight on your shoulders, you develop the habit of approaching the world with questions instead of answers, with an open mind that embraces all the possibilities hidden from a closed mind. In a way, it is a necessary step towards reducing conflict and putting your ego aside. You get a new perspective on the world, and the "stuff you know" increases. It follows that intelligence increases too. That's always a good thing.


Saturday, 1 July 2017

The Battle of the Mind

"I wish it was the entire tree that fell on Newton's head instead of just one apple."

A sentence that's supposed to incite laughter in its audience. One may call it a joke.

Because it's so easy, right? To take pride in not understanding science or math. To wish that Newton and others like him had died a horrible death before contributing what they contributed, so that you could be spared some schoolwork.

Oh, and so you wouldn't have to complain about it on the very inventions their discoveries made possible. Yeah, heaven forbid that should ever happen.

You know what? It's no use making the argument that science is beautiful, that science is complex, and that the complexity of science is what makes life worth living. So instead, I invite you to consider what society was like when these geniuses put forward their groundbreaking ideas.

"Before the battle of the fist, comes the battle of the mind."

Galileo Galilei, who dared to go against the Church and become a proponent of the heliocentric theory, was put under house arrest.

Giordano Bruno was burned alive for the same reason.

Copernicus was warned against publishing his heliocentric theory on the grounds that it might spur controversy (remember that the Catholic Church was the authority in his, Galileo's, and Bruno's time).

Germs had been observed under the microscope for nearly two centuries before the germ theory of disease took hold. Even in the nineteenth century, doctors were ridiculed for suggesting that they should wash their hands between patients.

Socrates was forced to consume poison because he was found guilty of not believing in the ancient Greek gods and of introducing new ideas to the youth of Athens.

Hypatia of Alexandria was a scholar at a time when women of her intellect were feared, and she was murdered by a mob. "A woman of science? Sound the alarm!"

People of the past were brutal. Contempt for STEM fields today is quite subtle, but still visible. Anti-vaxxers, climate change deniers, creationists, and flat-Earthers are but a few of the manifestations of the outright denial of science.

But there is something that the past and present have in common. Scientists are always fighting a battle of the mind when the people around them are in outright denial of the same math that says that 2+2=4, and the same science that says that humans need water to live.

And believe me when I say, it is not easy fighting a battle of the mind.

So I am led to wonder: was sentencing a scientist to a horrible fate simply a more brutal manifestation of the same pride we take today in cussing out the very people because of whom we live in a digital age?

Thursday, 1 June 2017

Scientific Advancement

Given all that we have discovered about our world so far, it is easy to delude oneself into thinking that there's nothing left to explore. Even though I cannot claim to be involved in scientific research at this point, I do like knowing about it, and I fall into this trap myself every few years. Reading a few articles (or books, as of now) is like a refreshing snap back to reality.

But then I thought up an analogy for scientific advancement – navigating a maze. (Yes, Potterheads can go ahead and think of the Triwizard Tournament maze.)

The end of the maze is a new level of understanding, where all immediate questions are answered (notice the use of immediate). No one has an aerial view of this maze, but this is a maze that humankind created for itself. There is no disguised villain to tell the "Harry" inside each of us where the end is. But the trophy is a Portkey – it takes us to the next scientific quest.

It's a treacherous maze. There are paths that seem like the right way but aren't, plenty of misdirection, and straight-up traps. You could fall into one of these traps and think, "Oh, I've solved it all", only to have a fellow scientist pass by, stomp his foot, and say, "That's the fifth time I've passed through here!", successfully knocking you off your high horse (one you climbed accidentally, you promise!).

People spend their lifetimes in this maze, at different paces, falling for different tricks. Many die before reaching a breakthrough – a breakthrough being a sign in the maze that you're going in the right direction.

There are people who are ahead of everyone else, people who prevent others from going ahead, and people who change course in the middle of the maze. And many die at dead ends (no pun intended).

But given that this maze was made by humans for humans, some even find it a pleasure navigating it.

Happy journey!

Friday, 19 May 2017

A Glimpse into the Master Notebook

(Disclaimer: The Master Notebook is not a proper noun. It is not how I address my notebook. I hold all notebooks in equal regard.)

May 19 marked one full year since this blog became fully operational. My distant future is uncertain, like the durability of humans in this Universe. But let's enjoy our brief moment in the sun.

I have taken to updating this blog monthly, so I spent all of May 19th debating whether or not to do this – seeing as it's the middle of the month, and all.

I'm going to post a snippet of something I had considered publishing on the blog as an article, but reconsidered for abstract reasons.

Because apparently this is something authors do for their audience.

"Preparing for the SAT Subject Tests made me think about the physics we take for granted. And I was thinking about physics as the plane took off [...].

"There's Bernoulli lift in the takeoff. The air pressure difference pops your ears as you get to cruising altitude. You think about your position in three dimensions. Velocity takes on a whole new meaning.

"Little moments like this make me reaffirm my thoughts. Yes, I want to study astrophysics. I want to see the interplay of theory and practice. I want to cry with awe at the beauty of the world we live in. I love physics.

"People often groan when I point out the physics of everyday life. These are the same people who think physics has no 'practical application.'"

There's more after this, but what comes after is the reason I won't post the entire thing. I think in harsh words, and that is reflected in my writing, no matter how tactful I try to be.

Well, until June 1, then.

Monday, 1 May 2017

The Greatest of Mysteries

"The Cosmos is all that is or ever was or ever will be. Our feeblest contemplations of the Cosmos stir us – there is a tingling in the spine, a catch in the voice, a faint sensation, as if a distant memory, of falling from a height. We know we are approaching the greatest of mysteries."
– Carl Sagan, Cosmos

This is an article on what this quote means to me. Not anyone else. I can't and won't speak for anyone else.

It's so easy to pretend, sitting on the couch, that the everyday occurrences throughout the Universe don't affect us. The fact that you can't even look up while in the city (because of light pollution) doesn't help matters. But the Universe isn't "out there". It's just an amplified version of science at everyday scales, though that's not to say that what we're experiencing here on Earth is a dumbed-down version of what we could be experiencing. It would be an error on my part to say so, because Earth is intrinsically connected to the Universe. As are we. But that's just it. It's so easy to forget that this is where we came from, that ultimately, we all come from the same event that created the stars and galaxies.

We are the Cosmos, the Cosmos is us. But our lives have taken us away from it, from what it truly is. That is why when we think about it, we feel the kind of exhilaration that doesn't happen every day at the desk, because the desk came after the Cosmos. In our contemplations, we attempt to connect with the most primal parts of us: the parts that came from the nuclear furnaces of stars, the parts shrouded by our basic needs of survival and, too often to ignore, by our ego and pettiness.

The reality of our place in the Universe can still be kept in sight, however. By stepping outside, at the very least, and by traveling into interstellar or intergalactic space, at most. (Yes, I know the impossibility of that. That's why I said "at most".) These are as good a way as any to get even an inkling of the significance we hold in the Universe (and even that is close to none).

The existence of the Universe is like a big chemical reaction. Here, we're intermediates, a temporary result of the initial conditions that set our Universe into motion.

"Anyone who does not...Gaze up and see the wonder...Of a dark night sky filled with countless stars loses a sense of their fundamental connectedness to the Universe."
– Dr. Brian Greene

Wednesday, 5 April 2017

The Concept of Nothingness

Put simply, nothing is defined as "the absence of something."

Think of the number zero. For the purposes of this article, we'll take it to mean nothing. The absence of something – the absence of numbers.

There's a lot to be said about a number that represents nothing. The entire number system is riding on zero – is riding on nothing. It defines place values, defines the significance of digits. This one closed loop that has no value of its own.

You can't convert zero into something unless you think additively. Zero can be converted to one easily enough – 0+1=1. Simple math. But that's not how everyday objects work, and it is everyday objects we are trying to describe. The trouble is, the simplest everyday object can carry a world of abstract ideas within itself.

Think of it like this – a balloon can be filled to occupy a large space, assuming it doesn't burst. It'll grow to twice its size, three times its size, and so on. In fact, that's usually how we compare quantities. This isn't anything new. But it is one of the most notable differences between something and nothing: you can't do this with nothing.

To convert something into more of something, you can multiply it. Mathematically speaking, 5*2=10. If you have five apples and you add five more, you have twice as many apples. But...
0*1=0
0*2=0
0*3=0, and so on.
x*0=0, not x.

Neither can you distribute something amongst nothing.  Every schoolchild knows – division by zero, by nothing, is undefined.
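
If you want to play with this yourself, here is a small Python sketch of my own (just an illustration, nothing formal) of how zero refuses to behave like the somethings around it:

    # Multiplying nothing never produces something.
    for x in (1, 2, 3, 100):
        print(0 * x)                  # prints 0 every time

    # And distributing something amongst nothing simply has no answer.
    try:
        print(5 / 0)
    except ZeroDivisionError:
        print("Division by zero is undefined.")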

But for all that we have set up around this number, this entity, our knowledge is incomplete, because it is next to impossible to conceive of nothingness. 

All our lives we have been surrounded by something – air, space, anything. Usually when we speak of nothing, we mean "nothing of consequence" or "nothing that pertains to this conversation." In daily use, "nothing" has become a shorthand for this specific kind of nothingness.

But there's a principle in science – there's no such thing as truly empty space. Essentially, there's no such thing as nothing. Even in the emptiest stretches of space, there is still around one particle in every cubic meter. And even here on Earth, nothing is not a concrete concept. We have always described nothing in terms of something. So, we may have turned nothing into a function of something. There's something that can be said about that.

In many ways, our language reveals our psychological tendencies. I don't have to define a pen in terms of a pencil. I don't have to define the heart in terms of the amount of blood it pumps. I could, but it wouldn't be necessary. These objects don't rely on each other for their definitions the way nothing depends on something. And even this dependence is one-way, because something doesn't rely on nothing for its definition.

Our trouble with conceptualizing nothing goes a long way towards explaining why the Big Bang is not an easy concept to grasp. The math checks out, sure. But if you sit down and think about it long enough, a question does arise – "How can something be created out of nothing?"

Indeed, because here are some ideas the Big Bang model puts forth:
1. There was no space or time when there was a singularity, and one could not come into existence without the other.
2. The singularity itself was infinitely hot and dense, and all that would become our universe was packed into an extremely tiny volume.

You need space to describe where something is, and you need time to tell when something happened. But how do you explain where the Big Bang occurred, when space was nothing? How do you explain what came before the Bang, when you couldn't invoke an idea of time, since time was also nothing?

Space and time are part of something. They are the most fundamental somethings anyone can think of. Their absence creates the most fundamental nothing, the kind of nothing no one can ever dream of. Because we've never experienced or come close to experiencing such a thing.

And the hot and dense part. This is where we know of something coming from nothing.

The "primordial soup" we call the singularity had an immense amount of energy. Energy equates to heat, hence we say the singularity was hot. But the thing is, space wasn't defined; it didn't exist. So all that energy was packed into...well, nothing. Conditions were getting divided by zero, because that was all that did exist. And anything divided by zero not only is undefined, but it is also infinite. All that energy, packed into nothing...it became everything.

There are few places and conditions where our definitions of the everyday world just crumble, become meaningless. The origin of the universe is one of them.

We've done so much with the concepts of zero and nothingness that it is nearly impossible to imagine where we'd be without them. But at the same time, we can't think of them in the same way that we think about everything else, either. What a concept nothing is!

Wednesday, 1 March 2017

In Defense of Philosophy

I picked philosophy for this post for a few different reasons, two of the main ones being that I have gained some insight into it over the past few months, and that certain people around me don't seem to think much of it. In fact, discussing philosophy on a blog dedicated to science may come across as a sign that my aptitude is questionable. And this is exactly why philosophy must be defended.

Philosophy today is perceived as a field where people ask unnecessary, or often frankly disturbing, questions. Who needs to worry about the great whys of existence when you've got exams coming up, or a really important project at work tomorrow, right?

Not exactly. In many ways, philosophy has helped us shape our world. You'll often hear people talk about the "philosophy" of their work; what they mean is the motives behind it, or the abstract ideas behind it. Abstract ideas...

I use that term to highlight that human ideas are, in fact, quite recondite – they're called human ideas simply because no other species can have them, or understand them. Try explaining trigonometry to a dog, or expressionism to sheep.

Where I'm going with this is that the kind of ideas humans have – what they think, how they think it, and what makes them think – is what constitutes philosophy. Simply thinking about anything and everything makes you question the things you hitherto took for granted, and analyzing how you think makes you understand why and how people can hold opinions that conflict with your own. That is philosophy, along with learning how to justify your own beliefs and incorporate other justified beliefs into your mental catalog. And, yes, that does lead to internal conflicts and existential crises. But there are rainbows after rain, and so it is that at the end of such crises you find that you have shed unjustified beliefs, strengthened justified ones, and gained a clearer outlook on life.

Does this sound familiar?

It must, because this is exactly what science does – just with a lot more paperwork and tangible data. And guess what – there's a reason for that, and that reason is history.

For all of human history, we've been trying to figure the world out, asking what's happening, why it's happening, how it's happening. It could have been about natural phenomena. It could have been about human behavior that harms society as a whole. It could have been about why people need other people. Or, you know, it could have been about finding decent places to sleep.

It didn't matter what kind of questions you asked. As soon as language was created, all the what, how, and why questions were (messily) grouped into philosophy.

In fact, traced to its roots, the word philosophy comes from two Greek words: philos, love, and sophia, wisdom. And wisdom is not restricted to scientific wisdom.

Then came divisions, as they so often do in our quest to define stuff. Philosophy split into science, politics, linguistics, visual arts, performing arts, and much more. It is worth noting here that the scientific method was first formulated – and implemented – by philosophers.

So, basically, that's the significance of philosophy. It exists where thoughts and ideas exist – which is to say, everywhere. It's not obsolete, and it's not meta-anything. That is my defense.

Wednesday, 1 February 2017

Hypotheses and Fallacies

I've been sitting on this hypothesis for some time now. It's full of fallacies, but it shows that there is something I know and can think independently about.

So, this is how it goes.

Einstein's theories of relativity describe something called "time dilation": the idea that, to an outside observer, time slows down for an object as it speeds up. They also predict that no object can be accelerated past the speed of light.
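
To make that concrete, here is a small Python sketch of my own (just an illustration of the standard time-dilation formula, not part of the hypothesis itself):

    import math

    C = 299_792_458.0                 # speed of light in meters per second

    def dilation_factor(v):
        # Lorentz factor: how many of our seconds pass for each second on the moving clock.
        return 1.0 / math.sqrt(1.0 - (v / C) ** 2)

    for fraction in (0.1, 0.5, 0.9, 0.99, 0.999):
        print(f"{fraction} of c -> time slows by a factor of {dilation_factor(fraction * C):.2f}")

    # At v = C the factor blows up (division by zero), and past C we'd need the
    # square root of a negative number: no real answer at all.

That blow-up at the speed of light is exactly the wall my first conclusion below runs into.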

And in tandem with a hypothesis I was discussing with a friend a few months ago, I tried to think about time travel as a function of speed (yes, think about it – never let it be said that philosophy is dead). Here are my conclusions:

  1. If time slows down as we approach the speed of light, there has to be a point where time stops. Beyond that, there should be points where time starts going backwards – but those lie past the speed of light, which is prohibited by the laws of physics. The speed of light is the cosmic speed limit.
  2. So, the speed of light is the speed at which time becomes meaningless. Sure, we talk about light taking time to reach other objects in the universe, but that's because it is we who perceive time. From the perspective of light, it means nothing. For photons, there is no such thing as time. That might be why a single photon can seem to be in more than one place at the same time – or in one place at more than one time.
  3. So, essentially, if we want to travel backwards in time, we have to go faster than the speed of light. Which explains why time travel into the past is impossible.
  4. If we want to understand the true nature of time – what it is, as opposed to what it does – we have to study it from the perspective of photons. And in a notion akin to the relativity of motion through space, the "speed of time" on Earth (or anywhere else in the universe) could be measured in relation to light.
There are some problems with this, though.

The first is that light itself doesn't always travel at the speed of light – it slows down when it passes through a medium such as glass or water. And yet time is meaningless for it even at those slower speeds. One could look to the massless nature of photons for a possible solution.

The second is kind of mathematical. We usually quantify change with respect to something. Usually that "something" is time. But when it is the speed of time itself that needs to be measured, what do you measure it with respect to? What property of light? Because the parameters we speak of here are set at zero. And every schoolchild knows that division by zero is undefined.

Well, it's a step in the process of becoming an astrophysicist, at least.