Ha. As if there's a process. I just stumble across a hill one day and realize I'm willing to die on it.
Regardless of what I've professed, I've discovered that many of my own beliefs were without basis once they were challenged and likely many still are. This is a very human thing I think about a lot.
I'm not really sure you can choose or provide a process for what you believe. I certainly don't say "I'm going to believe proposition X" and then do some set of things to believe it. I think belief just happens or doesn't.
Your post sounds a little more like how I'd define knowledge. I can believe things for just about any reason, from intuition to desires to random chance. Knowledge, for me, mostly requires "justified true belief"; though this is a philosophically squishy definition, it's also the best shorthand I'm currently aware of. And I try to be pretty clear, if challenged or prompted, on what is belief and what I'd call knowledge.
Knowledge acquisition works basically like you say.
So, why this pedantry? Well, I'd say it's because you can actively work to gain knowledge, and knowledge admits percentage confidence levels in a way I don't think belief does. I find belief mostly binary.
To take an example: I can be forced to believe or disbelieve something, and that can happen via knowledge acquisition as to truth or falsehood. For instance, I don't believe aliens have visited Earth. But if "First Contact" happened and a ship landed in front of me and an alien walked out, I'd immediately change my belief. Then of course I'd start the knowledge acquisition to try and be sure I wasn't being pranked on a very expensive hidden-camera show or something.
On the other hand, if I believed that a bear was tipping over my trash every night, but many other people said it was a raccoon, I'd reevaluate that belief. It wouldn't be an active process, though; my belief is just shaken, so I then begin to try and get knowledge.
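Tangent: the raccoon example actually maps neatly onto Bayesian updating, which is one way to formalize the percentage-confidence idea. A minimal sketch in Python, where every probability is a made-up number purely for illustration:

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Revise confidence in a hypothesis after one piece of evidence (Bayes' rule)."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

# Hypothesis: "a bear is tipping over my trash every night."
confidence = 0.90  # initial confidence (illustrative number)

# Evidence: neighbours report seeing a raccoon. Such reports are
# unlikely if it's really a bear, and likely if it isn't.
confidence = bayes_update(confidence, p_evidence_if_true=0.1,
                          p_evidence_if_false=0.8)
print(round(confidence, 2))  # prints 0.53 — confidence drops sharply
```

Whether anyone actually holds numeric confidences is debatable, of course; the point is just that graded confidence has a coherent mechanics, whereas "I believe it / I don't" flips all at once.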
TL;DR: I think belief is subconscious and not directly under our control.
I absolutely agree that to a large degree this is a subjective process which depends on the nature and nurture of the individual forming the belief. It's likely that the vast majority of the things I believe gained that status due to an automatic process which I am unaware of.
I should specify that the process I detailed is what I would intentionally apply to form a basis for highly consequential personal or political decisions. Ever since the question was first posed to me, and upon studying the psychology involved, I have wondered whether I would have behaved like the average German did in 1938, despite the official ideology being so obviously vulnerable to scrutiny of any kind. I observe a similar method of epistemology being deployed notoriously against certain minority groups today, one which I might have gotten caught up in as a teenager due to my upbringing but know better than to fall for now. I may edit my post to specify that I mean consciously forming a belief when multiple interpretations of reality are present, interpretations which may not even be based on the same set of "facts."
Process:

1. Idea formation, from within or from an outside source
2. Treat the idea as a hypothesis and adapt it into something disprovable using data
3. Look into available data sourced from places which would not themselves be advantaged or disadvantaged by an interpretation of the data (mainly academic)
4. Accounting for the data, form a belief which I consider likely to be true, although I can't be certain it is true
5. Take in other perspectives on my conclusion and reassess if necessary
Explanation: Although there are certainly beliefs which I hold strongly and would fight for (those involving human rights), I believe that having absolute certainty of anything is absurd. Although I can't be certain Empiricism is the best possible method to determine beliefs, it is the process which I find most effective at this time. As physical beings who evolved for an ecological niche which few of us now occupy, we may lack the sheer capability to truly understand the underlying mechanisms of the universe. All current philosophical and scientific paradigms we can be knowledgeable of are only most likely to be true according to what we have collectively learned so far, and paradigm shifts in basic understandings of sweeping subjects have occurred during my lifetime. Because of this, I can't assume that anything I believe is strictly true, only that it is what I personally find most likely to be true given my limited perspective. For this reason it is important to take into account interpretations of reality from as many other people as possible who have applied a reasonable basis of empirical reasoning to their arguments, in order to have the best possible chance of approximating truth. Although I will make my best effort to be objective, I know that as a human creature I hold inescapable biases which I may or may not be aware of, and much of this process is subjective.
I guess it's reading new information, checking whether or not it's based in reality and then either incorporating it or not. Then it sits there until called upon or challenged and the process repeats. Eventually I feel like this builds up a robust personal belief system that's still open to changing information.
In our age of misinformation, how do you determine whether the new information is based in reality?
Somewhat related is the concept of ontological shock. I most recently saw a post on Reddit about it, linked below. I can relate to the idea of my brain building a running narrative of how the world works and fitting information into that framework on the fly. When I come across things that don't fit into that framework, it's easy to dismiss the information and run towards people that share my worldview. Ontological shock happens when you reexamine the core of your worldview based on irrefutable new information and realize that your existing framework for predicting and understanding the world no longer works. Fascinating stuff, and it truly shows that we can absolutely be living in completely different realities than our neighbours.
https://www.reddit.com/r/UFOs/comments/145kwhw/ontological_shock_is_real_and_you_should_treat_it/
I had quite the ontological shock when I was exposed to methods of investigation, and to information which had been systematically collected and reviewed, by people I became familiar with at university. My formerly conservative beliefs were shattered under the weight of the scrutiny I was now able to apply to them, which caused extreme discomfort because many of the premises I had planned my life around were not only false, but easily disprovable with the slightest examination. Since then I've tried not to be stubborn about the beliefs I hold and will criticize new information instead of dismissing it (although some information fails criticism so quickly that it can be dismissed almost immediately).
There hasn't been hard evidence presented for the new claims of extraterrestrials on Earth to my knowledge, nor have the claims been disproven (these specific claims can be disproven). I choose to withhold judgement on the phenomenon until a sufficient amount of evidence is provided (in my opinion, I suppose) or the claims are disproven, but I'm not going to dismiss the possibility that the claims are true off-hand.
It's worth reading The Structure of Scientific Revolutions as something of a starting point.
Thomas Kuhn was a grad student in physics at the time of the so-called quantum revolution, and got really interested in how physics, the supposedly most venerable branch of science, got suddenly upended. He shifted to philosophy, then later on collaborated on a sociological study of scientists. That book was the result of his research.
He proposes that science is made up of paradigms, which is what you learn in grad school. What you already believe conditions what you see when you look out at the world, so already this is going to influence what experiments scientists think to run, how they'll design their experiments, what methodologies they'll use to make sense of their data, and so on. It also influences what they'll see as credible findings when peer reviewing papers. Tack onto this that paradigms get attached to careers—if I'm a senior scientist whose career was launched by X research, then I'm gonna be defensive of X—and you've got a recipe for stagnation that'll only occasionally maybe get shaken up. Not to mention that, particularly outside science and engineering, an awful lot of research funding comes from the Rehabilitating Some Billionaire's Public Image Charitable Trust (since public university funding has been massively scaled back). If things do fundamentally change, the change is going to be huge and rapid.
Of course, this applies to nonscientists as well. There's nothing sillier than a Christian saying they took a long, hard look at their beliefs, then arrived exactly where they started. This isn't because they're dishonest or intellectually lazy, but because they see the world in ways that predispose them to believing in God.
With the bad news out of the way, there are two things I can say.
First off, look at how things work. Any belief about politics (for example) has to take into account that the world is full of real people who are just as complex as you or me. Thinking "oh those people are just _____" is a nonstarter. At the same time, institutions (like government, C suites, middle management, churches...) also strongly condition how people will act in particular contexts. Most people are very sensitive to institutional rules and norms, and deviation from rules and norms will generally be on the down low.
Second, don't let inconsistencies slide. Inconsistencies are often a sign that something is being hidden from you, and focusing on them can give you much better insight into how a person or group actually works. For example, Democrats could have stopped Roe from being overturned, because they're just as able to filibuster Supreme Court appointments as Republicans. So do Democratic senators actually care about abortion access? And on the other hand, why are Republicans who said things like "if we elect Donald Trump, he will destroy us, and we will deserve it" now going on Fox News crying about how mean the Justice Department is being to him?
That particular Kuhn work is one that I often come back to in snippets, even though I haven't read it through yet. It definitely made me aware of how much the era I'm living in informs what others and I consider to be fact. Especially in studying history, I find many decisions people actually made to be completely baffling in a contemporary context, but norms were fundamentally different in that place and time.
My worldview is not fundamentally different from the Christian worldview you described. Although I have a strong preference for Empiricism, I understand that it's mostly a subjective preference of mine rather than the ultimate method of epistemology. According to empirical data, the application of empirical reasoning has yielded the conclusions most consistent with reality that we have made so far. Every aspect of the previous sentence is loaded, and some elements may not be strictly true, but it would require a different method of reasoning to determine that, one which may not even be compatible. This makes me sympathetic to scientific anarchism even though I have a personal bias towards empiricism above other methodologies. It does make me curious as to what truths I myself am incapable of appreciating due to my practical adherence to material reality and preference for evidence and scientific consensus.
I'd guess inquisitiveness? Curiosity? Science?
This is something that I'm really interested in lately. I was born into a quite religious community, and as I was exposed to more and more outside information I had to reevaluate my beliefs down to their core.
As I started to rely more and more on the scientific method as a basis of truth, I felt like I'd figured it out (at least the way to figure things out), but then came another fundamental shift in how I think about truth. At this point I dropped every belief that stood on anecdotes, authority, or an ad hoc framing of subjective experiences, etc. However, I was also the kind of person who would think trans people and allies are idiots for wanting to use preferred pronouns, since they are "male/female".
Then as I started to read up more on the views of said people, I of course realised that the media I listened to previously had set up a false narrative of people wanting to deny science, while in reality these people simply thought about this topic in a more nuanced way, separating biological sex from how someone feels and enjoys expressing themselves. This topic showed me how easy it is to be locked in a framework that fails to address some parts of reality, while still seeming coherent and rational on its own simplified terms.
As of now my goal is not building beliefs but trying to put myself in as many frames of thinking as I can and exploring how many ways something can be viewed, hopefully managing to address ever more nuances of reality but never accepting them as facts.
Of course this is the theoretical part of things, but when I have to make a real-life decision I have to settle on my current best. Even then, treating everything I've experienced and thought as probabilities and imperfect simplifications of reality helps avoid a lot of mistakes and makes it easier and more productive to work together with different people to find answers.
This is very similar to the development I've experienced; thank you for sharing the process itself. Our brains naturally chunk information, so I think the use of schemas to understand abstract concepts is completely intuitive. As you pointed out, though, sometimes a schema can turn out to be an over-simplification which falsely indicates a topic is less complex than it truly is. It's interesting to look over all of the cognitive traps we're vulnerable to, which we could never escape if we didn't admit to ourselves that we could fall into them.
Yes, it can be a bit overwhelming at times, but at the same time always exploring even seemingly understood questions makes life a lot more exciting!