Unsurprising. It's the natural byproduct of overproduction of scientists, brutally competitive job markets, and the shortsighted decisions to use publications as the primary metric for hiring and promotion decisions.
Anyone who is alarmed by this hasn't been paying attention to the perverse incentives scientists have been facing for decades.
Any advice for PhD dropouts? I spent years and years pushing against that boundary in an obscure corner of my field and it never moved. What little funding I had dried up and I left grad school with a half-finished dissertation, no PhD, and a giant pile of broken dreams.
I'm sure over the years you've known students who have started a PhD and not finished. What (if anything) have you said to them? Do you feel their efforts had any value?
I'm a PhD dropout myself. Serious question: what kind of advice are you looking for exactly? This is not intended as an insult, but it sounds like what you're looking for is not advice but rather consolation, which is natural and understandable given the circumstances.
I'll give you advice. Success in pursuing a PhD isn’t just about the discipline or the degree—it’s about finding the right environment to support you. If earning your PhD is still a dream, focus on identifying a program that aligns with your needs and strengths. Look for a school with the right resources, a program that’s well-structured, and, most importantly, a supportive advisor who believes in your potential. Combined with your dedication and passion, these factors can make all the difference in achieving your goal. Don’t lose heart—sometimes, the right opportunity can change everything.
Disclaimer: I have no idea what I'm talking about. I've never participated in a graduate program.
In the US at least, it is entirely possible to teach at a university without a PhD. Community colleges are full of instructors with master's degrees, and tons of classes offered by major universities are taught by graduate students or adjunct faculty without doctorates.
Your job title probably won't be 'professor', but you'll be doing basically the same work as one.
Graduate students teach classes at their own universities as part of their departmental funding. This is only a temporary situation and exists only while they're enrolled. It's not a career path.
As a former graduate student myself, I'm actually not aware of any non-PhDs who are adjunct faculty or community college instructors. I'm not claiming that they don't exist anywhere, but given the number of PhDs and the number of available academic jobs, the competition is fierce, and non-PhD candidates are likely to lose out to PhD candidates.
Fwiw my dad had a masters in biology and a PhD in botany, but was an instructor for biology in the local community college (“Mount San Jacinto Community College”). I guess technically he had a PhD, but not in the way most people would think
I think this result is obvious to anyone who has spent any time in the academic world, although it is nice to see some solid numbers behind it.
The harsh truth is that the key to academic career advancement is who you know, much more than what you know. Every single person I knew in graduate school who got a postdoc position did so through informal means (i.e. knowing someone who knew someone), and having letters of recommendation written by the right people from the right departments at the right schools opens all sorts of doors to the academic hierarchy that would otherwise be closed.
I think you overstate this effect. At least in CS, it’s better to get a strong letter from a good (but maybe not superstar) researcher than it is to get a lukewarm letter from a Turing award winner.
Reputation is a currency in academia, and even people in prestigious positions aren't usually going to spend it to get someone mediocre into a top position.
Yeah. I used to think it was all nepotism / corruption, but (at least in STEM fields) there’s a bit more to it. My phd advisor is one of the most intense, hardcore people I’ve ever met. If he says someone is a good researcher, that counts for A LOT with anyone who knows his standards. There is no amount of stuff on a resume that could outweigh the word of someone like that.
There also seems to be a lot of cheating and/or metric abuse in academia, so it is hard not to over-emphasize this one signal, if it is all you are going to get anyway.
A generation or two ago, it was common for chemists to use taste and smell as tools for qualitative evaluation of chemical compounds.
So older scientific literature is full of all sorts of knowledge that was obtained in ways that are shockingly unsafe by modern standards, including gems like the taste of all sorts of poisons and how large quantities of plutonium are warm to the touch.
Even as a chemist today, you come to recognize the smells of chemicals even when barely exposed to them.
It's typically only for the most toxic compounds that you'd use protective equipment to avoid being exposed at all (but then we tend to avoid those anyways).
You start to recognize the smell of ethers like diethyl ether or tetrahydrofuran (which I love the smell of). Sulfides are obvious (smell terrible).
I made a mistake a couple times smelling things I shouldn’t.
Once was diazomethane gas - a potent alkylating agent and explosive. I instinctively put the roundbottom flask to my nose to smell, but realized after how dumb it was. No idea if I heavily alkylated my nasal passage epithelial cells or not, but no side effects.
The other time was a brominated aryl compound similar to tear gas. That was amazingly painful and felt like getting wasabi up my nose despite there being almost nothing left in the flask.
One time that wasn't intentional was smelling CbzCl (benzyl chloroformate, a reagent used to add a protecting group to nitrogens). I didn't deliberately smell it, but measured it outside the fume hood in a syringe. It smells pretty awful, but what I realized is that the molecule must bind to your nasal passages (proteins have lots of nitrogens), because I could smell it for the next 24 hours. After smelling it that long, the smell now makes me nauseous pretty quickly.
As a kid I had a Lionel chemistry set. It had a chunk of sulfur that I lit up with a match. Then, curious, I took a deep snork.
Mistake!
Only a few years later in chem class did a teacher show how to use your hand to waft fumes from an open beaker or flask so that you can catch a tiny whiff.
A friend of mine works as a chemist in waste disposal and I reckon a shallow sniff is a pretty common first line tool for identification / confirmation. I doubt it is ideal, but nobody would lie too much about what is in that barrel right..?
It's a low boiling point oxygenated hydrocarbon solvent, so it smells like you'd expect - think things like rubbing alcohol, ethanol (vodka), paint thinner (the ones that have alcohols in them).
Diethyl ether smells very "heavy", for lack of a better word, and pungent. It's almost overpowering, and can become unpleasant after a while.
Tetrahydrofuran (which is just diethyl ether with both ends of the ethyl groups bonded to form a ring) has a "lighter" smell, isn't overpowering and smells "clean" to me. It's still an oxygenated solvent, so it's not pleasant like the smell of flowers or spices, but to me it's more similar to ethanol, which is relatively pleasant.
>I am left wondering if anything approaching a "standard" exists for smells ...
You can buy tasting kits for whiskey or wine. They include individual scents like peaty, smoky, oaky, blackberry, even some weird ones like band-aid. You can use them to train your nose to deconstruct the smell of whiskey or wine.
It's really eye opening (or nose opening, if you will). You might even find you suddenly agree with the tasting notes on the bottle.
Interesting. Also, if I may, it seems to me there's more individual variation in "smell discernment" ability than there is for the other senses.
i.e. so-called "super-noses" vs. "scent-deaf" people.
As a side note, ether diluted is a lovely smell, but inhaled concentrated (for recreational purposes – it's a bit like alcohol in effect) it's bloody brutal, burning your nose & lungs.
(There used to be sweets in the UK called Victory V's which contained a very small amount of ether, and they were just lush. Bought some recently and found whatever additive that was has been removed, oh woe :) )
> older scientific literature is full of all sorts of knowledge that was obtained in ways that are shockingly unsafe by modern standards
My favorite is the old manuals that recommend smoking while working with cyanide. Allegedly, inhaling the cyanide through the cigarette produces a very disagreeable flavor, so you get a warning to get out of the area*
This was before fume hoods were common, when you would most likely be doing this outside or next to a window
* I have not tested this, and I don't know of anyone who has, so don't rely on what could be an old telephone game for chemical safety
The Stern–Gerlach experiment is famous for many things. One of them is that the only reason the silver deposits could be seen was that the experimenters smoked cheap cigars with sulfur in them, which turned the deposited silver black.
"After venting to release the vacuum, Gerlach removed the detector flange. But he could see no trace of the silver atom beam and handed the flange to me. With Gerlach looking over my shoulder as I peered closely at the plate, we were surprised to see gradually emerge the trace of the beam…. Finally we realized what [had happened]. I was then the equivalent of an assistant professor. My salary was too low to afford good cigars, so I smoked bad cigars. These had a lot of sulfur in them, so my breath on the plate turned the silver into silver sulfide, which is jet black, so easily visible. It was like developing a photographic film."
A friend’s dad recognised cyanide during a chemistry exam by tasting it. (He survived and passed the exam.)
The task was to identify each of n given substances within a short enough amount of time, filling out a report. I'm not sure if they still give cyanide to students during exams. That was communist Poland.
He's lucky that he could smell it! About 1/3 of the population lack the gene -- including my grandfather, who discovered this when performing an industrial reaction with cyanides and being alerted by someone at the other end of the room yelling that he could smell cyanide.
Hydrogen sulfide generally repels people to a safe distance due to its strong smell of rotten eggs, but in very high doses, such as when the police open a car door after an H2S suicide within, it quickly disables that very sense of smell.
on this point, the full name of the disease is "diabetes mellitus" - "diabetes" from the Greek for "siphon", and "mellitus" from the Latin for "honey-sweet", a reference to the taste of the urine. Now... one can imagine how physicians of the time would go about diagnosing this disease
If you are referring to LSD, you do not jest. Albert Hofmann intentionally dosed himself, although he took what would now be considered 5-10 times a typical "dose".
250µg is a robust dose of LSD, but not an unreasonable one at all. Someone with some experience who takes that amount will appear to others as obviously tripping, but ordinarily they will still make sense, be able to converse, and so on.
100µg is the usual standard of measurement, as in a drop from a vial or a square of blotter, and plenty of enthusiasts like three of those when they partake. So more like 2.5X of a 'standard dose', and well within the typical range.
I'm certain it was a remarkable experience for someone who had no idea whatsoever what they were getting into, though.
There's no known case of anyone dying from an LSD overdose. Even the people who took a few thousand times the typical amount (they thought it was cocaine) survived. They did need hospitalization and would likely have died from aspirating their own vomit without it, but they all fully recovered within 48 hours.
It's a pretty challenging drug to hurt yourself (physically/chemically) with.
I immediately Ctrl-F'd for 'funding'. There's your problem right there. If there's no money to support graduate students, you're never going to get enough researchers to replace the ones you have.
Additionally, graduate students tend to avoid selecting research areas they dislike or find disgusting. The most disturbing presentation I've ever watched was a slideshow given by a parasitologist in which I saw worms in parts of the human body I never imagined it possible for worms to be in. No wonder students aren't lining up to spend years of their life working with them.
> The most disturbing presentation I've ever watched was a slideshow given by a parasitologist in which I saw worms in parts of the human body I never imagined it possible for worms to be in.
I read an essay once by someone who intentionally incubated some kind of fly in himself, and wrote that, after all the effort of being infected and incubating the fly, it chose to emerge while he was at a baseball game, where, he lamented, it was immediately killed by horrified fans over his protests.
Parasites are neglected tropical diseases. That's a euphemism for "third world shithole problems". These things aren't much studied because there is no actual need to study them. They are solved by civilization.
Developed nations solved parasites naturally as they developed. Infrastructure, basic sanitation, standards for food production... All of these things interrupt the natural fecal-oral lifecycle of parasites, solving the problem.
Naturally, developing nations are terrible at all of those things. To put it mildly. And thus parasites are endemic. They are literally every day things. It's actually kind of surreal.
It doesn't matter how much funding people put into parasitology, it doesn't change the fact the true solution is to develop the nation into a proper civilization.
During the most disturbing* presentation I ever watched, the simultaneous translation went dead for a good 15-20 seconds; I assumed the translators had muted their mikes to cover the dry heaves?
* one could always spot the reconstructive surgeons at these conferences; they were the ones who could wander around the poster session, all while calmly nibbling away at their hors d'oeuvres.
The best way to get funding is for your idea to either have economic potential or to be politically useful. Both of those outcomes generate power. Ideas that don't generate power go to the back of the line when it comes to funding.
That brings back memories. One of my first research projects in school was doing sketchy things with a Quanta-Ray Nd:YAG laser. I remember the distinct 'tack-tack-tack' sound of the Q-switching at 10 Hz which I used to create a laser-induced plasma right around eye level.
Fortunately I had the proper goggles on but was always terrified of catching a stray reflection and blinding myself. Now we live in a world of dirt-cheap high-powered diode lasers, and when I see all the stupid things YouTubers do with them with almost no discussion of proper eye safety, I wince.
As someone whose early scientific career was destroyed by null results, no. No one will publish your negative results. Unless you win the lottery and stumble across a once-in-a-generation negative result (e.g. the Michelson–Morley experiment), any time you spend working on research that yields negative results is essentially wasted.
This article completely glosses over the fact that to publish a typical negative result, you need to have progressed your scientific career to the point where you are able to do so. To get there, you need piles of publications, and since publishing positive results is vastly easier than publishing negative ones, everyone is incentivized to not waste time on the negative ones. You either publish or you perish, after all.
Simply put, within the current framework of how people actually become scientists and do research, there is no way to solve the 'file drawer' problem. You might see an occasional graduate student find something unusual enough to publish, or an already-tenured professor with enough freedom to spend the time submitting their manuscript to 20 different journals, but the vast majority of scientists are going to drop any research avenue that doesn't immediately yield positive results.
No it is the right advice. The key is to improve yourself and make yourself more attractive. Accomplish more, work out, become more capable, make more $ etc.
I did that myself by improving my diet, doing kickboxing, getting educated, interacting more with women, etc. It worked.
If you don’t, you will end up with a partner who loves your fake persona and not you. Forcing you to continue living a fake life. That is not a healthy way to live.
That's probably what the "be yourself" advice was getting at. When you try to be someone you're not, you show it through your body language; it looks contrived and is picked up rather quickly. "Be yourself" was probably meant more like: act less desperate for attention and more confident in oneself, don't try to impress, etc. But I agree that it is not good advice for people who do indeed need to improve.
"reach into the future and pull the best possible rendition of yourself backwards to now forthwith such that you can become thatself once thineself agrees that it is indeed thou that thou wishes to become"
Not sure about US culture, but this seems to carry a peculiar assumption that men should date around the same age? Because I doubt most 20yo women in the "midwest" are married with a kid or two. Maybe some bits of cultural baggage you need to shed there.
They might not have two kids, but "not single" is absolutely true. People here tend to find a partner pretty early in life.
Anecdotally, a friend is trying to date in her early 30s and basically has no dating pool. Not many single men in their 30s have a stable job, crime free background, and avoid hard drugs.
A good lesson for everyone: your life's time horizon isn't anywhere close to your life expectancy. There are things you need to get right before 30 if you don't want the rest of your life to be misery.