
Global economic September


Student of Trinity


In my ongoing quest to confuse this shareware indie games company board with some kind of grand strategic think tank, here is a speculation about the world economy.

 

There was a big bubble and a crash a few years ago, after the big bubble and crash a few years before that. Now there are all these Occupy protests. Yeah, yeah, these things come and go. Maybe it's just noise, and in a few years it will all be forgotten. But maybe something is really happening. It's probably not the protests themselves that are important, and probably not even all the financial mess in itself. But maybe they are symptoms of a long-term issue that is really not going to go away.

 

I've grumbled before about the idea I got from one of Paul Krugman's books, that the big surge of technological progress has been over for some decades now. It was in the middle of the twentieth century, when we went from horseshoes to moonshots in a generation. Since then the pace has slackened dramatically, even as we keep telling ourselves how rapidly everything is changing.

 

If this is so, and I think it is, then maybe it means that the sweet era of rapid economic growth is also over. It takes a generation or two before technological progress really gets implemented in the economy, so while the technology ramp relaxed into a gentle slope a few decades ago, it is only around now that economies are noticing the sag in impetus.

 

Growth covers vast multitudes of economic sins. Growth means that an average jane or joe can get more and better stuff for their work time than they used to get, because they can produce more and better stuff with that work. Yada yada yada, how that all plays out in the big economic scene. But it's the ground truth on which everything else is really based, the facts for which all the abstractions of money and credit really stand. Rapid technological progress translates into all the good things we want to have: low unemployment, higher standards of living, everybody's happy. If rapid technological progress is continuously bringing along much better methods and much better stuff, society can cope with an awful lot of problems and stupidities, and just take them in stride.

 

Summertime, and the livin' is easy.

 

But now it's September. The air is getting colder. Back to school and back to work.

 

It's not the end of the world. It's just the end of summer. Technology is still rolling, the planet is still big, we can handle this. But it won't be as easy as it used to be. We can't afford so much stupidity any more. Maybe we'll have to face some problems we've been able to ignore for generations.


We're moving into what they call the Wal-Mart economy back in the US. Unlike the era when Henry Ford raised workers' salaries so they could buy his products, industries are now cutting product costs and hiring cheaper workers to make them. It's a death spiral: the old industrial nations have lowered salaries to compete with the new emerging nations, and now their employees can't afford to buy.

 

Technology is allowing the Western nations to compete through increased productivity. But as Student of Trinity points out, the last technology boom is waning, so growth is flattening out until the next big thing. The low-cost workers in the emerging markets will soon adopt the new technology themselves and end the Western advantage. So nothing is going to improve in the next few years.

 

Now, after a summer of discontent over the European debt crisis, there is a period of instability while people look to get out from under their own debt loads. No one wants to pay their debts. The hoped-for increase in new jobs for the unemployed isn't happening. Until companies spend to hire, the instability will continue.


How does the notion that technological progress slowed substantially after the mid-20th century fit with the changes in computer technology that have come in the past 20 years and have infiltrated more or less every device we have? I can't think of how to reconcile those two.

 

More to the point, perhaps, might be something like this:

[Chart: real per capita GDP over time]

The growth curve has been pretty steady. The difference is who the money has been going to. Now, that might have something to do with why the growth happened in the first place (technological change vs. whatever), but it might not.


It's been more like thirty years for widespread computer use, rather than twenty. Twenty years ago friends of mine were downloading files over the internet. That's a long time, and the advance of computers has been gradual, not revolutionary. They still leave us doing basically the same kinds of things today that we used to do in previous decades, just a bit faster, or with flashier graphics.

 

I don't know how "real per capita GDP" is calculated, but I'm sure it's a pretty heavily theory-laden measurement. Measurements of dollar value are always relative to some standard, and so they are bad at capturing the fact that standards are radically changing. In important terms it cannot possibly be true that people only produce ten times the value today that they did in 1929, because in 1929 no amount of money in the world could have bought you a television, or a computer, or an artificial heart.

 

I think a more fundamentally accurate curve would be one that shot up exponentially from 1900 to 1970, by about a thousand-fold, and has had only much slower growth since. The last few decades have been pretty well represented by a curve like the one you showed. For the decades before that, I think such a curve was fundamentally misleading.


Surely mobile phones count as a technological advance with a major impact on how our lives go.

 

Here's the thing: yes, 30 years might be an accurate number for the time from the first clearly recognizable vestiges of the internet to its widespread adoption. But this isn't the same thing as the time it took for people to switch from bicycles and buggies to automobiles. Automobiles changed during that time, but not significantly. Their basic function was the same, their impact on commerce and on daily life, pretty much the same.

 

Over the last 20 years, the Internet has changed dramatically. Its basic function has expanded profusely, and its impact on commerce and on daily life has not just grown larger, but also broader. Comparing this to the change in automobiles or road systems is hogwash.


SoT: I'm trying to figure out what the measure that you're talking about would be. I appealed to real per capita GDP because it takes GDP (total value of all final goods and services sold in a country during the year — measured basically by adding up how much people bought), adjusts for inflation (change in the price level), and divides by the population. Generally people take it to be a reasonable measure of per-person wealth. I don't know exactly what you mean by "theory-laden," but my guess is that it's the opposite of that.
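
To make that concrete, here is a minimal sketch of the arithmetic in Python, with invented figures rather than actual statistics:

Code:
# Sketch of the real-per-capita-GDP calculation described above.
# All numbers here are invented for illustration, not real data.
nominal_gdp = 15.0e12  # total spending on final goods and services, in dollars
deflator = 1.25        # price level relative to the base year (25% inflation since then)
population = 310e6     # number of people

real_gdp = nominal_gdp / deflator            # strip out inflation
real_per_capita_gdp = real_gdp / population  # divide by the population

print(f"Real per capita GDP: ${real_per_capita_gdp:,.0f}")  # about $38,710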

 

But what you're talking about is what you can do with that wealth. Sometimes comparisons can be made between past and present (the level of computing power that you could buy for a few thousand dollars in the 1980's would cost next to nothing today), but sometimes they can't (no amount of money would buy a smartphone in the 1970's). This seems much harder to quantify, but even so, I don't know why you think that technological progress has substantially stopped since 1970. I don't really see the difference between the leaps forward in electrical appliances in the middle of the 20th century and the leaps forward in digital technology near the end of the 20th century.

 

In particular, I don't think we're doing the same kinds of things with computers as we were in the 1980's. Maybe a few people downloaded some files here and there in the late 1980's, but Facebook is a difference in kind, not just degree. Maybe a few people had laptops with Wi-Fi in the 1990's, but smartphones are just not the same at all. At dinner last night, I was eating Thai food and realized that I knew nothing about the history of Thailand, so I pulled out my phone and read a Wikipedia article on it; fifteen years ago, I would've sat there continuing to wonder and, at best, gone to a library to find a book on the subject if I really wanted to know more. Constant fingertip ability to retrieve information, communicate with friends, and purchase nearly anything almost instantly from nearly anywhere represents a difference in kind from what was possible in, say, 1989.


From what I've seen, the last twenty years have taken mostly existing technology and built an impressive infrastructure for it.

 

Our access to the internet has become drastically faster and more mobile, and is practically taken for granted across the generational and financial spectrum. There were internet-accessible encyclopedias in the early 1990s, but Wikipedia blows them away with accessibility and content generation. You could shop online, but it wasn't putting whole nationwide chains out of business. DVDs only came out in the mid-1990s, and now they're swiftly being replaced by online streaming and distribution, along with music, periodicals and (*shudder*) books.

 

I was under the impression that our economic crisis was less a function of declining technological progress, and more a symptom of the United States' declining role in that progress. I am not, however, an expert, and thus look forward to the continuation of this thread.


Originally Posted By: messin up ur layout lol
I thought the whole economic crisis was caused by morons and/or banks getting involved with crappy loans, and then everything snowballed.

Shows how much I pay attention, I guess.

That's one reason, but the economy is so complex that there's no way people are going to agree on the cause of a collapse. Heck, there's no consensus among economists on whether FDR's policies helped or hindered recovery from the Great Depression.

Originally Posted By: Excalibur
Heck, there's no consensus among economists on whether FDR's policies helped or hindered recovery from the Great Depression.


Modern day libertarian "economists" don't really rate being classified with people like Friedman or Hayek. I wouldn't call a few wingnuts complaining about FDR to promote their meme that government intervention is ALWAYS BAD "no consensus".

Quote:
There were internet-accessible encyclopedias in the early 1990s

Citation, please? No, seriously. There were NOT internet-accessible encyclopedias in the early 1990s. The first one I can think of was the 1911 edition of Encyclopedia Britannica, which shouldn't even really count because it was the 1911 edition. That came online in, I think, 1995. There was a spitting match over computer encyclopedias (think Encarta), but they weren't on the internet, and that wasn't until the mid-to-late 90's either. Wikipedia was 2001.

Let's review the actual state of the internet in the early 1990s: it consisted of e-mail, newsgroups, gopher, and a bunch of FTP and telnet servers run almost exclusively by universities and government agencies. There were still plenty of BBS's around, but those don't really count as internet. In 1990 the web had been proposed, but did not exist yet. Web browsers that could handle anything other than inline text were a few years away, and it wasn't really until 1994 that the web even BEGAN to find a use in the general population.

Originally Posted By: Dantius

Modern day libertarian "economists" don't really rate being classified with people like Friedman or Hayek. I wouldn't call a few wingnuts complaining about FDR to promote their meme that government intervention is ALWAYS BAD "no consensus".

I mistook this for a YouTube comments section, because, you know, obvious troll is obvious.

I remain convinced that it's not a lack of technological growth that is making so many economic conditions look worse, but rather the opposite. Instead of employing grocery store attendants to check out your goods, there are now self-checkouts. Instead of having a Ford-style automobile manufacturing plant, robots are increasingly building our cars, our everything. The result of this accumulation of technology is that that sort of work is becoming scarcer.

 

At the same time, high-education, high-skill jobs are also being lost. The epitome of this is Watson, the Jeopardy-playing computer, which combines the knowledge of Wikipedia in one rapid machine. A more realistic example of how this is applied, though, is in stock brokerage firms. Where once upon a time things literally depended on physical interaction between stockbrokers, the process is now increasingly digitized and online.

 

This has left many people in the dust, so to speak, as they're replaced by machines. However, unlike the Luddites, I don't see this as a bad thing; people are finding new niches, and the educational systems are adapting to help them do just that. There are still plenty of needed jobs that simply cannot practically be taken over by machines.

 

I agree with Marx in his analysis of history, on this matter. Technologically driven expansion has fueled the economies of the ages, from the Roman slave states to feudalism to capitalism. The Occupy movement and the other solidarity movements, and whatever movements follow them or take their place, are merely an indication of a transition that is occurring.


I should probably refrain from making statements about "the early 1990s," as I was not a terribly aware creature during that period. To me, the internet lends itself to nothing better than the storage and access of knowledge. I see from a bit of research that the idea didn't really come up until late 1993, and that Interpedia never got off the ground. However, I was under the impression that Encarta started some online distribution in that range, and that some private or paid databases had approached a size to be considered encyclopedic.

 

If that is not the case, then it speaks to the innovation of the last decade.


I apologize again for making a factual statement without checking first, Slarty. I am not alone among my peers in feeling as though the internet has always been, but most of my fellows are probably smart enough not to behave as if they know what they're talking about unless they do.

 

Terms like "internet" and "web" were once specific and well defined, but I admit that the lines are quite blurred to me. I would have said that, if something can be kept up to date remotely, it possesses an element of the internet, whether or not all the data is actually hosted on a server.

 

Regardless, I did not intend one example to make or break the point. The issue at hand is whether and how much progress has been made in recent years. I asserted that, from a day-to-day life perspective, it has been more a matter of superior implementation than brand new ideas.


SoT pointed out that the cycle of implementation has run its course for the last big thing.

 

Cell phones are replacing landlines, which had their own boom when costs came down in the 1970s. The Internet is replacing the public library system that made information available to the masses. Modern computers have replaced the mainframes of the 1950s. There isn't any breakthrough technology today that doesn't have an equivalent in the past, one that caused a major change back then.


Originally Posted By: Goldenking
I agree with Marx in his analysis of history, on this matter. Technologically driven expansion has fueled the economies of the ages, from the Roman slave states to feudalism to capitalism. The Occupy movement and the other solidarity movements, and whatever movements follow them or take their place, are merely an indication of a transition that is occurring.

Not exactly. Technology has made a lot of blue-collar work easier to do with a few people and a lot of machines. That's people out of work and reshuffling as the labor needs change. Occupy Wall Street, though, to the extent it's about anything in particular, seems to be about the inequality in distribution of wealth. High unemployment is a catalyst, but the rage is over the fact that bankers make orders of magnitude more than many college graduates, and that's the graduates who have jobs at all.

—Alorael, who acknowledges that there are a lot of reasons for unemployment. Technological innovations, be they ever so gradual, have made a lot of those workers unnecessary, so no one wants to hire them back to do what they did. Some have degrees in areas that don't need that many people. But a lot is just from the economy taking a nosedive and costing jobs overall, which means fewer people have discretionary money, which keeps the economy down. And into that mess got poured the knowledge that the leaders of failed banks made millions.

Originally Posted By: Kelandon
I appealed to real per capita GDP because it takes GDP (total value of all final goods and services sold in a country during the year — measured basically by adding up how much people bought), adjusts for inflation (change in the price level), and divides by the population. Generally people take it to be a reasonable measure of per-person wealth. I don't know exactly what you mean by "theory-laden," but my guess is that it's the opposite of that.


The theory it's laden with is that cost equals value. If somebody discovered magic, and within ten years suddenly every human in the world were living in their own cloud castle, and dining on ambrosia, then people would still work at jobs, earn money, and buy stuff. They'd work at things like designing websites and breeding miniature unicorns, instead of mining ore and assembling cars, and they'd buy things like magic carpets and mobile phone games, instead of hamburgers and motorcycles. But in inflation-adjusted dollar value terms, not very much would have changed, because cloud castles would cost what small apartments used to cost, and so on. So even though in real terms everything had changed, in GDP terms the change would have seemed very modest. In fact, the changes that had happened in what you could make or get for the same amount of money would be what the big change was all about.

As to the overall rate of technological progress, my conviction is that you just need to really appreciate two words: Horseshoe, moonshot. The point isn't that technology no longer advances. Things are definitely still changing and improving, in significant ways. I find nearby restaurants in strange cities from my iPhone, too. I couldn't do that ten years ago. But in my grandfather's lifetime, things changed a whole lot more than that. Men on horseback; men on the moon. To say that technology is still advancing the same as ever today is like saying it's okay that the whiskey's gone, because we still have beer.

The implementation cycle of the advances from my grandfather's generation has indeed fully run its course by now. The hot blood of youth that used to run through the whole world economy, the fire of the fact that everything was changing hugely, has cooled. Everything is still possible, and the game is still basically the same game. There's still grinding and bosses and lewt. But we're not playing on Casual any more. Every little thing is a notch harder, and it's not going to go away. And in a way that does change the game a lot, because now we have to worry about things that we used to be able to afford to ignore.

That's the thesis, anyway.

As to technology putting people out of work: in the short term it often does. If it's not the robots taking over the assembly line, it's something indirect that still probably started with technology. People lose good jobs that paid well, and they can't get the same kind of work again.

This is a real problem, all right. My wife's grandfather was a very good blacksmith, and the tragedy of this has marked her family down to her generation. But this is a cloud with a platinum lining. Those same unfortunate people's kids will have opportunities their parents never had: to get better jobs, and to live better for the same money, because they can get better products for the same money, or cheaper products that are just as good. As Keynes famously remarked, to reproach economists for taking economic tragedies lightly, "In the long run we are all dead." But in the long run our kids also have jobs, and live in a different world.

Sometimes there are glitches and hitches in this generational rising tide, but overall it's undeniable. Life in the economies of the past was a lot harder than it is now. If this long term tide slackens, though, at some point the short term has to adapt to it. I think that maybe that's what we're seeing.

Originally Posted By: Student of Trinity
The theory it's laden with is that cost equals value. If somebody discovered magic, and within ten years suddenly every human in the world were living in their own cloud castle, and dining on ambrosia, then people would still work at jobs, earn money, and buy stuff. They'd work at things like designing websites and breeding miniature unicorns, instead of mining ore and assembling cars, and they'd buy things like magic carpets and mobile phone games, instead of hamburgers and motorcycles. But in inflation-adjusted dollar value terms, not very much would have changed, because cloud castles would cost what small apartments used to cost, and so on. So even though in real terms everything had changed, in GDP terms the change would have seemed very modest. In fact, the changes that had happened in what you could make or get for the same amount of money would be what the big change was all about.

Ah, point taken, and this is a complex issue to try to manage. I think in principle inflation is supposed to take this into account in part, because if we're eating ambrosia for free, then we can deal with price shifts in that way, but this measure does have limitations in dealing with significant changes in what goods are even available to purchase (as noted in my previous post). I've only taken the bare bones introductory econ curriculum, so I don't know how economists handle that sort of thing, but I'm sure there are some ways. I should look them up, I suppose.

Originally Posted By: Student of Trinity
The point isn't that technology no longer advances. Things are definitely still changing and improving, in significant ways. I find nearby restaurants in strange cities from my iPhone, too. I couldn't do that ten years ago. But in my grandfather's lifetime, things changed a whole lot more than that. Men on horseback; men on the moon.

You know, this seems like almost entirely a matter of perspective. A couple of guys on the Moon doesn't much change daily life, but instantaneous access to virtually any kind of information or communication that is as widespread as smartphones really does. I think that the sense of possibility has changed directions; I think the digital revolution is just now ending, but I think we're on the cusp of a biological/medical revolution that could be even more transformative. This is only a guess, though.

Given who the two of us are, I might say that the early-to-mid 20th century had revolutions in what looks, at the core of it, like physics: transportation (motion) and appliances (electricity). The late 20th century had revolutions in other things, and the 21st century will have revolutions in yet other things.

Originally Posted By: HOUSE of S
I think most of the world has always been playing on Torment.


Yeah, at least until quite recently. But actually now a lot of people are moving down to Hard. It happens that at the same time those of us in the Casual world have gotten to know more about them, so the difficulty level of play that we know about has been growing. But in fact the game has been getting a lot less tormenting for an awful lot of people. Just the fact that China and India have avoided outright famine in recent decades is a huge win.

That would be a win even if it cost any number of North American SUVs and televisions, of course. But in fact it's more likely to increase North American prosperity than reduce it — not in relative terms, but in absolute. Massachusetts isn't worse off because California is rich and productive. Massachusetts is much better off. Massachusetts would suffer a lot if California fell into the Pacific. But national boundaries are ultimately irrelevant. If Californian success benefits Massachusetts, then Asian success will benefit North America.

At least in the long run. Where we're all dead. But I have children.


Originally Posted By: Kelandon
[T]he early-to-mid 20th century had revolutions in what looks, at the core of it, like physics: transportation (motion) and appliances (electricity). The late 20th century had revolutions in other things, and the 21st century will have revolutions in yet other things.


Yeah, that's the optimistic scenario, and it might well come to pass. Or maybe even physics will get in another round — hey, it could happen.

But genetics doesn't seem to be paying off quite as fast as people thought. We sequenced our genome a while ago, and it doesn't seem to have gotten us much, yet. Maybe it'll take off in another decade or two, and it'll be summertime again. Or maybe we've really hit diminishing returns, physics is a lot easier to exploit than biology because physics is simpler, and the payoffs from genetic technology will be expensive and slow.

There have been reports recently of a couple of patients being apparently cured of late stage leukemia by injections of their own white blood cells that had been genetically modified, using HIV of all things, to target the cancer cells better. They went into a raging fever during which their bodies digested a few pounds of cancer cells, and emerged in full remission. If that sort of thing ever becomes routine, we'll really have something. But it could stand to hurry up some.

Originally Posted By: Student of Trinity
But genetics doesn't seem to be paying off quite as fast as people thought. We sequenced our genome a while ago, and it doesn't seem to have gotten us much, yet. Maybe it'll take off in another decade or two, and it'll be summertime again. Or maybe we've really hit diminishing returns, physics is a lot easier to exploit than biology because physics is simpler, and the payoffs from genetic technology will be expensive and slow.


Genetic testing is allowing for better use of medicines that work best with certain genetic types. Especially for cancer, it means no more "let's try this and see which drug works best," and instead starting with one that is known to be effective.

Medicine will take longer because even when something works, why it works isn't always known. It's easier with physics, since the mechanism is either understood at the time of the advance or figured out relatively soon afterwards.

Right now you are mostly seeing improvements on existing technology. Reducing computers down to molecular systems to increase speed isn't the breakthrough that the integrated circuit was over vacuum tubes or the Babbage engine.

I've never worked on quantum computing myself, but I knew a lot of the people who got the ball rolling. I heard Peter Shor explain his algorithm in a seminar soon after it was published, people just down the hall from me proved the first error correction threshold theorem, and I later worked a few floors up from one of the leading ion trap labs.

 

The actual experts in the field have always been surprisingly pessimistic, saying that they don't expect to see any useful quantum computers in their lifetimes. And the boom years of quantum computer research are definitely over, now. I haven't seen a QC article in Physical Review Letters for ages. It's not even really clear what a quantum computer would be good for if we had one. I believe Shor's algorithm for factorizing is still the only practical quantum algorithm known, and it's a negative one that would break public key cryptography, which is a useful thing to have unbroken.

 

It might all be fantastic someday, but again it seems to really be taking its time.


Originally Posted By: Randomizer
Genetic testing is allowing for better use of medicines that work best with certain genetic types. Especially for cancer, it means no more "let's try this and see which drug works best," and instead starting with one that is known to be effective.

Medicine will take longer because even when something works, why it works isn't always known. It's easier with physics, since the mechanism is either understood at the time of the advance or figured out relatively soon afterwards.

That's not exactly true. Personalized, genomics-based medicine has been just around the corner since the first draft of the human genome was published 11 years ago. It's not easy: finding the genes that are meaningful markers, finding ways to target them, and finding ways to target them that aren't overwhelmingly toxic are major research undertakings. Genetics, especially the really fast and relatively cheap sequencing available now, opens opportunities, but I don't think it will ever be the big breakthrough anticipated by the Human Genome Project. It will instead, as SoT's model predicts, be a long, slow series of developments over decades.

Medicine doesn't take longer than science because you need to know why something works. Actually, in medicine it's easy: why it works is much less important than whether it works, and in the case of something like cancer the why can be swept under the rug, because the chance of long-term harm isn't as big as the threat of immediate death. Again, the major problem is turning science into medicine. It's one thing to know what mutations cause a certain malignancy, but it's quite another to figure out how to reverse that or use it as a drug target.

—Alorael, who has worked on a cancer being aggressively sequenced. If you know a mutation shows up in many of the tumors, that doesn't tell you anything in itself. Is the mutation driving the cancer or secondary? What if you can't target it with anything? What if you have no idea what the gene is even for, and all experiments seem to show that it does nothing?

Part of the reason that I think we're on the verge of something big in genetic research — not being a biologist myself — is that genome sequencing is becoming much, much cheaper and faster, which means that we can use it a lot more. Knowing one person's complete genome is fun, but for the kind of statistical use that epidemiologists would put it to, you need to know a lot of people's genomes and as much of their medical history as possible. Cheap, fast genome sequencing would allow that.

 

This is sort of like what I was saying with putting a person on the Moon, by the way. Putting a person on the Moon is fun, but until we have a regular Earth-Moon shuttle that leaves every couple of hours (or whatever), it's not really impactful, except in the imagination. Doing it once is fun, but doing it a million times is when it really matters. The prototype is one thing, but the commercialization is another.

 

This might be sort of like what happened with the Internet in the past few decades (I still say two, but maybe three). Having one page on the web is fun, but it doesn't really do anything. Having nearly every single individual and business on the web in one way or another changes the kind of interactions possible. (Economists call this a "network externality," among other things.) Maybe it's true that the single core innovation — getting computers to exchange data at long distance — was made back at the beginning of the Internet, but the leaps and bounds of innovations that were made possible by the development and widespread adoption of the technology changed the nature of the possibilities.
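
As a toy illustration of that network effect (a back-of-the-envelope sketch, not a real economic model, and the user counts are invented), the number of possible pairwise connections grows roughly as the square of the number of participants:

Code:
# Toy model of a network externality: the possible pairwise links
# among n participants grow as n*(n-1)/2, i.e. roughly n squared.
def possible_links(n):
    """Number of distinct pairs that could interact in a network of n nodes."""
    return n * (n - 1) // 2

for users in (10, 1_000, 1_000_000):
    print(f"{users:>9,} users -> {possible_links(users):>15,} possible connections")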

 

Equally, we're not quite yet even at the "dialup" stage of genetics, but it's possible that the "broadband" stage will lead to cascading innovations that we can hardly even imagine now. (I bet if you had described Wikipedia to someone in the 1980's, it would've sounded pretty silly; what awaits us in, say, the 2020's?)

 

This is just a guess, and from a non-specialist, no less, so people who actually know something about biology can disabuse me of these notions.


I said I was speaking from experience. I've worked in that lab, I've sequenced those patient samples, and while it sounds great, it's nowhere close to being a reality. There are some cases of breakthroughs, but there's not a breakthrough pipeline. I'm not saying it can't happen, but it's not something that is happening, and it hasn't been happening despite being the new big thing in medicine for years now.

 

—Alorael, who thinks getting a sequence from each patient would be a big step. The problem is still that even knowing the individual differences doesn't change most treatment most of the time. Right now, you're most likely to get a more specific prognosis. You may be in the cohort that's usually helped by the drugs, or you may be in the cohort that has grim two year survival statistics. There still aren't tailored drugs even for most of the markers that have been well described already. That's the bottleneck.


Originally Posted By: Kelandon
This is sort of like what I was saying with putting a person on the Moon, by the way. Putting a person on the Moon is fun, but until we have a regular Earth-Moon shuttle that leaves every couple of hours (or whatever), it's not really impactful, except in the imagination. Doing it once is fun, but doing it a million times is when it really matters. The prototype is one thing, but the commercialization is another.


I'll risk a bit of a tangent to ask... are we ever GOING to see repeat space travel? The United States no longer has the ability to go to the moon or, for the moment, orbit. It's been more than forty years, and we have yet to establish any sort of permanent structure up there, make any progress toward putting a man on Mars, and we're not even pulling our weight in the space station any more. And I just don't see the private sector pulling off something on that scale any time soon. That leaves... the Russians and the Chinese, I guess.

/end rant.

Originally Posted By: Rentar vs. Redbeard
I said I was speaking from experience. I've worked in that lab, I've sequenced those patient samples, and while it sounds great, it's nowhere close to being a reality. There are some cases of breakthroughs, but there's not a breakthrough pipeline. I'm not saying it can't happen, but it's not something that is happening, and it hasn't been happening despite being the new big thing in medicine for years now.

Sorry, I missed the sentence in which you made reference to essentially the entire subject of my subsequent post. I realize that it's not happening now, but it doesn't sound all that far-fetched to imagine it happening 10 years from now. Unless there's some enormous theoretical barrier between the progress that has been made in economical sequencing and the next step in economical sequencing, as with space travel down below, which there very well could be. Or maybe your point is more important: tailoring drugs to specific genes — or whatever it is that we'll do with this genetic information — may represent a larger conceptual or commercial roadblock than simply economizing sequencing does.

Still, my guess is that the next jump will be in biology. Whether it's from gene sequencing or something else, that's where I'd put my money, if I were a betting person/venture capitalist.

Originally Posted By: Actaeon
I'll risk a bit of a tangent to ask... are we ever GOING to see repeat space travel? The United States no longer has the ability to go to the moon or, for the moment, orbit. It's been more than forty years, and we have yet to establish any sort of permanent structure up there, make any progress toward putting a man on Mars, and we're not even pulling our weight in the space station any more. And I just don't see the private sector pulling off something on that scale any time soon. That leaves... the Russians and the Chinese, I guess.

Given that this is a political decision, and given American politics at the moment, I'd say that the answer is and will remain no.

One of the fundamental problems in space travel, as far as I can tell, is that we really haven't made any progress since the 1960's. We're still, at the heart of it, throwing mass out the back of a rocket to propel it forward, and for the most part, it's the same kind of mass as we were using 40+ years ago. The really great practical advances that would make space travel economical are still well beyond our power. (I heard a brief explanation from a professor about trying to use light instead of matter; a relativistic rocket would still depend on momentum conservation, but on the momentum of light, not the momentum of matter. It's a great idea for fast travel, but it's impossible to do with current or even near-future technology.)
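
To put a rough number on why the light-based approach is so impractical (my own back-of-the-envelope estimate, not anything from that professor's talk): a photon carries momentum E/c, so even an ideal photon rocket gets only a minuscule thrust per unit of beam power.

Code:
# Back-of-the-envelope thrust of an ideal photon rocket: F = P / c.
# A perfectly collimated light beam of power P pushes the ship with
# force P/c, because each photon carries momentum E/c.
c = 2.998e8  # speed of light, m/s

beam_power = 1.0e9       # 1 GW of emitted light (illustrative)
thrust = beam_power / c  # newtons

print(f"Thrust from a 1 GW photon drive: {thrust:.2f} N")  # about 3.34 N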

In the meantime, until we have some great advances, space travel will still be exorbitantly (literally) expensive. We've had some neat advances come from it, but the most famous one is Tang. There's been a marketing problem.

The real problem is that while there have been advances from space programs, unless I'm missing the PR, there haven't been any practical (i.e. commercial) developments from things found in space. Satellites are very important, to be sure, but we can launch those just fine now. The question is what we get out of hurling more chunks of mass out of Earth's gravity well, and the answer is nothing that we know of.

 

—Alorael, who missed the time stamp on your post. He doesn't think that tailoring drugs to specific genes has any commercial hurdles, except for the genes rare enough that no one will develop the drugs. There aren't even conceptual hurdles. There are, instead, practical hurdles: even with all the genes known, it's hard to figure out which of the genes are the right genes and, of those, which are useful drug targets and, for those, how to make drugs that target them. It can be done and has been done, but there's no simple solution, so for each gene the process has to be repeated. Rational drug design is non-trivial, and while it will get better, it won't be an overnight revolution like penicillin.


You guys make me feel like Rip Van Winkle; I put my head down for a little upgrade work, and this topic happens.

 

Actaeon: I too have had my nits picked to pieces by the ever watchful House of Slarty. I don't take it personally, I just make sure I take the time to research my response before posting it. It takes a greater investment of time, a commodity I seem to be running short on these days, but it makes for a stronger, more reasoned debate.

 

As for the more deliberate pace of discovery in the field of genetics, I would conjecture that it is, in part, due to the fact that we are talking about a technology that is capable of doing major harm, not just to individual subjects, but to the whole population. Not only are there ethical reasons for caution, there are political causes that seek to throttle research in this area.

 

Alorael, would you liken genetic research to reverse-engineering an entire operating system with nothing more than a mound of paper covered with hexadecimal symbols: no system documentation, no documentation of the machine code, no replicable platform on which to run tests, no way to tell what is executable vs. data, except by meticulous and tedious tugging and tweaking of bits and bytes until slowly a picture starts to form? The fact that we have learned as much as we know now, in the time we have been studying it, is, to me, phenomenal.

 

The time from basic research to product development is longer than what we have seen in recent years on the electronic gadget front, but I expect that we will reach a tipping point in the next decade or two, after which such advances will start coming quicker.

 

The only thing that is constant is change, but not even the rate of change is constant.


There are no political reasons to quash genetic research, really. Genetic engineering, maybe, but that has actually been remarkably free of interference (Monsanto!). Human genetic engineering gets the Frankenstein stories, but it's not really actively pursued. Engineering anything past the first week or so post-fertilization is an impossibility with current technology; engineering a human before that is just very hard.

 

I'd also dispute your analogy. We know what DNA codes for, and we can usually tell the genes that encode proteins from everything else. Yes, there are tricks like microRNA that lurk in the genome, but that's not even the major concern here. We can figure out the amino acid sequence of proteins. Often, by homology or the laborious work of biophysicists, we can get a structure and probable function on a molecular level.

 

Instead, I'd argue, we're looking at something more like a huge list of blueprints for machines that aren't easy to read. With enough work, we can figure out what the machines are, but it's hard to work out where they fit into the huge assembly of machines. If something goes wrong, even if you can go and see the errors in the blueprints, it's never obvious which are relevant and which aren't.

 

Most importantly, the barrier is figuring out how to fix broken machines, and that's where the analogy breaks down too. A mutation can make a protein non-functional, but it could also become overproduced, or nonresponsive to signals that turn it on and off. Then the question is how to right what went wrong, and it's not easy: we can't go in and change genetic code (see genetic engineering above), so it's a matter of engineering something else to fix things for us. That's drug development, and it's a big industry because it's hard.

 

—Alorael, who thinks he now dislikes genes by analogy to code almost as much as he dislikes likening the genetic code to language. It's not either of those things. Thinking of genetics like things that aren't genetics will only confuse matters.


Your terse, bold posts with something that looks like wordplay always leave me baffled. I don't want to call them spam and look stupid for missing out on the cryptic logic behind them, but I don't get it.

 

—Alorael, who can pull up animalcules, Sylvia Plath, and... not much else that either makes sense or links them.


I did *not* say there was a legitimate reason for the politics that try to impede genetic research; I was just remarking that it is there, in all its absurdity.

 

I appreciate your perspective re my analogy. I've never had the opportunity to get direct feedback from someone in the field. My analogy goes down in flames before you, but I will carry your description of the work in its place.

 

Slarty, is this the hidden meaning behind your cryptic post?

[Image: quplath1.gif]

 

"I wandered down the Plathway of life, searching for the logicule that would enlighten the bio-logical processor that resides in my skull."


And I'm saying that there are basically no political pressures to impede genetic research. The research isn't being impeded except by funding cuts, and those are general in science and medicine.

 

—Alorael, who thinks that the promises of genetic research have won over the public. Nobody objects. Human genetic engineering is discussed and opposed, but it's not really where the research is happening right now anyway. (It could happen more, but it's not the one frontier that's stymied by government oversight.)


Shame on me for not qualifying my remark adequately. One of the lurking threads of thought that had crossed my mind related to the story of Dolly, and other such experiments. But on reflection, that does not even enter the realm of study in which you are engaged. Try as I may, I also fall into the trap of relating two things that seem on the surface to be similar, but are in fact quite unrelated.


There are definitely areas of biological research that have been stymied by (often stupid) political controversies. Heck, teaching core principles of biology in high school science classes has been controversial from time to time (evolution vs. creationism). It's a fair point to consider when thinking about whether biological research generally represents a viable avenue for major developments in scientific applications in the next few decades, even if it doesn't apply to genetics specifically.

 

Somewhat more related is the issue of clinical trials for drugs. They are obviously necessary and obviously cumbersome and expensive. One could make reasonable arguments that they need to be more heavily regulated (and even more cumbersome), and one might be able to make arguments in the other direction, too. This seems like an issue that is at least tangentially related to being able to put genetic research to practical use and at least tangentially related to politics.


Originally Posted By: Harehunter
Actaeon: I too have had my nits picked to pieces by the ever watchful House of Slarty. I don't take it personally, I just make sure I take the time to research my response before posting it. It takes a greater investment of time, a commodity I seem to be running short on these days, but it makes for a stronger, more reasoned debate.


One of my very first posts was shredded to pieces by Alec. I've crossed paths with TM over at Shadow Vale. My skin is entirely thick enough to handle Slarty's analysis. I apologized because I actually felt off base. After all, my point had already been made more effectively by Kelandon. That actually happens more than you'd expect.

Sure, there are political obstacles to biology. Stem cells are a major one. Cloning, too, although if it's not human cloning most politicians and bioethicists stop caring. While there are religious objections to teaching evolution, I've actually never heard of attempts to block funding for, say, evolutionary biology. I think it just flies under the radar: if it's not coming up for millions of school children, it's not relevant enough. There's also a strong correlation between opposing the teaching of evolution and opposing funding for the sciences in general, with no need to single out evolution-based research.

 

Drugs that have made it into trials are beyond my area of expertise. I can just say that even the non-controversial, non-Big Pharma basic science, the part often carried out at publicly funded research institutions, is tough. Whether there are good ways to economically smooth the drug pipeline, or whether it should be done at all, is another issue.

 

—Alorael, who isn't sure how much benefit would be wrung from relaxed regulation. He doesn't think drugs are really being stifled terribly, and there might not be much room to cut costs without cutting necessary oversight. More FDA time and money would probably help make drugs safer without putting a bigger burden on the pharmaceutical industry, but that's just not politically possible.


The analogy of the genetic code as a blueprint does not work well.

 

A better way to think of DNA is as a recipe.

 

 

Much like with a fully formed cake, you cannot look at the phenotype of an organism and know what ingredients (genes) went into making it. Single genes rarely code for the structure or function of any particular part of an organism; instead, the products of multiple genes (proteins) are "mixed" together to form the end product: the organism's phenotype.

 

The genome, if it is a blueprint at all, is a blueprint for making strings of amino acids; even here, though, a protein's shape does not correspond directly to the gene that coded for its amino acid sequence.

 

 


Harehunter: No. I really just wanted to bring up biological pathways.

 

Originally Posted By: Actaeon
...shredded to pieces by Alec... crossed paths with TM... Slarty's analysis.

Oh, geez. Well you sure figured out how to shame me! A fine company I find myself in these days.

 

Not that I have anything against Alec or TM as people: far from it. But I am not a fan of their style of nitpicking, which was less about analysis and an obsession with the truth, and more about breaking boundaries and making people feel uncomfortable.


Precisely what I was trying to convey by the comparison. In fact, I'd much prefer to be called on my errors. That's what dialogue is all about.

 

Staying on topic helps, too. With regards to this topic, while the symptoms have been discussed, I'd like to hear the prognosis. If SoT is right, Winter is Coming. If he's not... well, it might be a more Ragnarok-style season.

 

What are we up against? What are the "problems we've been able to ignore for generations" ?

