Pinned post

Still worried about A.I. (Warning: depressing. A.I. naysayers, please don't post.)

Alright, first, for all those who say it's not going to happen soon, or don't think it's going to be that powerful, I don't want to hear it. I think you're wrong and I don't want to argue about it. Please don't post. I'm sorry to be mean and rude and exclusionary, but I don't think that conversation has any value and I don't want to waste my time on it. My apologies. :X

Also, this next set of posts is likely to be very depressing. If you're not in a good spot to read depressing things, please don't. I don't want to be responsible for a deterioration in your mental health.

Also also, if you want to ask questions, go right ahead. That I don't mind. Just don't try and tell me it's not going to happen.

Anyway, I'm still worried about A.I. I think it's coming soon, and I don't think we're prepared to handle it. Let's be honest, human beings are animals. Hungry, angry, fearful, foolish, short-sighted, beautiful animals. And don't get me wrong, I love animals. I don't blame us for our limitations, it's not our fault. But, it is our problem.

Like, with our existing technology, it should be trivial for us to make our world into a utopia. Trivial. Just a month or so, to clean up all our problems. But we waste our time fighting each other, pursuing stupid things that don't even make us happy, ignoring the obvious, etc, etc, etc. We have such tremendous power already, and we waste it all. We turn it into weapons and violence and hurting each other. And I don't see why A.I. will be any different. And that's assuming we don't fuck up the A.I. entirely and end up with a paperclipper.

Let me give you an example. Let's say we create the thing. Who do we trust with it? Do we trust the government? Somehow I don't think Trump is up to handling this thing. Another world leader? Who? Xi Jinping? Putin? Bolsonaro? Heh, no, not feeling it. Maybe a government department? I mean, all respect to NASA, but I don't think they're up to deciding the fate of earth. Especially not once the corruptive politics that surround that kind of power come into play. Let's take the Soviet Union as an example. Lenin thought "Here, we'll just give ourselves all the power, and we'll do good things with it, and everything will be fine." Then Stalin happened, and things were not fine. And even without Stalin, it's not like Lenin was great himself. So, not feeling that one.

So, okay then, maybe a private individual? Who? Jeff Bezos? Elon Musk? No. I don't think so. "Okay, but what if they're not a capitalist?" you say. Heh, no, I still don't think so. I'm certainly not up to the job of being God-Emperor of earth, are you? No offense, but I doubt it. -_-

Okay, but what if we make sure it's evenly distributed? So, lots of people have A.I.? I'm still skeptical. That's how power has been in the past, and it always seems to centralize. Minor differences escalate, as the poor get poorer and the rich get richer, and soon enough you have an oligopoly again.

Okay, what if we make the thing independent, and program it to just do its best for everybody? Well, that could work, so long as we program it right. But that still depends on one of the aforementioned groups doing that right. So, we're right back where we started. -_-

I dunno, maybe I'm too pessimistic? But I still don't see any way forward with this. I think it's gonna happen, and it's gonna be big, and I don't see how it can go well for us. And I don't know what to do. -_-

Pinned post

Akira, Meditations on Moloch, Existential Terror, No Seriously, this is kind of a bad trip and a recurring nightmare of mine. Read at your own risk. :/

Honestly though, this movie really does get at the heart of a lot of things that have been worrying me. If you're not up to date, these are the sources you may need to read, or skim, to really get what I'm going on about here:

slatestarcodex.com/2014/07/30/
slatestarcodex.com/2018/02/19/

Like, all of our problems now stem not from a lack of power, but from our poor use of it. We could solve all of our problems if we chose to. We could feed everyone on earth, we could end all wars, we could clean the oceans and cool the earth. It would be easy, if we chose to. We don't lack for power, we just squander it.

And I'm worried that maybe this is no accident. Maybe that's simply the nature of life. To grow without thought to the consequences, to consume all available resources, to turn to infighting, waste, and cancer. That the true nature of life is hunger and violence, and everything else is just window dressing and foolish dreams.

"What if an amoeba had ultimate power?" They mention at one point. What would it do with it? Would it make something beautiful? Or would it simply consume all in it's path? I dunno. But, the orthogonality thesis holds that intelligence and goals are unrelated. That there is no reason that that ultimate amoeba would see anything beautiful. That it would, in fact, merely consume all.

And I'm worried that they might be right. That there will be no day when our efforts pay off, no day when the horrors cease. That all our wonders built so far will turn to horrors and dust.

To go on, people talk about the decline of American manufacturing? That's a lie. There's been no decline in American manufacturing. We make more stuff now than ever before, and if you look at the graph, it's a steady upwards line. The only thing that's declined is manufacturing jobs, and those were automated. Factories that used to employ thousands now employ fewer than a hundred. The areas that used to rely on those jobs? Well, that's ground zero for the opioid epidemic.

Isn't that horrible? Bold new technology, hope for the future, and it's killing people. The Horn of Plenty, spilling out garbage, ruin, and death.

And I'm not sure this can be changed. As Scott said in Meditations, "The reasons Nature is red in tooth and claw are the same reasons the market is ruthless and exploitative." Or, more simply - patterns that are more capable of enduring, of propagating themselves, expand. Whether these patterns are the genes that form influenza, or the policies that drive colonialism, or the memes that drive capitalism, the outcome is the same. And if that pattern means unemployment and despair, if it means a "blizzard of prescriptions", if it means garbage and concrete, hunger and deprivation before the horn of plenty - then that's how it will be.

The hope of course, as in Akira, is that we will survive and rebuild, and that one day we will learn to control our power, learn to control ourselves, and solve all things. Or, at the very least, that things will go on.

But I'm worried that that's too optimistic? That ultimately, the world will be reduced to an endless expanse of machines, building machines, building machines, a pattern growing and expanding in all directions having forgotten why it was born, universal cancer, mechanisms in the image of hunger and violence having forgotten even the substance of these things.

And, yeah, I know this sounds like kind of a ridiculous worry, but... If you told a caveman of our world, its ridiculous powers and how foolishly we waste them, wouldn't they laugh? Farms, factories, missiles, nukes? Homelessness and poverty? Landfills and concrete? What a ridiculous fantasy. How could that ever come true? The earth is vast and green, you fool! In the face of such power, how could you possibly waste it? If you really had such things, it would be utopia, obviously! And indeed, by all rights we should live in utopia. By all rights, we should be able to sweep aside our problems like dust, like spiderwebs! Swoosh swoosh, out the front door, into the street! Clean it all up before supper and sit down to eat in a clean, beautiful house.

And yet, here we are. Now, yes, things are getting better, a little bit, in some places. But this has been driven by our vast increase in productive potential, more than any increase in fundamental wisdom. Indeed, if anything it's made us more foolish. Intoxicated by our success, thoughtless and overconfident. Eager to rush into war, heedless of the waste. Confident that the consequences will be somebody else's problem.

I dunno, this is just my nightmare, more than anything coherent. But it is a persistent nightmare. Ask if you want me to explain anything here in more detail, I skimmed over a lot of things that deserve full posts of their own, instead of single sentences. Might take me a while to get back to you though. XD

Joke I just stole from someone on YouTube, who got it from Reddit:

Vladimir Putin suffers a heart attack amidst the Ukraine crisis, and falls into a coma.
A few years later, he wakes up, gets back on his feet and walks out of his room, right past the sleeping guard.

He walks out of the hospital onto the streets of Moscow, and finds that most people don't recognize him. Several years of vegetative coma seem to have taken their toll on his appearance. After wandering around for a bit, he stumbles into the nearest bar. He sits down at the bar and orders a full glass of vodka.

He sips nervously and musters the courage to ask the bartender: "What year is it?"

The bartender is confused, but replies: "2025..."

Putin takes another sip of his drink to process this information. He then asks: "And Crimea, is it still ours?"

Bartender proudly replies: "Still ours!"

Putin nods in approval and takes another sip. Then, he follows: "And Kiev, is it also ours?"

Bartender replies: "Kiev also ours."

A big, happy grin appears on Putin's face, as he finally finishes the drink and asks the bartender: "How much for the vodka?"

Bartender: "100 hryvnias!"

(The hryvnia is the currency of Ukraine for those who don't know)

So I had an idea for a video game 

So, this one is set in the future, where a human-level A.I. has taken over most of the earth. The thing is generally benevolent, I'd say slightly more so than most human governments. Opposing it are a broad and fractious spread of human factions, who range from mostly pretty decent to absolutely terrible. The humans who created the A.I. instructed it to watch over and protect the human race, and it set out to conquer the world in order to do so. Whether or not this was justified is left as an exercise for the player. The player gets to play through this conflict, and perhaps resolve it in many different ways, from a variety of viewpoints - I'd hope to have characters for several different factions, with backstories that give them each their different opinion on all this.

Art and Creativity? 

"The thing about art, and most other creative endeavors for that matter, is that you're going to suck at them at first. For all but the most unreasonably talented, you will have to write bad poetry before you can write good poetry, and make poor drawings before you can make good drawings. You must brave the slings and arrows of criticism, many of them quite valid and all the more painful for it, trudge over hills of crumpled paper and through swamps of spilled ink, through the winds and rain of your own judgement, to finally arrive at the promised land of artistic ability." :/

Person: *Replies with something interesting*

Me: "Oooh! Whats this?"

Person: *Deletes post*

Me: 😢

Another attempt to explain Meditations on Moloch, and why it still gives me nightmares. 

So, you may have noticed me going on about this piece before, or some other pieces to accompany it:

slatestarcodex.com/2014/07/30/

youtube.com/watch?v=rStL7niR7g

slatestarcodex.com/2014/09/24/

That's cause it fucking terrifies me. I'm gonna take another stab at explaining why.

To put it simply, it implies that life is about competition. It's natural selection all the way down. Species evolve through natural selection, but so do ideas, corporations, governments - everything is selected to be competitive. Individuals that aren't competitive don't procreate, ideas that aren't competitive don't catch on, corporations that aren't competitive lose market share and go bankrupt, people who aren't competitive don't get elected, etc.

Now, to explain why that's so terrifying, let's take a look at what it means to be competitive. To a limited degree, we've managed to align this with what it means to be good. Our politicians generally try to make promises that align with the wills of at least some of their voters, businesses try and make products that at least some people will want to buy, etc. But, it's important to keep in mind just how limited this alignment is. There are still a million ways in which "competitiveness" can get wildly out of alignment with what's actually good. See global warming, for example, or any of the articles linked earlier.

Furthermore, if it's true, this tendency towards increasing competitiveness means that technology won't save us. It will simply give us new ways to compete and consume, often at each other's expense. Most especially, I worry about automation. For the moment, the system needs to take the desires of individual people into account, because they have some power - not a lot, but still a little. But when no more humans are needed to work? When robots capable of anything a human can do are cheaper than said human? What power will the average person have then? Not enough to matter, I worry.

And sure, those who control the machines might choose to pay taxes for a UBI. But keep in mind, /they/ still need to compete. Economically, militarily, who knows - but I don't expect them to escape competition. And who do you think is more competitive - the machine economy that pays taxes to support human beings, or the one that doesn't? Or, beyond that, the one that leaves fertile land alone for human use, or the one that doesn't? The one that carefully manages its waste products to keep humans safe, or the one that doesn't?

Or, for an example of what something monstrously more competitive than you can do to you, see the comparison between humans and other great apes:

en.wikipedia.org/wiki/Hominida

One of the numbers on that chart is not like the others. And we didn't even mean to do that, not really! We were just hungry, and competitive, and trying to get ahead. And they were in the way.

And don't think it's just automation that we have to worry about. Can you compete with Apple? Microsoft? China, Brazil? Of course not. These entities are vastly more powerful than any human being. Now, for the moment they're forced to give at least some care to your life, and not crush you underfoot. But will it stay that way? Or will they tear free of their chains and stumble blindly towards being more competitive, destroying everything we care about underfoot in the process? I don't know. But, unless you're one of those rare people who doesn't want to complain about either corporations or government, you probably already share at least some of my concerns on the power of these entities. Now just imagine those concerns magnified, and the restraints discarded, and I think you'll see what I'm getting at.

Of course, it's not all bleakness. We do restrain competition, all the time. That's basically the entire point of having laws and regulations. The government's entire job, really. A lot of culture, too. And, for the most part, these things actually do a good job. You can mostly go to sleep without having your neighbor shank you, or your house inundated with toxic sludge, or your possessions seized by the government. Mostly. But I'm concerned this might not last.

Like, on the topic of automation, obviously it's an ongoing process. First you go from picking wheat by hand, to picking it with a sickle, etc, all the way up to a combine harvester. And as this trend continues, there's a trade-off: the worker goes from getting all the produce of their labor but not producing very much, to getting a smaller portion of their produce but producing much more. But, this is a curve. Eventually, the decrease in portion outweighs the increased production, and the curve trends down. But you can't just go back - competitiveness is based mostly on production, so it will follow where that increases.
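
To make that curve concrete, here's a quick toy sketch in Python. Everything in it is made up for illustration - exponential productivity growth, a logistic decline in the worker's share - it's just to show how "share times output" can rise for a long while and then tip over.

```python
import math

# Toy sketch only: made-up functional forms, not real data.
# Assumes output per worker grows exponentially while the worker's share of
# that output declines along a logistic curve as automation deepens.

def worker_take(t, growth=0.03, share_midpoint=50, share_steepness=0.1):
    output = (1 + growth) ** t                                          # total production
    share = 1 / (1 + math.exp(share_steepness * (t - share_midpoint)))  # worker's cut
    return share * output

for year in range(0, 101, 10):
    print(f"year {year:3d}: worker's take = {worker_take(year):.2f}")
# Under these assumptions the take peaks around year 40 and falls after that,
# even though total production keeps growing the whole time.
```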

I worry it's going to be the same across our civilization. Trends that once went up will go back down. All these things that we take for granted, despite being so new - civil rights, the rule of law, due process, a decent standard of living - they will disappear as shockingly as they appeared. We will return to despotism, and savagery, and eventually extinction. Our machines will march on without us, glorious and impressive and pointless. "Meaningless gleaming techno-progress burning the cosmos." "A Disneyland with no children." You get the idea.

To put it simply, I'm not just worried that we're on the wrong path. I'm worried that no other path is possible. That, like water running downhill, we will be drawn inevitably towards causing our own extinction, or at least towards a mass die-off, civilization-wide collapse, and a much less friendly world. This happens to other species all the time. Ever heard of the Oxygen Catastrophe? Some of the first microbes to perform photosynthesis and release oxygen found it very competitive to do so. So competitive, they released enough of it into the atmosphere to poison themselves, and make the planet forever less hospitable to their kind.

en.wikipedia.org/wiki/Great_Ox

Are we smart enough to avoid doing the same? To curb our competitive tendencies? Maybe. We've tried to do that with government and culture. But of course, each of those is another arena in which to compete. It's still been a step forward, so far. But can we keep it up forever? We're already failing with global warming, which should be easy. We had warning all the way back in 1896, more than a hundred years ago, and we still haven't managed to figure out how to deal with it.

lenntech.com/greenhouse-effect

What are we gonna do if we're faced with an even more difficult problem, where we only have a few years to figure out a solution? I dunno. I'm not sure there's anything we will be able to do. I guess we'll see.

Slatestarcodex, Right Libertarianism 

Scott wrote an article recently that, on one hand, is very right, but on the other hand has a couple of big flaws: slatestarcodex.com/2019/04/30/

First, he mentions in the beginning that "Consolidation among wholesalers has led to the creation of three buying consortium behemoths that purchase 90 percent of the generic pharmaceutical products in the United States," and "These “monster” buyers have squeezed manufacturers on prices", which seems like a classic example of a monopsony, and an important detail that should have been looked into further - yet he completely ignored it for the rest of the post.

Second, I feel like skipping straight from "These regulations are bad" to "All regulations are bad" is a bit of a jump. It's like, imagine someone started selling a new model of car, and it turned out these cars had engines that exploded. Some evidence suggests that these engines may have been specifically designed to explode on purpose. Then, in the ensuing debate, everyone spent all their time arguing about whether or not cars should have engines, based purely on this one example.

I feel that's where we are in our political discussions, and I hate it. And I mean, hey, maybe cars shouldn't have engines! Maybe they should all go electric, or we should use trains or something. That's a point worth considering. But we should do so based on ALL the evidence, not just this one exploding engine.

On the flip side, of course, it is important to consider that engines can be badly or maliciously designed, and that regulations may be bad, either because they were designed to support a monopoly, or because they were just poorly designed. This doesn't mean they should all be gotten rid of, but it does mean that some of them should be, or should be replaced.

Essentially, I'm arguing for more nuanced positions in our politics, and I'm annoyed that Scott didn't do the same. -_-

-, Resource Usage 

Now, there is one flaw here, which is that space does technically expand, and at an exponential rate, too. So, if that continues, then under the given scenario we could, actually, keep up with our exponential growth rate. But, let's assume that the store of dark energy, whatever it is, eventually is exhausted and space stops growing exponentially. That returns us to not being able to keep up with our exponential growth and its implications. Or, we could just not be able to invent all the crazy stuff I talked about in the first post. :/

-, Resource Usage 

So, I was thinking. Let's suppose we find a way to generate free energy, right? As much as we want. And we find a way to turn this energy into matter any way we want. And we find a way to dispose of the waste heat thus generated, and also invent anti-gravity to prevent us being crushed under the weight of our own accumulated stuff. Then, the only limiting resource left to us is space, which, assuming no FTL, we can only claim at a polynomial rate - the volume within reach after t years is roughly a sphere of radius ct, so it grows like t cubed. Which is still not enough of a growth rate to keep up with our current exponential growth in resource usage... :/
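
If you want the arithmetic spelled out, here's a rough back-of-the-envelope sketch in Python. The 2% annual growth figure is just an assumed placeholder and the model is deliberately crude (a sphere expanding at lightspeed versus fixed-percentage growth); the point is only that any exponential eventually outruns any polynomial.

```python
import math

# Rough back-of-the-envelope sketch, not a prediction. Assumptions:
#   - no FTL, so the volume within reach after t years is a sphere of radius
#     c*t, i.e. it grows like t**3 (polynomial);
#   - resource use grows at a fixed 2% per year (an illustrative number only).

def reachable_volume(t_years):
    """Volume of the reachable sphere after t years, in cubic light-years."""
    return (4 / 3) * math.pi * t_years ** 3    # radius = t light-years

def resource_use(t_years, annual_growth=0.02, initial=1.0):
    """Resource use growing at a fixed percentage per year."""
    return initial * (1 + annual_growth) ** t_years

for t in (10, 100, 1_000, 3_000, 10_000):
    print(f"t = {t:6d} yr | reachable volume ~ {reachable_volume(t):.2e} | use ~ {resource_use(t):.2e}")
# The cube wins early on, but somewhere between one and two thousand years the
# 2%/yr exponential blows past it and never looks back.
```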

Fuck Capitalism 

...Man, here's a crazy idea. Pretty dickish, too. Not necessarily a good idea, and a bit of an escalation. Might expose legal risk, too. But, it's possible.

Basically, a system to organize grassroots campaigns to address this kind of thing. At its simplest, that could just mean a shit-ton of people all calling the help desk line to say, "Hello, I have heard some concerning things about the company you work for and would like to express my concerns, could you please pass this on to your management?", or at a more complicated level, making publications and carrying out negative marketing campaigns against them. Now, this is not something to do lightly. Anything powerful is dangerous and capable of escaping your grasp and causing errant destruction. Hence, if this idea is powerful, then it will be dangerous, and will potentially play into all sorts of dangerous currents and dynamics. If it is to be done, it should be thought out responsibly. So, please don't do it without first giving me a chance to tell you you're doing it wrong and warn you of the dangers, okay? XD

But, I think it might be worth thinking about and enacting. We'll see. :/

Fuck Capitalism 

So I had that whole thing with Appliance Warehouse buying the company I had previously rented my washer and dryer from, and which I had paid for an entire year up front, and then sending me messages saying I owed them money, right? And I called, and talked to them, and was told to wait while their financial department looked it over, and then called back, and then waited again, then called a third time, and they agreed that I did not owe them money. Now I have another letter from them, saying I owe them money and the bill is past due. So, I'm gonna try calling again tomorrow. But, I'm also thinking of straight up lodging complaints with the Better Business Bureau and/or suing them. Anyone have any advice on the subject? Should I just threaten them with that and see if it clears things up, or do it without telling them first, or what? :/

I know it's disproportionate and it would be way easier to just pay the money, but by Jove, somebody needs to stand up against this kind of thing, and I'm mad enough to fucking do it. -_-

Besides, if I let them get away with this, who's to say they won't come back demanding more money? I'd rather fight from the start. :/

Accidentally wore the "My dog is smarter than the president" shirt my mom and sister gave me for Christmas. Old men in Trump hats offended: 1+.

:/

Ethics of technological advancement? 

Is it right to share a new mathematical idea with theoretically extensive but unknown practical applications? Like, if you have an idea for some bit of math that could be used for everything from chemistry to social engineering to A.I. weaponry, is it right to share it? I mean, it could potentially be used to make things much worse, but it could also be used to make things much better. And of course, you might be inflating the whole thing - maybe the idea won't lead anywhere anyway.

I guess you pretty much just have to estimate the results, right? Chance of good times magnitude of good, versus chance of bad times magnitude of bad? :/
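
For what it's worth, that estimate is just an expected value. Here's a minimal sketch with completely made-up placeholder numbers; the real difficulty is that none of these inputs are actually knowable in advance.

```python
# Minimal sketch of the estimate above, with completely made-up numbers.
# A positive result leans towards sharing, a negative one towards holding back.

def expected_value_of_sharing(p_good, magnitude_good, p_bad, magnitude_bad):
    return p_good * magnitude_good - p_bad * magnitude_bad

# Placeholder guesses, purely for illustration:
print(expected_value_of_sharing(p_good=0.6, magnitude_good=10,
                                p_bad=0.1, magnitude_bad=40))  # 2.0, so it leans "share"
```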

General recommendation: You should buy this book and try meditation.

Yes, you. I've been meditating based on instructions from Culadasa's The Mind Illuminated, and I'm super impressed. Highly recommended, you should try it.

I will note, though, you should probably get the book and read it first. There are a lot of very important details - I never knew meditating could be so complicated. For example, he recommends that the core exercise should be to focus attention on the meditation object (usually the breath) while simultaneously trying to expand your peripheral awareness - trying to stretch the mind in two directions, basically. He also has a lot of insights to share into how the mind works. (Like the fact that attention and awareness are separate things and that you can have both at once and use them differently, for example.)

Predictive Processing, Meditation, The Mind Illuminated 

"In fact, while it may not be obvious, all our achievements originate from intentions. Consider learning to play catch. At first your arm and hand just didn't move in quite the right way. However, by sustaining the intention to catch the ball, after much practice, your arm and hand eventually performed the task whenever you wanted. "You" don't play catch. Instead, you just intend to catch the ball, and the rest follows. "You" intend. And thr body follows."

This quote takes on a fascinating new light if we consider predictive processing, which says much the same thing - we control our bodies by changing our predictions of where they'll be, and letting the rest of the brain minimize prediction error by figuring out the details and adjusting things to match. :/
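
If it helps, here's a toy control loop in Python that illustrates the flavor of the idea (it is emphatically not a model of the brain): fix a "predicted" target state, then let a simple error-minimizing loop pull the actual state toward it. "You" set the prediction; the loop does the rest.

```python
# Toy illustration of the flavor of the idea only, not a model of the brain.
# "Control" here means fixing a prediction (the target) and letting a simple
# error-minimizing loop move the actual state toward it.

def act_to_fulfill_prediction(actual, predicted, gain=0.3, steps=20):
    trajectory = [actual]
    for _ in range(steps):
        error = predicted - actual   # prediction error
        actual += gain * error       # "the rest follows": shrink the error a bit
        trajectory.append(actual)
    return trajectory

# The hand starts at position 0.0; "intend" it to be at 1.0 (the catch point).
path = act_to_fulfill_prediction(actual=0.0, predicted=1.0)
print([round(p, 3) for p in path])   # steadily converges toward 1.0
```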

Politics, Discourse, History, Slate Star Codex 

G. K. Chesterton was an interesting character. He was a conservative, even a regressive, and yet he gives us passages like this:

"Now the whole parable and purpose of these last pages, and indeed of all these pages, is this: to assert that we must instantly begin all over again, and begin at the other end. I begin with a little girl’s hair. That I know is a good thing at any rate. Whatever else is evil, the pride of a good mother in the beauty of her daughter is good. It is one of those adamantine tendernesses which are the touchstones of every age and race. If other things are against it, other things must go down. If landlords and laws and sciences are against it, landlords and laws and sciences must go down. With the red hair of one she-urchin in the gutter I will set fire to all modern civilization. Because a girl should have long hair, she should have clean hair; because she should have clean hair, she should not have an unclean home: because she should not have an unclean home, she should have a free and leisured mother; because she should have a free mother, she should not have an usurious landlord; because there should not be an usurious landlord, there should be a redistribution of property; because there should be a redistribution of property, there shall be a revolution. That little urchin with the gold-red hair, whom I have just watched toddling past my house, she shall not be lopped and lamed and altered; her hair shall not be cut short like a convict’s; no, all the kingdoms of the earth shall be hacked about and mutilated to suit her. She is the human and sacred image; all around her the social fabric shall sway and split and fall; the pillars of society shall be shaken, and the roofs of ages come rushing down, and not one hair of her head shall be harmed."

:/

slatestarcodex.com/2014/12/25/

Dangerous levels of nerdiness Re: Life philosophy 

Hmm. What are life's skill trees? What are the best skills? What builds would you recommend? What equipment would you say is the most necessary? :/

Personally I'd say the knowledge skill tree is super good, albeit most of the stuff on there won't pay off until you're well down the tree. Though it is possible to snag some good stuff cheap, you just really need to know what you're doing.

The morality skill tree is a bit of a puzzle but I think there might be some good stuff there. Some people harsh on it but I think it's actually underrated.

Physical fitness is also really good and I'd recommend a moderate investment. Beyond that, whatever best suits the rest of your build.

Practical skills, also. Really, most of the trees should get at least a minimum investment. Some games you want to max out a single tree and ignore the others... not this one, so much, I think. :/

@anthracite
Hey what's the follow limit on Dragon.style? I ran into it on Ac.p and they can't raise it. Thinking of migrating here for everything, if it's higher. :/
