Technologists in particular, taken as a group, have a very specific philosophical outlook that they don't tend to interrogate in themselves because it's so pervasive and intrinsic to what they work on and how they do so. Fish unaware of water, so to speak. It's a set of assumptions that make sense when you're programming software, but break down when applied to other things in the real world.
They tend to assume the universe is deterministic.
They tend to assume (incorrectly) that because it's deterministic, a good enough model will be able to predict or explain it.
They tend to ignore, or not even be aware of, the inherent bias toward data that is available and measurable, or the assumption that what we can measure captures the essential dimensions of a phenomenon.
The most naive tend to assume that given enough data, a model will get better, that the noise will "average out" (it doesn't; the sketch after this list shows why).
I don't have a good name for this, but it has all the trappings of a good -ism otherwise.
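To make the "averaging out" point concrete, here's a minimal Python sketch. The numbers (a truth of 10, a systematic bias of 2, noise with standard deviation 5) are invented for illustration, not taken from anything above:

```python
import random

TRUTH = 10.0
BIAS = 2.0  # systematic error baked into how we measure (assumed for illustration)

def measure():
    # each observation = truth + fixed bias + zero-mean noise
    return TRUTH + BIAS + random.gauss(0, 5.0)

for n in (10, 1_000, 100_000):
    avg = sum(measure() for _ in range(n)) / n
    print(f"n={n:>7,}: sample mean = {avg:.3f}  (truth is {TRUTH})")
```

The noise really does average out: the variance of the mean shrinks like 1/n. But the estimate converges to roughly 12, not 10. More data buys you a more precise answer to the wrong question; it does nothing about a bias in what you measure.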
Beyond philosophy, they should study art, music, literature, and whatever else interests them. They should spend time with people who do the same, not only with people who work in technology. Unfortunately, college CS programs have increasingly cut general education requirements in favor of questionably useful skills training, leaving graduates in a state where this seems daunting.
Computer scientists are building the world we all have to live in. Is it so much to ask that they be educated in the humanities before they're turned loose to do so?
I mean, I think that the universe is close enough to deterministic at a macro scale, but even in a deterministic world you can still get smashed in the head by a falling brick with zero warning. Whether the sequence of events that led to the brick falling is deterministic or random is irrelevant to your ability to predict its descent. You only have a feeble human brain, a soft pink organ that's incapable of holding more than five to seven items in its working memory even before the collision.
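A toy way to see this, using the logistic map (a standard textbook chaos example, not anything from the thread): the update rule is fully deterministic, yet two starting points that differ by one part in a billion end up nowhere near each other.

```python
def step(x):
    # deterministic update; r = 3.9 puts the logistic map in its chaotic regime
    return 3.9 * x * (1.0 - x)

a, b = 0.500000000, 0.500000001  # differ by 1e-9
for _ in range(60):
    a, b = step(a), step(b)
print(f"after 60 steps: a={a:.6f}, b={b:.6f}")
```

Same rule, essentially the same start, wildly different outcomes. Determinism buys you no predictive power unless your measurements (and your soft pink organ) are effectively perfect.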