EllieEffingMae's comments | Hacker News

If you're okay with one of these things but not the other, then you are applying a double standard based on a moralisation of sex. Sex as labour is no different than any labour, and there is no criticism of sex work that is whole and complete that does not also criticise the very system you're defending.


> Sex as labour is no different than any labour.

If that is true, then a thought experiment would be: should refusing to enter prostitution be grounds for losing jobseekers' welfare, e.g. Jobseeker's Allowance [1]?

[1] https://www.gov.uk/government/publications/jobseekers-allowa...


It's comparable to working at a butcher's that specialises in pork. Clearly, the job would be unsuitable for a Muslim or vegan jobseeker. That's why the government website specifies what counts as a suitable job and allows an exception when there's a good reason not to take one.

I agree with the general principle that not all labour is the same, but I don't think prostitution is in a class of its own. If you paid me enough I'd probably do it for a week and then retire.


> If you paid me enough I'd probably do it for a week and then retire.

Maybe this means that you don't really like your job? If you had access to more or less the same goods as everyone else around you, and liked what you were doing, then doing it for just a week stops making sense, doesn't it?


If I had millions of dollars I'd stop going to work 5 days a week for a company I don't own, yes. This seems like it'd be true of most people.


For me it would've been OK to work 6/5 (better, 6/4) for the rest of my life if it was actually something meaningful for society, if I had the opportunity to occasionally switch to something different, and if I got a stable life out of it. Even if I didn't like the job with all my heart.


> Sex as labour is no different than any labour,

In a capitalist society everything can be called labour, including selling people, their organs, and their genitals for sex.


Sex is very different. Last I checked, no matter how much construction work you do, you will never construct another person, even by accident.

Sex is not just another activity. Due to its effects of potentially creating a new human being, it is a category unto itself and deserves special treatment.

For example, would you sell your pancreas? Why not? Is it because a vital organ is not in the same class of goods as, say, a lightbulb?

The same is true of selling sex, which is not just some social interaction, but a social interaction that can literally make a new person.

Your kind of equivocation is morally lazy and conveniently abiological.


> For example, would you sell your pancreas? Why not? Is it because a vital organ is not in the same class of goods as, say, a lightbulb? [...] Your kind of equivocation is morally lazy and conveniently abiological.

Speaking of lazy, that's a pretty ridiculous comparison. Depriving yourself of a vital organ is not the same as renting out your genitals for a limited time.

"Making a new person" is also a complete red herring. You can hire a surrogate to carry a baby to term, which is also paying to use someone else's genitals to actually make a new person. The only meaningful difference is the absence of "sex", so I think it's clear what you really have a problem with.


> You can hire a surrogate to carry a baby to term

In the vast majority of first world countries this is illegal for precisely the reasons I described. Only extremely poor countries or barbaric jurisdictions, such as California, allow paid surrogacy.

> not the same as renting out your genitals for a limited time.

Given the genitals' ability to produce life that can last many years beyond yourself, you're absolutely right. Renting out your genitals is much worse.


> In the vast majority of first world countries this is illegal for precisely the reasons I described.

No, it's not. The potential to make humans is not the same as the intentional act of creating humans, just as the potential to commit murder is not the same as the intentional act of committing murder.

> Only extremely poor countries or barbaric jurisdictions, such as California, allow paid surrogacy.

You definitely need to update your list, because more US states are surrogacy-friendly than aren't.


There are a lot of people in this world who have sold their organs for money, or who have agreed to be infected with various diseases for money.


To address your question about "universal education advocates": they were not the reason we got it.

Their argument was a purely philosophical one: a well-informed society operates better, makes better decisions, etc., and a well-educated person can make better decisions for themselves and have a higher quality of life.

The reason universal education caught on was that large businesses realised they could have more efficient factories if they didn't have to teach everyone to read. Thus college and university became the place where the 'universal education' dream could be realised: where every person could go and receive an education that would afford them a higher quality of life and allow them to better engage with the world.

Then factories became more specialised, and 90% of universities became slaves to some "industry".

Many universities' "computer science" programs are hardly that. Rather, they've turned into a coding bootcamp with slightly more math.


Would you mind elaborating on exactly how many studies you think are engaging in "statistical trickery"?


"exactly how many"--I have no idea. I didn't mean to put those studies down, the statistical trickery can be quite cool! Hard to do right, though.


Regardless of your credentials, as far as I am aware the scientific consensus on this topic is that genetics is nearly impossible to separate from other causal factors, mainly ones for which we have clearer and more direct evidence of an influence on cognitive ability.

So, quite frankly, I don't care if you spend a lot of time thinking about it; it's irrelevant. The consensus is clear.


Yes, exactly like that!


If you would be so kind as to explain to me why I should believe you, a random person on the internet claiming to be a "scientist" who "spends a lot of time thinking" about this subject, over thousands of peer-reviewed research articles and case studies stating that even if genetics is a factor in intelligence, it is so closely tied to other causal factors that it is effectively impossible to control for, then I am happy to listen.

So please, I am begging you, tell me why I should believe you over everyone else.


That is not, in fact, what "thousands of peer-reviewed research articles" state. You might be confusing the % of variance we can predict from genetics (e.g., knowing your ATGCATAGCCGTAG code), which at present is maybe 4%(?), with what we can show is due to heritability (70-80%). We didn't even know about DNA when people started measuring this kind of stuff with twin studies in the early 20th century.

So, to reiterate, because I'm realizing that paragraph wasn't so clear: guessing how smart you are based on your genetic code is something we're only just learning how to do. We're getting better at it, though. Estimating how smart you are based on how smart your parents are is a totally different game that we've been playing for a century.
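
(For a concrete sense of how the twin-study side works: the classic back-of-the-envelope estimator is Falconer's formula,

    h^2 = 2 * (r_MZ - r_DZ)

where r_MZ and r_DZ are the trait correlations within identical and fraternal twin pairs. It leans on simplifying assumptions about shared environment, but notice it requires no DNA at all, which is exactly the distinction being drawn here.)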


Why shouldn't intelligence be inherited while other traits are? It makes little sense to me, especially because intelligence is so important for survival.


> "spends a lot of time thinking"

Why is this in quotes when it isn't a quote?


There was another comment I had just finished reading and I merged them in my head. I'll go back through and edit; sorry for any confusion.


I first encountered this poem on a pamphlet handed out at my uncle's funeral. It has remained one of my favourite poems of all time.

When I was younger it read as a sad poem about loving someone more than they love you. Now that I'm an adult, it feels more like what I experience when I save the energy to make dinner for my partner when they get home from work.


Could you expand on this more? I don't know a lot about GUI programming, but I'd be interested in learning what the current failings are within the desktop space.


Specifically for Linux, there is no ubiquitous standard with decent performance and a predictable look such as win32 or Cocoa.

Instead, developers rely on toolkits such as Qt or GTK to ensure compatibility, which are usually either bloated or ugly.


That's true for "Linux", but "Linux" on its own is not a desktop OS. If you target KDE then Qt is the ubiquitous standard with decent performance and a predictable look, same goes for GNOME with GTK. Both of them I would say are less "bloated" than win32 or cocoa at this point. And I don't have any comment on whether it's "ugly" or not :)


If you think that Qt is more bloated than Cocoa, I don't know what to say. The Objective-C runtime is so heavy and slow; good luck making it run on microcontrollers.


Not sure what you’re talking about. The ObjC runtime is fairly small. The basic piece you need for ObjC is just objc_msgSend, which is basically a fancy hash lookup written in assembly language. If you want ObjC on a microcontroller, you’d want to port this function to your target architecture. There are a few other components you “need”, but objc_msgSend is the key one.

You’d probably also want some form of malloc(), but that’s completely optional. There’s nothing in Objective C that says you have to allocate memory dynamically, or that you have to do it with malloc.

ObjC runtime has grown somewhat to include more features, but you don’t need all those runtime features if you want to run your code on a microcontroller. Just like you don’t need glibc if you want to run C. There is more than one runtime for Objective C you can choose, just like there is more than one runtime for C.


It's probably exactly that runtime... desktop devs have forgotten that embedded is tiny: megs, not gigs, of memory, so not one but 2-3 orders of magnitude smaller.

You can have all the 'simple' calls you like, but if you need to malloc half a gig of RAM just to get started... that is heavy and bloated.

Also, you say 'architecture', but unless you are talking about a battery-hungry cellphone you are probably talking about a 16- or 32-bit processor, not 64, which means a potentially massive increase in the size of the generated code, since you lose certain instruction sets. I don't know realistically what Obj-C uses in the OS X CPU architecture, but I'm almost certain it's not going to be as simple as just retargeting your compiler...

Not saying it can't be done, but take even the new ARM laptops from Apple: that's a significant hardware investment on top of a bunch of software tricks, not just a casual retargeting that can be portably moved to other low-power systems.


> I don't know realistically what Obj-C uses in the OS X CPU architecture, but I'm almost certain it's not going to be as simple as just retargeting your compiler...

Objective C is in mainline GCC. You retarget your compiler and port the runtime. As far as I know, any target with GCC support can run Objective C, given a runtime. The important parts of the runtime require kilobytes, not megabytes, of space.

Remember that Objective C was originally a very thin layer on top of C. You take all the [method calls] and replace them with calls to objc_msgSend(). The overhead is fairly small— your object instances need the “isa” pointer, which is basically just a pointer to a vtable. The objc_msgSend() function reads the vtable and forwards method calls to the correct implementation.
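
To make that concrete, here's a rough, hand-expanded sketch of that translation (simplified: the real compiler also emits selector references, method caching, and struct-return variants; "doSomething:" is just a made-up selector):

    #import <objc/runtime.h>
    #import <objc/message.h>

    // What you write:
    //     [obj doSomething:x];
    //
    // Roughly what it compiles down to:
    void send_it(id obj, int x) {
        SEL sel = sel_getUid("doSomething:");
        ((void (*)(id, SEL, int))objc_msgSend)(obj, sel, x);
    }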

My experience is that Objective C binaries typically have a much smaller footprint than C++ binaries. That’s obviously not some kind of rule, but it reflects idiomatic usage of Objective C. In C++ you'd use a std::vector<T> which is a templated class, which gets instantiated for every different T you use in your program. In Objective C, you’d use NSArray, which is monomorphic.

This all should be completely unsurprising—since Objective C first appeared in the 1980s, it’s no surprise that it doesn’t need much memory.


GCC doesn't do modern Objective-C, only what they got from NeXT days.


> GCC doesn't do modern Objective-C, only what they got from NeXT days.

It's definitely a lot more than the NeXT days. GCC got fast enumeration, exceptions, garbage collection (if you really want it), synthesized properties, etc. These are the "Objective C 2.0" features which got released in 2007, back before Apple was shipping Clang.

GCC doesn't have ARC, array / dictionary literals, or other new features. But it's definitely a lot more than Objective C from the NeXT days. These are "modern" Objective C. They're also basically just sugar for the appropriate calls to +[NSArray arrayWithObjects:...] or -[retain] / -[release] etc.
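
For example, a hand-desugared sketch (strictly, Clang lowers @[...] through +arrayWithObjects:count:, but it amounts to the same thing):

    #import <Foundation/Foundation.h>

    void literals_demo(void) {
        // Modern literal syntax (Clang only):
        NSArray *a = @[@"x", @"y"];

        // Roughly what it is sugar for; this spelling compiles on GCC too:
        NSArray *b = [NSArray arrayWithObjects:@"x", @"y", nil];
    }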


You mentioned a runtime though.

Debating whether some language feature can exist on an MCU is somewhat unrelated - the point was about code-size bloat, not whether you can fake one instruction set with another...

You can fake e.g. multiply with a for loop and an add instruction (1 or 2 cycles each), but that will run orders of magnitude slower than a 2- or 3-clock-cycle multiply instruction...
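
A minimal sketch of that trade-off (hypothetical soft_mul helper):

    /* Hypothetical: multiply by repeated addition on a core
       that has no hardware MUL instruction. */
    unsigned soft_mul(unsigned a, unsigned b) {
        unsigned r = 0;
        while (b--)
            r += a;    /* b iterations of add+branch vs. one 2-3 cycle MUL */
        return r;
    }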

So the point about the OS X architecture is not really addressed: if you have a runtime that cannot be easily ported, or an OS X CPU architecture that produces prohibitive instruction sequences, it doesn't matter whether GCC can target an old version of Objective-C. It is, as stated, not that simple.


Sugar or not, it doesn't change the fact that it won't compile modern Objective-C code.


Then use Clang, or port the code.


I don't understand the technical reasons behind it, but for whatever reason, doing something like resizing a window for any non-trivial GUI seems to cause repaints to drop to 15-20 Hz in Qt and spike the CPU usage massively. The same issue doesn't occur with Win32 or Cocoa.


You probably don't have graphics drivers installed...

IIRC the fancier toolkits (Qt, GTK) do widget acceleration with mandatory blitting and/or 3D (OpenGL/DirectX). Other frameworks like WPF and Cocoa are notorious for this too.

There are, or used to be, frameworks that were not GPU-dependent.

I've never heard of nor seen these massive drops you claim, so either you are running 20-year-old hardware, or your graphics acceleration isn't enabled, or you have some other hardware issue...


That's interesting; I have the exact opposite experience. I used to have a MBP dual-booted with Arch Linux, and resizing windows was much smoother on Linux with Dolphin, Nautilus or Thunar than with Finder on the Mac, which looked like it only drew 1/5th of the frames compared to Linux.


I do understand the technical reasons behind it, and it has nothing to do with bloat. It mostly has to do with mapping a cross-platform (i.e. generic) window abstraction onto a specific OS-provided drawing/event/windowing API.


I built my own cross-platform windowing/event API from scratch at my last job, based on top of win32 on Windows, Cocoa on Mac and Qt on Linux. The only platform where there were any noticeable performance drops at all was Linux, because it relied on Qt.

When I said "I don't understand the technical reasons behind it", I didn't mean "I don't understand how to abstract OS-native APIs". I meant "I don't understand how a team of intelligent people managed to fuck things up this badly".


I'm not sure it's a Qt issue: it could be a lack of hardware acceleration in the Linux GPU driver...


This is something I've observed cross-platform (Windows, macOS and Linux) across both nVidia and AMD machines with updated drivers. Actually try it.

Get your favourite Qt program, resize the window like a madman and watch your CPU usage launch into the stratosphere.


After a cursory look, there are some reports of bad things happening with inefficient QPainter usage, and some bugs related to text drawing, especially after Qt 5... it would be better to provide specific examples. I do rather suspect something machine- or even program-specific, though.


for "microcontroller class" whats the perofrmance level your looking at, something akin to a 68030 or so?


As do I! Crafting Interpreters is definitely one of the coolest programming books I've ever read. I used it as a blueprint for building more than one parser in my time :D


You can actually download all their articles as a PDF with a small yearly subscription. If that's not an option, I've had good luck with pandoc in the past.


I maintain a fork of a program that does exactly this! You can check it out here:

https://github.com/Lifesgood123/prevent-link-rot


My personal solution to this is using http://itty.bitty.site

You can write a lot on one page. And I wrote some bash functions to store the links when you close your browser, and then give you the option of opening one based on the title.

