ardit33's comments | Hacker News

I have a 39" (almost 40") LG ultrawide, and it is the perfect size. Can't see how a larger monitor would fit a normal desk...

BUT... this is perfect for folks who want to use one monitor both for work and for entertainment/normal TV watching in a living room.


I am in Miami right now, and Verizon is not working here (I get the SOS sign).

Konami's PES (Pro Evolution Soccer), originally called Winning Eleven, has always had better gameplay. What it suffered from was a lack of licenses.

Now it is free and called eFootball... and obviously not the same, but PES has always been the fan favorite gameplay-wise.


They say to replace them with Banners, which are just a different style of "toast" that usually stays longer, or persists until the user takes an action.


No, it's different. A "toast" is a floating element, like a "push notification", detached from the panel; it can disappear quite quickly. A banner is placed closest to the context where the action was triggered, and it might require the user to close it. For users with a cognitive, visual, or motor disability, that makes the information easier to perceive. And when a user creates an "issue", the "issue item" is displayed in the list, so you don't need a secondary notification process.


The main problem with toasts is that they disappear with no hope of recovery before you get a chance to read them, obscuring other content in the process. Banners don't obscure other content, don't disappear without user action, and could theoretically have a message history.
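
To make the difference concrete, here is a minimal SwiftUI sketch (invented names, not GitHub's actual implementation): the banner sits inline and waits for a dismiss action, while the toast floats over content and removes itself on a timer.

    import SwiftUI

    // Hypothetical demo view; `NotificationsDemo` and both messages are invented.
    struct NotificationsDemo: View {
        @State private var toast: String?
        @State private var banner: String?

        var body: some View {
            VStack(spacing: 12) {
                // Banner: rendered inline, pushes content down instead of
                // obscuring it, and stays until the user dismisses it.
                if let banner {
                    HStack {
                        Text(banner)
                        Spacer()
                        Button("Dismiss") { self.banner = nil }
                    }
                    .padding()
                    .background(Color.yellow.opacity(0.3))
                }

                Button("Create issue") {
                    banner = "Issue created."
                    toast = "Issue created."
                }
            }
            // Toast: floats over other content and removes itself on a
            // timer, whether or not anyone read it.
            .overlay(alignment: .bottom) {
                if let toast {
                    Text(toast)
                        .padding()
                        .background(.thinMaterial, in: Capsule())
                        .task {
                            try? await Task.sleep(nanoseconds: 3_000_000_000)
                            self.toast = nil
                        }
                }
            }
        }
    }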


Oh, dang.... the great 2030 pandemic is coming...

I don't think this bodes well for the future. Both rats and bats have been huge vectors of disease. This is going to produce some kind of super-virus that causes another Middle Ages plague-like pandemic.


This probably isn't new behavior, simply something we're witnessing for the very first time.

We haven't observed orcas preying on moose or primates, but the former has plenty of supporting evidence and the latter has probably happened at some point.

In any case, zoonotic reservoirs likely slosh around a lot more than we think before spillover events.


Two-thirds of all human pathogens originated from zoonotic spillover!

https://pmc.ncbi.nlm.nih.gov/articles/PMC8182890/


We also pass on diseases to animals so it is a never-ending cycle.


I agree. This has probably been happening for millions of years. Bats often live in dense colonies in caves and tree trunks with small exits, making rat predation possible.

Some native Arctic peoples have traditions of killer whales eating people, although officially they have only killed people while in captivity. A person in a skin kayak would be easy prey for one.


Apple Silicon was started during the Steve Jobs era, in 2010. You're seeing the rewards now (well, starting in 2019) because it takes that long to produce a chip.


Apple Watch was also started under Jobs


Composition folks can get very dogmatic.

I have some data types (structs or objects) that I want to serialize and persist, and that have some common attributes and behaviors.

In Swift I can have each object conform to Hashable, Identifiable, Codable, etc... and keep repeating the same stuff over and over, or just create a base DataObject and have each specific data object inherit from it.

In Swift you can do it with both protocols and extensions of them, but after a while they start looking exactly like object inheritance, and nothing like composition.

Composition was preferred when many other languages didn't support object orientation out of the gate (think Ada, Lua, etc.) and tooling (IDEs) was primitive, but almost all modern languages do support it now, and the tooling is insanely good.

Composition is great when you have behaviour that can be widely different depending on runtime conditions. But when you keep repeating yourself over and over by adopting the same protocols, perhaps you need some inheritance.

The one negative of inheritance is that when you change some behaviour of a parent class, you need to do more refactoring, as there could be other classes that depend on it. But, again, with today's IDEs and tooling, that is a lot easier.

TLDR: Composition was preferred in a world where languages didn't support proper object inheritance out of the gate, and tooling and IDEs were still rudimentary.
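
A minimal sketch of the base-class approach described above (`DataObject` and `Note` are invented names, not from any real codebase):

    import Foundation

    // Base class carries the shared identity plus the Identifiable and
    // Hashable conformances, so model types don't restate them.
    class DataObject: Codable, Identifiable, Hashable {
        var id = UUID()

        init() {}

        static func == (lhs: DataObject, rhs: DataObject) -> Bool { lhs.id == rhs.id }
        func hash(into hasher: inout Hasher) { hasher.combine(id) }
    }

    final class Note: DataObject {
        var text: String

        init(text: String) {
            self.text = text
            super.init()
        }

        // Caveat (picked up by the reply below): Codable synthesis doesn't
        // cross class boundaries, so a subclass that adds stored properties
        // must hand-write its own coding.
        private enum CodingKeys: String, CodingKey { case text }

        required init(from decoder: Decoder) throws {
            let container = try decoder.container(keyedBy: CodingKeys.self)
            text = try container.decode(String.self, forKey: .text)
            try super.init(from: decoder)
        }

        override func encode(to encoder: Encoder) throws {
            var container = encoder.container(keyedBy: CodingKeys.self)
            try container.encode(text, forKey: .text)
            try super.encode(to: encoder)
        }
    }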


> In Swift I can have each object conform to Hashable, Identifiable, Codable, etc... and keep repeating the same stuff over and over, or just create a base DataObject and have each specific data object inherit from it.

But then if you need a DataObject with an extra field, suddenly you need to re-implement serialization and deserialization. This only saves time across classes with exactly the same fields.

I'd argue that the proper tools for recursively implementing behaviours like `Eq`, `Hashable`, or `(De)Serialize` are decorator macros, e.g. Java annotations, Rust's `derive`, or Swift's attached macros.
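
In Swift the struct route gets this for free: declaring the conformances is enough for the compiler to synthesize them from the stored properties, much like a `derive`. A small sketch with an invented `User` type:

    import Foundation

    // Synthesized conformances: add or remove a stored property and the
    // generated Hashable/Codable implementations follow along.
    struct User: Codable, Hashable, Identifiable {
        var id: UUID
        var name: String
        var email: String
    }

    let data = try JSONEncoder().encode(User(id: UUID(), name: "Ada", email: "ada@example.com"))
    let user = try JSONDecoder().decode(User.self, from: data)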


Yes, all behaviors should be implemented like definitions in category theory: X behaves like a Y over the category of Zs, and you have to recursively unpack the definition of Y and Z through about 4-5 more layers before you have a concrete implementation.


I'll be honest here. I don't know if any comment on this thread is a joke.

There are valid reasons to want each one of the things described, and I really need to add type reflexivity to the set here. Looks like horizontal traits are a completely unsolved problem, because every type of program seems to favor a different implementation of it.


    > The one negative of inheritance is that when you change some behaviour of a parent class, you need to do more refactoring, as there could be other classes that depend on it. But, again, with today's IDEs and tooling, that is a lot easier.
It is widely known as the "fragile base class" problem.
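
A minimal sketch of how that bites, with invented types: the subclass is written against v1 of the base class and silently breaks when v2 changes an internal call.

    class Counter {
        private(set) var count = 0
        func add(_ n: Int) { count += n }
        // v1 looped internally; v2 "helpfully" routes through add(_:).
        func addMany(_ ns: [Int]) { ns.forEach { add($0) } }
    }

    class LoggingCounter: Counter {
        private(set) var logged = 0
        override func add(_ n: Int) {
            logged += 1
            super.add(n)
        }
        override func addMany(_ ns: [Int]) {
            logged += ns.count
            super.addMany(ns)
            // Correct against v1. Once the base class routes addMany
            // through add(_:), every element is counted twice, and the
            // subclass broke without anyone touching its code.
        }
    }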

Another one is that there are cases where hierarchies simply don't work well: platypus cases, where something like an egg-laying mammal refuses to fit the taxonomy.

Another one is that inheritance hides where stuff is actually implemented, and it can be tedious to find out when you're unfamiliar with the code. It is very implicit in nature.

    > TLDR: Composition was preferred in a world where languages didn't support proper object inheritance out of the gate, and tooling and IDEs were still rudimentary.
I think this is rather a rewriting of history to fit your narrative.

Fact is, at least one very modern language that is gaining in popularity doesn't have any inheritance and seems to do just fine without it.

Many people still go about "solving" problems by making every noun a class, which is, frankly, a ridiculous methodology of not wanting to think much. This has been addressed by Casey Muratori, who formulated it approximately like this: making 1-to-1 mappings of things/hierarchies in the world to hierarchies of classes/objects in the code (https://inv.nadeko.net/watch?v=wo84LFzx5nI). This way of representing things in code has the programmer frequently adjusting the code and adding more specializations to it.

One silly example of this is the ever popular but terrible example of making "Car" a class and then subclassing that with various types of cars and then those by brands of cars etc. New brand of car appears on the market? Need to touch the code. New type of car? Need to touch the code. Something about regulations about what every car needs to have changes? Need to touch the code. This is exactly how it shouldn't be. Instead, one should be thinking of underlying concepts and how they could be represented so that they can either already deal with changes, or can be configured from configuration files and do not depend on the programmer adding yet another class.
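
For contrast, a data-driven sketch of the same domain (all names invented): new brands, body styles, or equipment rules become new data rather than new subclasses.

    import Foundation

    struct Car: Codable {
        var brand: String               // new brand on the market: new data
        var bodyStyle: String           // new type of car: also just data
        var requiredEquipment: [String] // regulation changes: edit the config
    }

    // Decoded from a configuration file instead of a class hierarchy.
    let json = """
    [{ "brand": "Acme", "bodyStyle": "hatchback",
       "requiredEquipment": ["ABS", "rear camera"] }]
    """
    let fleet = try JSONDecoder().decode([Car].self, from: Data(json.utf8))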

Composition over inheritance is actually something people realized after the widespread overuse of inheritance, not the other way around, and not because of language deficiencies either. The problems with inheritance are not merely previously bad IDE or editor support. The problem is that in some cases it is simply bad design.


SwiftUI is more realistic, actually. UIKit would be much tougher (it's more mature and more tied to the iOS ecosystem).


Highly doubt it; Safari on Windows ran on AppKit, and that thing is from the early 1990s. You'd be surprised how high-level UIKit actually is.


Source?


Don’t have links, but it’s true. iTunes for Windows also includes chunks of AppKit.

The Windows ports of AppKit in both likely trace their lineages back to Yellow Box, which was the Windows port of AppKit that Apple briefly made available prior to the release of OS X 10.0.


My understanding was Foundation and bits of CoreGraphics, but not AppKit. Yellow Box required DPS (Display PostScript).


UIKit is very mature, tied to the iOS ecosystem, and a bit more complex. SwiftUI is easier to port (since it is still an incomplete subset of UIKit's features).


Circular universe...? big bang -> expands -> expansion slows -> starts contracting -> singularity again -> big bang again

Roger Penrose seems to be leaning toward, or increasingly convinced of, a cyclic universe theory....

