The only thing I didn't understand is the part about garbage collection. What does garbage collection mean and what is the benefit in this implementation?
It means the programming language has automatic memory management: the runtime reclaims memory that is no longer reachable, so you never free it by hand. Look up "garbage collection" for all the details in the world.
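To make that concrete, here's a minimal sketch in Python (itself a garbage-collected language). The `Node` class and attribute names are just for illustration; the point is that even a reference cycle, which plain reference counting can't reclaim, gets freed automatically by the cycle collector:

```python
# In a garbage-collected language, memory is reclaimed automatically once
# an object is no longer reachable. CPython uses reference counting plus
# a cycle collector; this sketch just observes unreachable objects being freed.
import gc

class Node:
    def __init__(self, name):
        self.name = name

# Create a reference cycle that reference counting alone cannot reclaim.
a = Node("a")
b = Node("b")
a.partner = b
b.partner = a

# Drop the last external references; the cycle is now garbage.
del a, b

# The cycle collector finds and frees the unreachable cycle.
print(gc.collect() >= 2)  # at least the two Node objects were found unreachable
```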
The reason I brought it up is because it's a common trope that you can't do low-level programming with GC, and systems like Oberon are direct evidence to the contrary.
Exactly. I see the question on Stack Overflow all the time and just keep dropping a whole list of counterexamples. Too many myths in IT that hold mainstream back.
Hmmm, if you don't know what garbage collection is (and I don't mean this as a sneer), how are you sure that you understood everything else said?
In the sense that garbage collection is a pretty basic CS term, and the summary contained lots of CS terms and implied knowledge to understand it (modules, interfaces, persistence, etc.).
Actually, as a self-taught developer who started with Python and never had to learn about memory management, "garbage collection" was a concept I only became familiar with after a few years of reading gradually more & more material, usually from HN. The other stuff in the summary above would have either made sense or been "guessable" to 3-years-ago-me.
Garbage collection was never really anything I recall being discussed in my Cmpt Science curriculum, back in 1990-1993, though there was lots of interesting theory about memory allocation/deallocation/defragmentation. It wasn't until I ran into Java around 2000 or so that people started talking about Garbage Collection.
I'm not saying that it didn't exist; it was just less of a priority 25 years ago than 15 years ago.
Everyone has different experiences and interests in college, but
I recall Garbage Collection as an important subject decades before 2000.
I studied CS in the 1970s (and 1980s) and garbage collection was already an
important subject. It was a part of the ACM recommended CS curriculum
as far back as 1968. Google Scholar reveals thousands of results for
"garbage collection algorithm" before 1990. The subject was studied by
great computer scientists (Lamport, Dijkstra, and Liskov are all Turing
award winners) and the results were fun and interesting:
1989
AW Appel: "Simple generational garbage collection and fast allocation"
1988
JF Bartlett: "Compacting garbage collection with ambiguous roots"
J Crammond: "A garbage collection algorithm for shared memory parallel
processors"
1987
AW Appel: "Garbage collection can be faster than stack allocation"
DI Bevan: "Distributed garbage collection using reference counting"
1986
B Liskov, R Ladin: "Highly available distributed services and fault-tolerant
distributed garbage collection"
1985
J Hughes: "A distributed garbage collection algorithm"
1983
H Lieberman, C Hewitt: "A real-time garbage collector based on the lifetimes
of objects"
1982
RJM Hughes: "A semi-incremental garbage collection algorithm"
1981
J Cohen: "Garbage collection of linked data structures"
1977
HC Baker Jr, C Hewitt: "The incremental garbage collection of processes"
1976
Douglas W. Clark: "An efficient list-moving algorithm using constant workspace"
PL Wadler: "Analysis of an algorithm for real time garbage collection"
1975
Guy L. Steele, Jr.: "Multiprocessing compactifying garbage collection", CACM
Edsger W. Dijkstra, Leslie Lamport, et al.: "On-the-fly garbage collection:
an exercise in cooperation"
1974
Gary Lindstrom: "Copying list structures using bounded workspace"
1973
Edward M. Reingold: "A nonrecursive list moving algorithm"
1972
H. D. Baecker: "Garbage collection for virtual memory computer systems"
Ben Wegbreit: "A space-efficient list structure tracing algorithm"
1969
Robert R. Fenichel, Jerome C. Yochelson: "A LISP garbage-collector for
virtual-memory computer systems"
Joseph Weizenbaum: "Recovery of reentrant list structures in SLIP"
1967
H. Schorr, W. M. Waite: "An efficient machine-independent procedure for
garbage collection in various list structures"
Peter J. Denning: "The working set model for program behavior"
1963
Daniel J. Edwards: "Secondary Storage in LISP"
The Communications of the ACM, during this period, was the most read
CS Journal. Articles appearing there were intended to be of interest
to the widest population of computer scientists and 19 of them were on
garbage collection.
LISP, APL, LOGO, ML, Prolog, CLU, Scheme, Cedar, Smalltalk, Icon, SML,
Mathematica, J, Haskell, Python, Dylan, Self, Lua and Javascript are
all important, garbage-collected programming languages that came
before the late 1990s.
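Many of the tracing collectors in the list above share one core idea: start from a set of roots, mark everything reachable, and sweep away the rest. A toy mark-and-sweep sketch in Python (the `Obj`/`Heap` names are my own, not from any of the papers):

```python
# Toy mark-and-sweep collector: objects are nodes holding references to
# other objects; anything not reachable from the roots is reclaimed.

class Obj:
    def __init__(self):
        self.refs = []      # outgoing references to other objects
        self.marked = False

class Heap:
    def __init__(self):
        self.objects = []

    def alloc(self):
        o = Obj()
        self.objects.append(o)
        return o

    def collect(self, roots):
        # Mark phase: depth-first traversal from the roots.
        stack = list(roots)
        while stack:
            o = stack.pop()
            if not o.marked:
                o.marked = True
                stack.extend(o.refs)
        # Sweep phase: keep marked objects, discard the rest.
        live = [o for o in self.objects if o.marked]
        freed = len(self.objects) - len(live)
        for o in live:
            o.marked = False  # reset marks for the next collection
        self.objects = live
        return freed

heap = Heap()
a, b, c = heap.alloc(), heap.alloc(), heap.alloc()
a.refs.append(b)                 # a -> b is reachable; c is garbage
print(heap.collect(roots=[a]))   # frees 1 object (c)
```

Generational, incremental, and concurrent collectors (Appel 1989, Hughes 1982, Dijkstra/Lamport 1975 above) are all refinements of this basic reachability scheme.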
There may not be an alternative, but that doesn't mean it isn't snake oil. (It doesn't mean it is, either; those two things are independent.)
But what he's saying has a good point, whether or not textio is snake oil: it's better to have numbers other people can compare to, even if the method for determining those numbers isn't made public.
I don't think either will be a catalyst for anything significant. There are just craploads released every day, and it's unlikely anyone will notice yours unless you, your startup, or your investors are notable enough that journalists are specifically watching for them. It definitely won't be worth paying for.
"'I could be arrested for this action,' Ladar Levison told NBC News about his decision to shut down his company, Lavabit LLC, in protest over a secret court order he had received from a federal court that is overseeing the investigation into Snowden."
http://www.rollingstone.com/music/news/lawsuit-apple-delete-...