
I admit I'm one of those students who never used Racket in a non-academic setting (but mostly because I needed to contribute to already-existing projects written in different languages), and I was taught Racket by one of its main contributors, John Clements at Cal Poly San Luis Obispo. However, learning Racket planted a seed in me that would later grow into a love of programming languages beyond industry-standard imperative ones.

I took a two-quarter series of classes from John Clements: the first was a course on programming language interpreters, and the second was a compilers course. The first course was taught entirely in Racket (then called PLT Scheme, using the DrScheme environment). As a guy who loved C and wanted to be the next Dennis Ritchie, I remember hating Racket at first, with all of its parentheses and feeling restricted by immutability and needing to express repetition using recursion. However, we gradually worked our way toward building a Scheme meta-circular evaluator. The second course was language-agnostic. Our first assignment was to write an interpreter for a subset of Scheme. We were allowed to use any language. I was tired of Racket and wanted to code in a much more familiar language: C++. Surely this was a sigh of relief, right?

It turned out that C++ was a terrible choice for the job. I ended up writing a complex inheritance hierarchy of expression types, which could have easily been implemented using Racket's pattern matching capabilities. Additionally, C++ requires manual memory management, and this was before the C++11 standard with its introduction of smart pointers. Finally, I learned how functional programming paradigms make testing so much easier, compared to using object-oriented unit testing frameworks and dealing with mutable objects. I managed to get the project done and working in C++, but only after a grueling 40 hours.

I never complained about Racket after that.

In graduate school, I was taught Scala and Haskell by Cormac Flanagan, who also contributed to Racket. Sometime after graduate school, I got bitten by the Smalltalk and Lisp bugs, hard. Now I do a little bit of research on programming languages when I'm not busy teaching classes as a community college professor. I find Futamura projections quite fascinating.

I'm glad I was taught programming languages by John Clements and Cormac Flanagan. They planted seeds that later bloomed into a love for programming languages.



To be fair, "write an interpreter for a subset of Scheme" is a core use case for Lisp-family languages.

If it had been "write a real-time driver for a memory-limited piece of hardware", you might have had a different preference.


Guile is GNU's extension language, and a Scheme.

It is meant for low level programming, like how it is used inside GDB.

Or high level, like how it is used in Make or Google's schism.

If you want memory-limited, then you can turn to uLisp [0] without really changing the dev experience.

[0] http://www.ulisp.com/


that's an often-repeated misconception about lisps.

lisps are pretty good at low-level programming, but then you'll need to make some compromises like abandoning the reliance on the GC and managing memory manually (which is still a lot easier than in other languages due to the metaprogramming capabilities).

there are lisps that can compile themselves to machine code in 2,000-4,000 LoC altogether (i.e. compiler and assembler included; https://github.com/attila-lendvai/maru).

i'm not saying that there are lisp-based solutions that are ready for use in the industry. what i'm saying is that the lisp language is not at all an obstacle for memory-limited and/or real-time programs. it's just that few people use them, especially in those fields.

e.g. i'd easily prefer a lisp to put together a specialized byte-code interpreter to shrink firmware size for small embedded devices (e.g. for a radio https://github.com/armel/uv-k5-firmware-custom/discussions/4...).

and there are interesting experiments for direct compilation, too:

BIT: A Very Compact #Scheme System for #Microcontrollers (#lisp #embedded) http://www.iro.umontreal.ca/~feeley/papers/DubeFeeleyHOSC05.... "We demonstrate that with this system it is clearly possible to run realistic Scheme programs on a microcontroller with as little as 3 to 4 KB of RAM. Programs that access the whole Scheme library require only 13 KB of ROM." "Many of the techniques [...] are part of the Scheme and Lisp implementation folklore. [...] We cite relevant previous work for the less well known implementation techniques."

BIT inspired PICOBIT (last changed in 2015): https://github.com/stamourv/picobit

racket (only a .so into an already running VM): http://download.racket-lang.org/docs/5.1.3/html/raco/ext.htm...

scheme: gambit, chicken


People always point this out as a failure, when it is the contrary.

A programming language being managed doesn't mean we need to close the door to any other kind of resource management.

Unless it is something hard real-time (and there are options there as well), we get to enjoy the productivity of high-level programming while at the same time having the tools at our disposal to do low-level systems work, without having to mix languages.


C++ is one of my favourite languages, and I got into a few cool jobs because of my C++ knowledge.

However, given the option I would mostly reach for managed compiled languages as a first choice, and only turn to something like C++ if really, really required, and even then probably as a native library that gets consumed, instead of 100% pure C++.


I didn't know you liked C++. I've been reading your posts for a few years now, including your advocacy of the Xerox PARC way of computing. I've found that most Smalltalkers and Lispers are not exactly fond of C++. To be fair, many Unix and Plan 9 people are also not big C++ fans, despite C++ also coming from Bell Labs.


Back when C++ was becoming famous, my favourite programming language was Object Pascal, in the form of Turbo Pascal, having been introduced to it via TP 5.5 OOP mini booklet.

Shortly thereafter Turbo Pascal 6 was released, and I got into Turbo Vision, followed by Turbo Pascal for Windows 1.5 on Windows 3.1, the year thereafter.

I was a big Borland fan, thus when you bought the whole stack it was Object Pascal/C++; naturally C was there just because all C++ vendors started as C vendors.

In Windows and OS/2 land, C++ IDEs shared a lot with Smalltalk and Xerox PARC ideas in developer experience; it wasn't the vi + command line + "debuggers are for the weak" kind of experience.

See Energize C++, as Lucid was pivoting away from Common Lisp, with Cadillac being what we would call an LSP nowadays, where you could do incremental compilation at the method level and hot reload.

"Lucid Energize Demo VHS 1993"

https://www.youtube.com/watch?v=pQQTScuApWk

https://dreamsongs.com/Cadillac.html

Or VisualAge for C++ version 4, which introduced a database-backed, image-like system for doing C++ in workflows similar to Smalltalk.

https://www.edm2.com/index.php/A_Review_of_VisualAge_C%2B%2B...

https://www.edm2.com/index.php/VisualAge_C%2B%2B_4.0_Review

Then there is C++ Builder, still going on, even though the way Borland went down spoiled its market mindshare.

https://www.embarcadero.com/products/cbuilder

You're right: although C++ was born on UNIX at Bell Labs, there is that point of view, and it is also a reason why I always had much more fun with C++ across Mac OS, OS/2, Windows, BeOS, and Symbian, with their full-stack frameworks and IDE tooling.

However, with time I moved to managed application languages, where it is enough to make use of a couple of native libraries, if really required, which is where I still reach for C++.



