Quite promising: clean-looking syntax, little overhead, and portability seem to be the driving motivations. The following features especially could promise adoption and momentum:
dynamic typing (not a fan myself, but for scripting: OK --- it would be great if all such languages had a statically and strongly typed option --- after all, the compiler/JIT then needs to guess/work much less)
higher order functions and classes
coroutines (via fibers)
closures
garbage collection
operator overriding
powerful embedding api
Very neat-sounding. Will have to keep watch. Not as low-level as Go (can become wordy!), not as "foreign" initially as Rust (what, no GC, borrowing, a borrow checker, what?), not (yet!) as "haphazard" as... the majority of shell-scripting/server-scripting/client/web-scripting approaches.
For simple apps, dynamic typing may be OK, but I'm always writing complex apps, and static typing just seems like a minimum required feature for any language I want to use for complex apps.
So I guess I'll stick with TypeScript or C# for writing my cross-platform apps. The ecosystems for both are much more robust anyway.
I think a problem is that complex apps are made with scripting languages. This nasty practice seems to have finally become widespread habit in earnest with Perl in the hey-days of CGI httpd, and then of course came along Python and JavaScript and here we are!
I agree that strongly & statically typed languages allow for and indeed yield more robust ecosystems and complex apps.
But scripting languages have their place, roughly at the level of bash/.sh/.bat scripts IMHO. The issue is that as LoB apps evolve from "simple one-off experiments" and "just a quick'n'dirty prototype to see if-and-how" to "mission-critical infrastructure", there never seems to be a good time for anyone to say "we stop here; let's port this to a more stringent language now, better today than too late".
Best would be if "scripting languages" actually had Haskell levels of type inference, even without the rich type system. That way there would be no type annotations (just like in scripting), BUT with a toggleable "scripting-like leniency" that can be turned off to make the type-checker brutal, or left on by default during prototyping/scripting to allow ambiguous / uninferable / defer-to-runtime-and-allow-unsafe-conversions RAD-style work. Plus a middle-ground setting where it's still lenient but emits console warnings flagging potentially troublesome sections for review.
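Python's optional typing already approximates this dial roughly: run nothing for full RAD leniency, plain `mypy` for the middle ground, `mypy --strict` for the brutal setting. A hedged sketch (the code itself runs identically under all three):

```python
# Gradual-typing sketch: annotations are optional, and the strictness
# lives in the tooling knob (no checker / mypy / mypy --strict),
# not in the source code itself.

def scale(xs, k):
    # Unannotated: fine for quick scripting; the checker stays quiet
    # or guesses, depending on its strictness setting.
    return [k * x for x in xs]

def scale_checked(xs: list[float], k: float) -> list[float]:
    # Same code, but `mypy --strict` will now reject misuse
    # before the program ever runs.
    return [k * x for x in xs]

print(scale([1, 2, 3], 2))             # [2, 4, 6]
print(scale_checked([1.0, 2.0], 0.5))  # [0.5, 1.0]
```

The point being that leniency becomes a project setting rather than a property of the language.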
Someone should create a whole new scripting language just for that! Any takers?
> I think a problem is that complex apps are made with scripting languages. This nasty practice seems to have finally become widespread habit in earnest with Perl in the hey-days of CGI httpd, and then of course came along Python and JavaScript and here we are!
Exactly, though the dynamic languages like Python did reveal some serious shortcomings with the "older" languages like C++ and Java. Which is why I'm so happy using TypeScript. It gives you the ability to layer static types on a language that still offers the flexibility and concise code that are unavailable in C++ and Java.
IMHO Haskell is not a bad "scripting language", whatever that means. My current project has lots of nice, composable blocks that get compiled into a library, and I use stack's scripting feature to just glue them together. Works for me.
Personally, I write games a lot, and I write apps with rich UI behaviors. Both have a lot of inherent statefulness and imperative behaviors that are, in my opinion, modeled better using a language that at least supports imperative structures. Both also benefit a lot from OO principles.
I also use functional concepts, and my code also ends up with lots of nice, composable blocks, but being in a language that forces functional concepts where they don't fit is a recipe for unneeded complexity and unintuitive code.
Functional isn't a silver bullet. There Is No Silver Bullet.
Naa ... without wanting to get into a religious war here: Haskell actually provides an imperative "mode" (aka "do-notation"). But I agree about the silver bullet thing ... I am/was a Java developer by trade and have written my fair share of Python and C/C++ code. Haskell just clicked with me after getting over the initial hurdles, and it is a good fit for the work I do (which is processing about 400k events per second from a custom sensor).
I wonder how it compares to Lua in terms of portability. In terms of features though I don't see that many advantages from your list, or did I get something wrong?
No you didn't per se, just thought these were features that can only help adoption / momentum of a scripting language, not that these are where this has all others "beat" =)
For a short moment, I hoped that this was about a revival of one of my favorite non-Turing-computable esoteric programming languages, but I guess that's of pretty low probability!
Unfortunately the original spec is long gone, but [1] at least mentions the gist:
Having tried this in the past with personal programming languages, I always found I needed the parentheses to disambiguate control flow statements. You end up with stuff like:
if foo*bar baz()
...being parseable as either of these:
if (foo) { *bar } baz()
if (foo*bar) { baz() }
That example can be solved with enough lookahead, but I think I found cases where it couldn't be (although I can't bring them to mind). Terminator tokens are another way to solve it, so you get Ada-style `if ... then ... end if`.
That's only if newlines are meaningful in the grammar. In C and related languages they are not, and where they are they tend to be a pain (the dreaded semicolon insertion...)
And even if you make newlines meaningful you only trade one pain for another, namely, you now need some kind of line continuation construct and lots of special cases where newlines are allowed...
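Python is a concrete example of that trade-off: newlines terminate statements, so the grammar needs both an implicit and an explicit continuation rule:

```python
# Newline-significant grammars need line-continuation constructs.
# Python has two:

# 1) Implicit: newlines inside (), [], {} are ignored.
total = (1 + 2 +
         3)

# 2) Explicit: a trailing backslash joins physical lines.
also = 1 + 2 + \
       3

print(total, also)  # 6 6
```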
Can't you just treat newlines like any other delimiter? Granted many tools assume newlines, but you could just not use those tools, or replace newlines with a placeholder so you can parse programs as a single line.
What you linked is about braces around the /body/ of an if statement.
What the GP poster was talking about was parentheses around the /conditions/ of an if statement.
Note that Rust (for example) doesn't require the latter.
In fact, if you always require braces (as in the post that you linked), then it's very easy to parse and read code which is lacking those parentheses on the condition.
i.e. `if (condition) x = x + 1` would be hard to read without parens.
but `if condition { x = x + 1 }` (with mandatory braces) is quite clear.
Ah, yeah. I didn't twig until you mentioned it, but... if ever you get tired of emacs vs vi religious wars, try switching to discussing whether 'bracket' refers to a parenthesis or a brace.
Given the context, I'm pretty sure the OP was using the 'parenthesis' meaning.
Says it supports Functional programming, but I don't really see functional features. Seems premature?
I see what the goals of Gravity are, but are there any reasons I want to use it?
Side note: not a huge fan of implicit self, i.e. class methods automatically referring to "y" as the instance variable "y". I realize c# and Java do it, but worrying about name elision at all is something I'm not a fan of. From an ergonomics perspective, it makes editor autocomplete features slightly worse, as a prefixed "this" or what have you can give it additional context vs "anything in scope"
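For contrast, Python is a mainstream example of the explicit-receiver style described above; a minimal sketch:

```python
class Counter:
    def __init__(self):
        self.n = 0  # instance state always lives on self

    def bump(self, inc=1):
        # `n += inc` here would raise NameError: Python never looks up
        # instance variables implicitly; the receiver is always spelled out.
        self.n += inc
        return self.n

c = Counter()
print(c.bump())   # 1
print(c.bump(2))  # 3
```

Editor tooling benefits exactly as described: typing `self.` scopes the completion list to the instance.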
It supports closures and higher order functions. The sample code doesn't show any of it.
It basically looks a lot like a subset of Swift designed to allow someone familiar with Swift to get going faster with Creo. Creo looks interesting mainly as a better integrated development experience than XCode provides.
Closures and higher-order procedures do not a functional language make. It has to be convenient to implement procedures that compute actual mathematical functions.
Assume there exists a (pure) function `f` and a list `xs`. Does the expression `map f xs == map f xs` always return `true`, or, at least, never return `false`?
By this criterion, both Haskell and ML are functional. If you attempt to compare values whose type doesn't support decidable equality testing, you get an error, either at compile time (Haskell, Standard ML) or at runtime (OCaml, arguably a design defect). And, if `f` is a pure function whose codomain supports decidable equality testing, then `map f xs = map f xs` (or double equals using Haskell syntax) is always true.
> And, if `f` is a pure function whose codomain supports decidable equality testing, then `map f xs = map f xs` (or double equals using Haskell syntax) is always true.
This statement is true in any language. So it's useless as a criterion for determining which language is "functional" and which isn't.
Edit: except, of course, Haskell - `xs` could be infinite, in which case the expression wouldn't be true (but bottom instead).
lolcathost% js17
js> function twice(x) { return 2*x; }
js> var xs = [1,2,3];
js> var ys = xs.map(twice);
js> var zs = xs.map(twice);
js> ys == zs
false
js>
Of course, the fundamental problem (in both Haskell and dynamic object-oriented languages) is types:
(0) Haskell doesn't have a type of lists. It has a type of potentially infinite streams.
(1) Dynamic object-oriented languages don't have a type of lists. They have a (dynamic) type of objects with physical identity, whose state (mutable or otherwise) can be interpreted as a list.
ML is free of these problems. It has a bona fide type of lists. Of course, you can also define a bona fide type of streams if you want to. Or a type of dynamic objects with identity. And those will be different types.
'==' or '===' is not an equality operator when used on objects (including arrays) in Javascript. It's a reference equality operator: are the two variables referencing the same object?
There is no builtin object content equality binary operator in Javascript, at least as of ES5.
Equality for objects means reference equality. That's precisely the problem with object-oriented languages: you can't manipulate the values that you really want, you have to manipulate objects whose state is hopefully the value you want.
That's surprising, but really has more to do with the failure of JavaScript's standard library than with being functional. For example, the same is true for OCaml (using `==` reference equality instead of `=` structural equality), but `[1, 2] == [1, 2]` is true in Python (a dynamic language, one that most people probably wouldn't call "functional").
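To make that contrast concrete, a small Python sketch: `==` is structural for built-in containers but falls back to identity for plain objects without a custom `__eq__`:

```python
xs = [1, 2, 3]
ys = [1, 2, 3]
print(xs == ys)  # True  -- structural equality for lists
print(xs is ys)  # False -- two distinct objects

class Box:
    def __init__(self, v):
        self.v = v

# Without a user-defined __eq__, equality defaults to identity:
print(Box(1) == Box(1))  # False
```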
In Python, the actual equality testing operator is `is`, not `==`. The fundamental property of equality is that equal objects can't be distinguished in any way. (Indiscernibility of identicals, aka Leibniz's principle.)
xs = [1,2,3]
ys = [1,2,3]
# are xs and ys equal?
ys.append(4)
print(xs == ys)
# no!
Ah, yes, there's a problem with `Float`s in Haskell. There should be no `Eq Float` instance. Standard ML handles this properly:
> map Math.acos [2.0] = map Math.acos [2.0];
poly: : error: Type error in function application.
Function: = : ''a * ''a -> bool
Argument: (map Math.acos [2.0], map Math.acos [2.0]) :
real list * real list
Reason: Can't unify ''a to real list (Requires equality type)
Found near map Math.acos [2.0] = map Math.acos [2.0]
Static Errors
What do you mean by "class methods automatically referring to "y" as the instance variable "y". I realize c# and Java? do it". Neither C# nor Java even use the terminology 'class method'.
I guess you could be overly pedantic and claim "method" is the term vs "class method".
That's not me being overly pedantic, that's you misusing standard terminology. The term "class method" has a specific meaning that's quite distinct from "instance method" - the thing you're actually talking about. In fact, if you go to the docs of the very language this post is about:
You'll find verbiage about 'class methods', again, as distinct from 'methods'. I can't think of an OO language or context in which 'class method' to just mean 'instance method' would make any sense.
Sure. Does not seem that difficult to infer from context what I was referring to, but you're technically correct. The terms are not that far off from each other in either definition or literal wording.
I think pvg's point is that "class method" is often used interchangeably with "static function" -- that is, functions that are not called on instances (e.g. SomeClass.foo() vs (new SomeClass()).foo()).
It would seem that you were talking about what most people would call instance methods, or commonly just "methods" (unless you're in a language like Ruby, where you don't have functions as a first class entity -- if you want "class methods" as they are called, you define a method on the target class object's meta-class.)
He means how in Java you can do something like this:
class Foo {
    private Integer bar = 0;

    public Integer baz() {
        Integer inc = 2;
        bar += inc;
        return bar;
    }
}
Where `bar` is referred to in the same way a local variable is, even though it's actually a property of the class the method is defined on. It's equivalent to using `this.bar` but the compiler accepts it anyway.
please don't use java.lang.Integer unless you really have to.
I know it's just an example but it gives the wrong ideas of what good code is.
Each operation in baz() consists of Integer.intValue() and Integer.valueOf(int) calls, like:
    Integer inc = Integer.valueOf(2);
    bar = Integer.valueOf(bar.intValue() + inc.intValue());
    ...
Besides the unneeded indirection, values outside [-128, 127] actually create an object that has to be allocated and GC'd on each assignment.
Such nuances give Java a bad rap. 'int' maps straight to the hardware; java.lang.Integer, not so much.
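Incidentally, CPython has an analogous wrinkle: small ints are cached, so object identity leaks through for some values but not others (an implementation detail, sketched here purely for illustration):

```python
# CPython caches small integers (roughly -5..256), much like Java's
# Integer cache for [-128, 127].
a, b = 256, 256
print(a is b)  # True in CPython: both names hit the cached object

# int("257") is used to defeat constant folding; each call allocates
# a fresh object outside the cache.
x, y = int("257"), int("257")
print(x is y)  # False: two distinct heap objects
```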
You mean the syntactic sugar, or the jab at java.lang.Integer? Side note: the code would fail with a NullPointerException if bar was not initialized (bar = 0) but left null.
I have my pet annoyances, too, don't get me wrong. Just the other day, I saw someone in a coffee shop using 8-stop tabs and had to set them straight. They were in the middle of a conversation about food, but this is important.
"Hey, want to buy a car?" "Maybe. What color is it? Manual or automatic?" "Why does it have to be a manual?"
It doesn't have to be anything, but I'm not going to buy a car (or invest time looking at a new programming language) if you don't even tell me what it is.
As others have noted, the design goals of the project make it sound like Lua would have worked quite well for them. They didn't pick Lua, so it suggests their project has some advantages over Lua. They don't need to justify their decision, but if it does have advantages over Lua, it would be interesting to know what they are. And even if it doesn't, it would be interesting to know why they made the decisions they did.
The point of asking questions is to find out the answers; it doesn't mean we're all prejudging them based on the assumption their answers are going to be terrible, or even that the answers might be terrible. It's okay to start a project just because you want to!
It sounds like they want first class objects with inheritance which Lua lacks. You can kind of, somewhat, do objects with inheritance in Lua, but no two projects that do that do it in the same way.
This doesn't answer the question; in particular, there are loads of other languages (Lua probably being the most popular) which would also satisfy this requirement.
Yes, but I guess it's only an example to demonstrate one liners and anything would do. I prefer not to have to surround conditions with () and terminate statements with semicolons. Plenty of languages demonstrated that it's ok. However I don't know how that impacts the complexity of the compiler. I used to know how to write one and reason about it but I spent my last 25 years doing other things.
May be off topic, but did Apple remove their restriction on using a VM in an app put on the App Store? Or was that restriction only for JITs and not interpreters?
This looks like a language designed for use with a commercial product / development environment (http://creolabs.com/).
I'm curious how well that works - I know that used to be fairly common in the '80s and '90s, but it feels like that hasn't been happening much of late. The only very similar examples I can think of are Swift and Xamarin; Swift had the advantage of a large customer base (everyone writing iOS apps), and Xamarin was based on an existing, well-established language (C#). And all the older big examples that come to mind, VB, Delphi, Objective-C, etc., were variants of an existing language (Basic, Pascal, C, respectively), not a brand new language.
Creo folks, are you finding that customers / potential customers are excited about picking up the new language? I'd love to live in a world where there's more work on programming languages (clearly, none of our existing languages are optimal) but I'm not super optimistic.
I really like what the company is doing with their other products, mainly their design tool. It looks interesting.
But I have the same question as yours. Why not just leverage Lua. I wish they made it clear what they're gaining. Maybe the end bundle size is smaller than Lua?
Concurrency is always hard and fibers could be interesting, but looking at src/runtime/gravity_vm.c, it appears to be single-threaded, with at most one fiber running at a time per VM.
I think there is still a niche for something like a Lua/Erlang hybrid where there is a real possibility that the VM will run high level code units on multiple cores.
The other thing missing from Lua and LuaJIT which would be nice in future VMs would be support for SIMD vector optimizations.
I do use C, but my point is that it is feasible to incorporate both of those features in an embedded VM, by using Erlang style actors for concurrency, and by providing builtin vector types that use SIMD instructions where available. If future language designers were to do so, they would be able to better differentiate their language from Lua, which does not provide these features.
The only explanation of its intent in the readme is that it was created "in order to offer an easy way to write portable code for the iOS and Android platforms." I think the parent comment was pointing out, in a sideways manner, that they didn't need to create a new language to do that, so it's somewhat mysterious what this is actually meant for.
The parent comment said 'strange...' as if they had claimed that nobody had done it before, and therefore it was strange how he was doing it. But they hadn't claimed that. So what is strange? If a new bakery opens in my town I don't say 'strange... I've been buying bread for years'. It's an alternative offering. You may say it's not necessary, but it's not strange. That just doesn't make any sense as a snarky remark in this situation.
If somebody asked why Hypothetical Dave opened a bakery and Dave answered "So this town would have a source of bread," I think it would be reasonable to find that strange. He isn't explicitly saying nobody has done it before, but the answer is weird if it's already been done.
I mean, creating an unnecessary programming language isn't strange — it's extremely common, and some are even good. But providing a rationale for creating the language that doesn't explain why you created the language is a bit odd, isn't it?
I think the main value of this project is hidden under the "internals" link of the site. It discusses the construction of the language. Although it is incomplete as of now - hopefully the pages get filled up.
Since you asked: This comment is pretty insubstantial. There's no comment on what's good or interesting about this language, just "really loving what it's putting down," which is pretty inscrutable as praise goes.
Can only speculate about the downvotes that perhaps "putting down" is being read in its traditional meaning, which might give your whole note a sarcastic feel. Other than that, I have no idea.
// a represents a range with values 1,2,3
var a = 1...3;
// b represents a range with values 1,2
var b = 1..<3;
Ah! I wish they had gone for 1..3 and 1>..3 and 1..<3 and 1>..<3 to make it easy to understand. The above would map to [1,2,3], [2,3], [1,2] and [2] respectively.
Range operator quirks with ... and .. have to today be memorized and waste developer seconds that add up to man-years.
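For comparison, Python sidesteps the memorization problem by offering only the half-open form; the closed range has to be spelled with an explicit +1:

```python
# Python's range is half-open, i.e. the equivalent of 1..<3:
print(list(range(1, 3)))  # [1, 2]

# A closed range like 1...3 needs the +1 spelled out:
print(list(range(1, 4)))  # [1, 2, 3]
```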
For those being negative about "yet another programming language," try creating a lang sometime... it's a fantastic exercise and, at the very least, will make you a much better programmer.
This isn't some hobby project. This is a commercial offering, submitted to HN by the corporate account of the company that created it. I don't see why anyone should hold it to hobby project standards.
It is written mostly by a single programmer, while building a cross-platform development platform for iOS and Android, while building a company around the vision.
+1 that this is no hobby project. But if it were one, it wouldn't be that great. Whether it works for its intended audience (of Creo users, I presume) I can't say, any more than I can say whether an "R" project write-up is good.
It definitely beats the standards of my (unsubmitted) hobby projects:
1) Logo
2) Getting started guide
3) Other books
On the other hand, it loses as a hobby project posting. Indeed the effort I think should go into such a posting is the reason why my hobby stuff isn't posted. Such posts are good when they document that something significant can be achieved "out of the gate" (C4 or scheme48 or something-amazing-in-500-lines) or if they document the steps to reach a more-mature-than-expected state.
There should be more hobby languages and more hobby transactional RDBMSs and more hobby ANYTHING we take for granted.
Take "docker in bash" from about a year back as an example, or something similar: whether it is a hobby thing or in "experimental production", it stands or falls not on what is achieved but on what we the community get from knowing that they achieved it.
I'll admit that my first reaction is "yet another programming language", but mostly because it's rare to see the announcement of a new programming language that will offer anything significantly different than the existing languages.
The FIRST thing I do when checking out these new language announcements is search for something, anything, that shows this language is different than all the others.
My next concerns are wondering if the language will...
1) Make it to 1.0 release
2) Have thorough documentation
3) Have much of a community
4) Have tooling that makes it easy to use
5) Survive long enough to make a viable long-term investment to learn and to use.
Edit: I also am concerned if this new language makes the same mistakes as previous languages. For example, I consider allowing null as a mistake. Null is familiar, it's easy to implement, and it causes untold grief.
> Edit: I also am concerned if this new language makes the same mistakes as previous languages. For example, I consider allowing null as a mistake. Null is familiar, it's easy to implement, and it causes untold grief.
The language is dynamically typed. null is only questionable in statically typed languages. There's no benefit to a Maybe type in a dynamically typed language.
I don't understand that mindset. What's the alternative? Nobody creates any new languages? Only corporate giants should create new languages? It seems like a pervasive anti-intellectual attitude.
There are probably thousands of languages in existence, with new ones being created all the time. Sure, if you're doing it as an exercise, more power to you. But if you're trying to get other people to use your language, the natural response is: why should I bother with yet another language? I already don't have time to learn the ten languages I'm interested in that already have a following.
People have a right to their opinion, and people have a right to oppose other's opinions. I question whether people have a right to continue to argue about such an opinion in a way that spreads virally, however.
One good technique for spreading irrationality virally is to ask leading questions which continue the "debate" about something that really isn't that important. We have programming languages, they are often times hard to learn, and learning new ones, or making new ones, is something only a subset of the programming aggregate can and are willing to do. i.e. It's an important topic for a few, but for the rest of humanity, it really doesn't matter that much day to day. (Which is not to say a new language might eventually be useful to all, but that argument itself is irrational given it may occur in the future.)
I'm only calling out this argument here because I find this general viral behavior is occurring around other concepts in a way that is creating inefficiencies in communication channels - something we didn't really intend when building the Internet.
I agree with you. This isn't something that needs a big discussion. "Do we have too many languages? Should we cut some of them?"
But I think the common response of "oh god, here we go again" is unproductive and, frankly, stupid. It reflects something about the community. I'm not asking you to use my toy language to build the Next Greatest Enterprise Application (TM). I just made something cool, and maybe you can use it to make something cool too.
Presumably, those who oppose hobbyist languages believe that all the time and effort being spent on these projects could somehow be harnessed and redirected toward improving existing languages, to the benefit of a wider community. I think the attitude is not so much anti-intellectual as a lack of understanding of how software development really works. Similar to the fallacy that putting twice as many programmers on a project will cut development time in half.
Do people actually oppose people spending their own time on hobbyist languages? This is different from having a negative outlook on the prospects of a given language. I think it's important to separate an honest assessment of the virtues of a language, an assessment of its value to the larger community (a great language with no libraries, after all, still may not be too useful), and worrying about how people spend their time.
It's a kind of tragedy of the commons. If two languages are basically equivalent yet incompatible, neither of the authors did something wrong, but it's a shame about the missed opportunities and problems that didn't need to happen.
What missed opportunity? There is little reason to suppose someone that got motivated to create language A would have spent their time on language B if only they hadn't got involved in creating A. They could have spent that time learning to play chess or fishing.
I think languages should have built in help for serialisation, though not tied to a particular format. This is to get around the tension between automating the process (avoiding boiler-plate) versus "gating" to validate and transform data as it enters/leaves the domain model.
All serialisation frameworks I've seen map internal record types (structs, objects) to an external schema, and write directly to record fields -- breaking down that "gate". What I would like is to map those schema to something like a constructor/factory parameter list instead. (Python keyword arguments come close, but the lack of typing means there is still no schema).
Serialisation libraries should confine themselves to converting between (typed) key-value pairs and wire formats. Then there should be syntactic sugar to help the transition between that plain-old data and the real domain model.
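A minimal Python sketch of that "gate" idea (the names `User` and `from_plain` are made up for illustration): the wire layer hands over plain key-value pairs, and the domain type validates and converts on the constructor path, never writing fields directly:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class User:
    name: str
    age: int

    def __post_init__(self):
        # The "gate": validation runs on every construction path.
        if self.age < 0:
            raise ValueError("age must be non-negative")

    @classmethod
    def from_plain(cls, data: dict) -> "User":
        # The wire format maps to constructor arguments, not to fields,
        # so deserialisation cannot bypass the gate.
        return cls(name=str(data["name"]), age=int(data["age"]))

u = User.from_plain({"name": "Ada", "age": "36"})
print(u)  # User(name='Ada', age=36)
```

Any serialisation library that emits typed key-value pairs can target `from_plain` without ever touching the fields.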
Pretty sure we had all the programming languages we needed by the late 80s. Nonetheless, we need new perspectives on languages more than we need snark.