
I've been shocked at the level of disrespect for language standards here. Yes: adding semicolons is probably good practice because it avoids the chance of stumbling over bugs like this. And yes: the ASI feature in Javascript is in hindsight a terrible mistake.

That said: you go to war with the language you have, not the one you might want or wish to have.

ECMAScript is ECMAScript. Arguing that your transformation tool doesn't need to handle a feature specified in the language and supported by all known implementations is just ridiculous. Arguing that people making use of a feature that the language standard says they can (and that works) are "sloppy" is equally dumb. Even weirder are the people who jumped on the use of the && operator to effect an "if" as "abuse" -- this is an idiom pervasive in lots of areas and (again) well-supported by Javascript.

Not everyone is going to have the same aesthetics, everyone has a different idea about what features are "fun" and which are "sloppy". And decades of experience in the "code convention hell" world of enterprise programming has taught us nothing if not that this sort of pedantry helps no one. If you want to use Javascript, you have to take the whole language -- warts and all.
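To make the idiom in question concrete, here's a minimal sketch (the names are made up for illustration):

```javascript
// The "&& as if" idiom under discussion: the right-hand side
// runs only when the left-hand side is truthy.
var user = { isAdmin: true };
var shown = [];

user.isAdmin && shown.push("admin panel");   // infix form

if (user.isAdmin) {                          // equivalent statement form
  shown.push("admin panel");
}

console.log(shown.length); // 2
```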



I think you missed the point.

This is not really about whether semicolons are required or not.

The point is that in order to ensure maintainability of code you should try to use the language (and framework) in a way which ensures better maintainability, supportability, and portability of your code.

Look, a statement like "a && b" is 100% valid in many languages, but in order to increase maintainability, supportability, and portability of your code it should be written like "if (a) { b; }".

The easiest way to understand the point of this rule is to get a job maintaining some old crappy code-base :) - I learned it that way.


No no, I assure you that was the point I took away. And it seems you missed mine, which is that it's fine to make pronouncements like "don't use short circuit and as an infix if" for your own code. But flaming about them in public is rank pedantry. It's the kind of nonsense that the enterprise world has been dealing with for 20 years now: chasing the "maintainability rule of the week" is going to hurt you badly long term.

Learn to read and maintain the language you have, because you can't win this.


Okay, I'm going to bite.

&& is logical AND, and it operates on boolean expressions (or types).

So "a && b" evaluates to true if both a and b are true.

How can that be replaced by "if(a){b;}"?

Is it "r = a && b" being replaced by "if(a){r=b}"?

Using Chrome's javascript console:

  > a = "asdas"; b = "da"; a == b
  false

  > a = "asdas"; b = "da"; if (a) { b }
  "da"

Could you (or some other javascript expert) explain it to me?


In JavaScript, && does not evaluate to a boolean. It evaluates to the thing on the left if that thing is falsy (in which case the right-hand side is not evaluated at all), otherwise to the thing on the right. So in said JavaScript console,

  "asdas" && "da"
evaluates to "da" and

  "" && "da"
evaluates to "".

So in an assignment context |r = a && b| would need to be replaced by: if (a) { r = a } else { r = b }, which may or may not be more confusing than the original statement with &&. But in a statement context, where the result of && is not being assigned to anything at all, "a && b" is exactly the same as "if (a) b;" except harder to understand.
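A quick sketch of the value semantics described above:

```javascript
// && yields one of its operands, never a coerced boolean.
console.log("asdas" && "da"); // "da" -- left is truthy, so the result is the right operand
console.log("" && "da");      // ""   -- "" is falsy, so the right side is never evaluated
console.log(0 && "da");       // 0    -- any falsy left value is returned as-is
```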


Nitpick: you mean `if (!a) {r = a} else {r = b}` (or switched blocks).


Er, yes. Of course I do. ;)


The && operator short-circuits. It evaluates its second operand only if the first is truthy (because if the first is falsy, the expression result is already known). It thus acts pretty much exactly like "if", but with an infix syntax. This trick is used pervasively in shell programming, where the native if syntax sucks.


Thanks, I get it now :)

So the second operand can be a function / method, with any (or no) return type.

> function bob () { alert("bob") }; true && bob();

Alerts "bob".

Nice trick, though not something I'd personally ever use, as it's not concise and isn't portable to C# (and I assume Java/C++).

Plus using the if statement is a better expression of intent, and I don't really like relying on job security through obscurity.


It is concise. There are fewer tokens and bytes needed to write A && B than if ( A ) { B }. And the intent thing is, as I've tried to explain, very much situation-specific. There are many programmers out there very comfortable with this idiom. That it doesn't happen to be popular in the web development community doesn't mean it sucks or isn't worth learning.

Don't pass judgement on syntax you literally just learned, basically. Don't assume the world you know is the only one worth knowing. Good hackers know their tools.


I'm not judging it, I'm just saying that it's not something I'd use because it's confusing for my colleagues / anyone maintaining my work. Unless they're a pure Javascript person.

It is a neat trick though, and proof that I need to spend a little more time hacking in javascript to improve my knowledge :-)


Going the other way: In languages where if is an expression

    if (a) { b; } 
would return b if a was true and nothing otherwise, which is similar to a && b.
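In JavaScript itself, `if` is a statement rather than an expression; the closest expression form is the conditional (ternary) operator. A rough sketch of the correspondence:

```javascript
var a = "asdas", b = "da";

// Expression-valued "if (a) { b }", two ways:
var viaAnd = a && b;                  // "da" when a is truthy; a itself when falsy
var viaTernary = a ? b : undefined;   // "da" when a is truthy; undefined when falsy

console.log(viaAnd === viaTernary);   // true here, but the falsy cases differ
```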


I agree with you to a certain extent, but I do think it is important to distinguish between those that think that they should just use a semi-colon and those that seek to disrespect a standard.

In my case, and I've seen this sentiment expressed repeatedly by others on HN, I think it is odd that a minification script would actively choose not to support a syntax that is standards compliant. On the other hand, I think it is crazy for a major project to use a valid syntax that not only breaks said [popular] minifier but also offers no noticeable benefit.

I do not feel as if I'm disrespecting any standard just because I think simply adhering to that standard is not a sufficient justification for doing something.


But "no noticeable benefit" is your aesthetic decision, not an objective truth. Not everyone feels the same. Pythonistas, for example, might quibble with you about that, because they skip semis when typing all the time and will experience editor friction when using Javascript. And even if you think it's "crazy" to skip the semicolons, you might not think it's crazy to write something like "test && result" as a simple if. Shell and perl programmers like that sort of thing and can read it without difficulty.

I'm not saying that all working code is good code, or that you have to actually use all the language features in all your code, or that you can't have your own well-reasoned opinions about this stuff. What I am saying is that if you're serious about using "Javascript" and interacting with the broader community of "Javascript" programmers, this kind of feature pedantry is going to hurt you and the community badly. You will constantly be running into useful (maybe even brilliant) code that does "crazy" things.


As a Pythonista, I beg you to leave me out of this. I love Python's syntax but I can see Javascript's semicolon insertion is awful.


Interesting point, and I do hope that no one ever takes one of my opinions as an objective truth. That said, I do feel strongly that if the only benefit you can name about writing code in one particular manner is that it is aesthetically pleasing to you, then that is hardly a benefit at all.

When it comes to this particular case, I've heard a few different arguments for why the syntax they use is poor, including the following:

1. The syntax relies on parsing behavior that is expected to change in the future.

2. The syntax they use does not work in one of the major minifiers used throughout the community (the crux of the whole issue to begin with, but certainly a drawback in and of itself).

I have yet to hear any benefit to NOT including a semi-colon to resolve this issue beyond aesthetics, and aesthetics alone is just not a rationalization that I can get behind.
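For completeness, here's a sketch of the kind of ASI hazard that motivates the defensive semicolon (wrapped in eval so the failure is observable rather than fatal):

```javascript
// No semicolon is inserted after "var b = a", because the next line
// begins with "(" and can legally continue the statement.
var source =
  'var a = 1\n' +
  'var b = a\n' +
  '(function () {})()'; // parses as: var b = a(function () {})()

var failed = false;
try {
  eval(source);
} catch (e) {
  failed = true; // TypeError: a is not a function
}
console.log(failed); // true
```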


I do feel strongly that if the only benefit you can name about writing code in one particular manner is that it is aesthetically pleasing to you

Isn't that exactly what's happening here? A bunch of people piling on skipped semicolons and infix if's because they think they're "sloppy"? Refusing to support ASI (which is, in fact, precisely specified) in their transformation tools because it's "broken"? Why are some people's aesthetics more important than others?

Again: you use the language you have. You will never get the community on board your private yacht of your "sane subset" Javascript. It's been tried for decades. It doesn't work.


>>On the other hand, I think it is crazy for a major project to use a valid syntax that not only breaks said [popular] minifier but also offers no noticeable benefit.

That's basically the take-away I got from yesterday's semicolon drama. You can appreciate that JS allows you to omit semicolons or you can bash those who choose the ambiguous over the explicit. But if you're the lead on a hugely successful project you should pick the syntax that will make it work everywhere. It's especially odd for a web developer to choose aesthetics over pragmatics when it comes to things like this.


That said: you go to war with the language you have, not the one you might want or wish to have.

That's taking the quote out of context, which is saying a lot because it's an oft-quoted example of Donald Rumsfeld's tap-dancing. I doubt Mr. Rumsfeld was advocating driving around in Humvees as if they had armor. I'm sure he would laud the attempts of soldiers to improvise and mitigate the risks inherent in their equipment as much as possible.

And decades of experience in the "code convention hell" world of enterprise programming has taught us nothing if not that this sort of pedantry helps no one.

Over a decade of experience consulting in that world has shown me that adherence to code conventions has tremendous benefits. In shops where there was strict adherence to code conventions, I could be 10X or 100X more productive when refactoring using automated tools.

If you want to use Javascript, you have to take the whole language -- warts and all.

You need to justify this. This strikes me as a silly and counterproductive notion. Even in a tiny language like Smalltalk, you don't want to use, "the whole language -- warts and all," every chance you get. Hacky tricks have a cost. Just because you can implement an entire parsing system using doesNotUnderstand handlers, doesn't mean you really want to. (And yes, I've seen this happen in real life -- you really Do Not Want!)


That's all well and good until you write code that other people have to read. It's basic courtesy to respect the value of their time.


Are you kidding? JSMin is open source. If you care so much, send a pull request. JS has plenty of weird corner cases. It's up to the author to decide whether he wants to spend time dealing with every single one.



