Hacker News | crbnw00ts's comments

When stuff like this makes the front page on a regular basis, it's already a joke site:

https://news.ycombinator.com/item?id=5735421


> The way the thing is filmed with the smiley face and lighting up eyes, I could easily imagine a sci-fi horror film being based around it. :)

Like this?

http://i.imgur.com/W1s9crO.jpg

In all seriousness, I feel it is only a matter of time before someone dies as a result of their home automation being hacked.


My god, this is an accurate depiction of Chef.


+1 Nice work.


This is the correct answer. The manager's job is to manage the people, not the project. That's what tech leads and PMs are for.

A good people manager works to keep their employees comfortable, aims to avoid conflicts (and resolves them if they arise despite these efforts), mentors and coaches people to develop new skills, and shields them from politics. Doing the work of an individual contributor just gets in the way of that. The manager should be familiar with and understand what their people do, but leave the actual performing of that work up to them.


That is incorrect. There are also metrics regarding "microevents", which are points in time during which brainwaves momentarily exhibit the characteristics of the first stages of sleep. This is effectively a measure of how often the subject is starting to "nod off". The napping group showed far fewer of these events.


What's sad is that it wasn't so long ago we were treated to breathless articles regarding the software that was used as part of the President's re-election campaign, which apparently was well-tested and well-engineered enough to actually do its job when the time came:

http://www.theatlantic.com/technology/archive/2012/11/when-t...

It seems today's politicians are (at least in some cases) familiar with what it takes to build reliable software. Perhaps the problem is that they are only willing to see that it's done when it benefits them directly, but not when their constituents need it.


The requirements for working with the USGov are restrictive enough that only specific companies are capable of participating in the bidding process. The Democratic party itself and Obama's re-election campaign are, by contrast, private organizations that are not limited by these rules.

Basically, it seems like the government sourcing process has shrunk the pool of potential bidders too small to provide a properly competitive marketplace for software for USGov customers.


> Basically, it seems like the government sourcing process has shrunk the pool of potential bidders too small to provide a properly competitive marketplace for software for USGov customers.

And don't think for a second that this is unintentional.


The INTENT is to limit corruption. If they could give contracts to anybody they want, then they could reward donors.

I think it's actually achieved that goal. It comes at the expense of competence, however, as those with the best lawyers (not best developers) win contracts.


Except it hasn't. The current system has turned what is left of the pool of potential bidders into key donors.


So we have a choice between corruption and incompetence ... I'm pretty sure that corruption would actually be better, at least shit would work.


We only have a choice of which we want to start with. Eventually we end up with both.


> And don't think for a second that this is unintentional.

Okay, what if I do?


Lobbying exists and is legal. Corruption is ever-present and well-documented, from third-world to first-world countries. Contracts are widely regarded by private industry as obscenely overpriced at best, and highway robbery at worst. Senators are regularly known to block necessary bills in order to write in pet projects that will benefit their campaigns/constituents directly, even to the detriment of everyone else. Want an example? Northrop Grumman is hardly a scrappy small company, but they lost out to Boeing, even though Boeing was going to create a worse, more expensive aircraft, because senators in South Carolina didn't want their state to lose the jobs.

If you are anything but cynical regarding the government contract bidding process, you're asking to be made a fool of.


It was probably intended as something else, but it has probably become a way for discouraging competition through mechanisms like regulatory capture.


It's not really enough to just handwave the term "regulatory capture." What's your evidence that the acquisitions process has been captured?


I'm obviously speculating here.


Harper Reed (CTO of Obama for America, who can be credited for those articles you're referring to) and Clay Johnson (CEO of the Dept for Better Technology) wrote an op-ed addressing what they see as the root cause for this kind of failure: http://www.nytimes.com/2013/10/25/opinion/getting-to-the-bot...


Well, think this out a bit farther. Let's say the SNAFU coefficient of a piece of software is 5%. If you're using the software to manage your election campaign, you lose 5% of your digital premium (over traditional electioneering using posters, TV commercials, and other one-size-fits-all mass communication media), most of whom will presumably vote for The Other Candidate. That's bad, but you can just as easily lose the same or more with a careless remark (e.g. Mitt Romney's casual dismissal of 47% of the electorate as "takers" last election, which ended up alienating an awful lot of swing voters even though he was obviously pandering to his audience of wealthy donors when he said it).

But put that in the government, and you're potentially disenfranchising 5% of the citizenry which is not only politically foolish but quite likely illegal, given constitutional requirements about equal treatment and so forth. If you have to provide universal service of some kind, then your marginal costs go way up. Suppose 99% correctness were the acceptable standard, such that Social Security, Medicare, VA etc. could just ditch that 1% of claimants that caused the most administrative problems; the administrative savings would probably be far more than 1%, I'm guessing more like 10-15% because once the administrative burden of dealing with a given citizen rose above 1 or 2 standard deviations you could just dump them from the system and cut your losses.


> Extraordinary claims require extraordinary evidence.

For how many years were people dismissed as nutjobs and "conspiracy theorists" when they talked about NSA spying? Now we know it was all true. Would you like to make a bet as to whether or not the US military-industrial complex acts in its own self-interest when it comes to the "War on Terror"? When it comes down to it, there is nothing particularly "extraordinary" about these claims.

> You have made these accusations without any evidence.

The evidence is circumstantial and based on past behavior patterns of the US government. Read "Decent Interval" and its follow-up "Irreparable Harm" by Frank Snepp, or "State of War" by James Risen (for something more recent) to get started on this history. Again, there is nothing "extraordinary" about the previous poster's claims.


>For how many years were people dismissed as nutjobs and "conspiracy theorists" when they talked about NSA spying? Now we know it was all true.

Just because it is true doesn't mean those people aren't nutjobs or conspiracy theorists.


It's also not at all logical to say:

[Unlikely thing X] came true! Therefore, [unlikely thing Y] will, too!

Even if X and Y were incredibly similar -- in this case they aren't, at all -- that's still not a logical assumption. It's not even evidence toward Y.


There's your problem: you think unlikely thing X came true when in truth it was likely.


I'm curious what your definition of "nutjob" is, given that you would apparently apply it even to people who are correct about all of their theories.


>I'm curious what your definition of "nutjob" is

I don't really have one but if you must, it's probably someone with a dissociative disorder. I'm just noting that you don't have to be sane to be right.

>given that you would apparently apply it even to people who are correct about all of their theories.

When exactly did I say that?


You clearly implied that nutjob status is not correlated to the correctness of one's claims.


I am very sad that we did not actually land on the Moon after all. :(


There sure are a lot of hard-coded numbers in that codebase. In many cases it's easy to figure out where the numbers came from, but in others, it's nearly inscrutable without a named constant or a comment or something. Here's one example:

https://github.com/cisco/openh264/blob/master/codec/encoder/...

Where does "15" come from? I suppose if I'd written a codec like this before, or if I stared at the code long enough, I could figure it out, but wouldn't it be better to use an enum or a #define?
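For what it's worth, the kind of fix being asked for could look like the sketch below, assuming (and it is only an assumption) that the 15 is the largest value representable in 4 bits; the names kMaxBitWidth and IsValidBitWidth are hypothetical, not from the openh264 source:

```c
#include <assert.h>

/* Hypothetical named constant: if the magic 15 really is the largest
 * 4-bit value, naming it makes that intent explicit. */
enum { kMaxBitWidth = (1 << 4) - 1 }; /* == 15 */

/* The guard then reads as intent rather than as a bare number. */
static int IsValidBitWidth(int iWidth) {
  return iWidth >= 0 && iWidth <= kMaxBitWidth;
}
```

Even if the enum turns out to be overkill, the point stands: the reader shouldn't have to reverse-engineer where the 15 came from.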


Encountered this topic recently in the office. I'm from the side of the fence that doesn't overly care about code tidiness, so long as it performs its overall function. Ten years ago this kind of thing might have bothered me a lot more, but at some point I crossed a threshold where I realized _all code generated to the present day_ is pretty ugly and unmaintainable in the long term (but that's a story for a rather large and rather boring essay).

The tl;dr is simply that if you obsess over minor details on this level, a lot of brainpower is wasted that could be used for bigger problems you should be much more worried about.

Playing devil's advocate, in this case the if() is obviously a guard for the subsequent switch. Moving just the constant '15' into a #define would make it read more like some magical sentinel value, unless you also #defined all the literal values used in the switch, at which point you've introduced a wholly bullshit layer of abstraction to what was otherwise incredibly concrete and explicit code.

Let's assume you have a great reason for doing that. OK. So what do you call these constants? Well, the code appears to be branching to special cases based on the width of some integer. So we instead have what, WIDTH_1_BIT, WIDTH_2_BITS, ..., WIDTH_15_BITS? Now we've pulled those constants out, you stare at the block of #defines, and think, damn, this is so ugly since most of the value space isn't fully defined! So some kindly maintenance programmer comes along and pads out the rest, producing a perfectly beautiful block of utter line noise.
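Spelled out, that hypothetical padded-out block would look something like this (every name here is invented for illustration; none of it is in the actual codebase):

```c
/* One hypothetical name per possible integer width. Padding out the
 * whole value space produces exactly the line noise described above:
 * each name adds no information beyond its own value. */
#define WIDTH_1_BIT    1
#define WIDTH_2_BITS   2
#define WIDTH_3_BITS   3
#define WIDTH_4_BITS   4
#define WIDTH_5_BITS   5
#define WIDTH_6_BITS   6
#define WIDTH_7_BITS   7
#define WIDTH_8_BITS   8
#define WIDTH_9_BITS   9
#define WIDTH_10_BITS 10
#define WIDTH_11_BITS 11
#define WIDTH_12_BITS 12
#define WIDTH_13_BITS 13
#define WIDTH_14_BITS 14
#define WIDTH_15_BITS 15
```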

That is arguably considerably less readable than what we started with.


I'm not sure how much this applies to open source. Closed source, quite possibly, since it probably won't survive long enough / need to change enough / need enough people to understand it.

Open source, meanwhile, requires being understandable to new people. It needs to explain itself, or it needs to be idiomatic (within its specialty). If neither, you're condemning people to rewrite it or waste time trying to figure out the original intent.


The second half of what I said explained how the code becomes less readable. If someone wants to hack on that code, they better understand the algorithm it implements, which means at a minimum they've probably absorbed the same 2000 page spec the original developers absorbed (and know exactly what 15 means). No amount of #define can fix that, although in the right circumstance some comments might help.


Your first paragraph sounds quite defeatist. In my experience, the way you deal with "minor details" is to write them down in a style guide. It'll take away a lot of the quibbling about small things. Some will find it limiting, others will find it liberating.

I've worked at several jobs that had decade old code in production. Some cared about the little things, some didn't. I know which code was the best to work with. In my experience, the broken windows theory is true when it comes to code. The little things matter in the long run.


A simple //comment would have sidestepped all of that and still clarified the number's purpose.


It's possible the original code had a //comment and it was stripped when they created the F/OSS repository.


This is a very insightful comment. There has also been some debate about the appropriateness of non-English comments, and of comments that only made sense in the context of internal code names for Cisco projects. The high-level test we used was: let's get the basic code out on GitHub as early as possible.


It's all very well when it's a simple variable but quite often I have come up against variables that would take a whole paragraph of text to explain their purpose and don't have a concise or obvious name.


I find quite often a little hard to believe. Occasionally, maybe; and why not provide a paragraph of text? Sometimes it's the right thing to do.


Then write a paragraph of text. An hour writing today will save weeks of reverse engineering later.


IMO it doesn't make sense to read code like this without a copy of the H.264 spec in hand. And once you've got that spec, why paraphrase a fraction of it, poorly, in the source code?

I do think that comments at the function level indicating which part of the spec you should be reading would be nice. They might not be an issue for people who are indoctrinated into the code though.


// spec 12.3.4.5

would certainly help if you had a spec open.


Huh. I don't think it's just you - that looks like some properly nasty code. Formatting is all over the place, lines are commented out, comments are useless and obviously wrong in places...

I don't really speak C++ though, so maybe this is normal.


It's not because it's C++... this is just really ugly code.


The good point is that other hacked-together, quasi-compatible implementations now have an "official" reference implementation to browse. It could be a lot worse... I've waded through uncommented OCaml to discover how a certain shall-go-unnamed commercial, non-RESTful XML API worked. Or, Cisco could've never open-sourced it.


AVC/H.264, like many MPEG standards, has an official reference implementation: http://iphome.hhi.de/suehring/tml/


I'm not into video circles anymore, but I was under the impression the existing open source implementations were way better. The reason this code is significant is that Cisco is providing licensed binaries.


As with many things, it seems like knowing how an H.264 codec works before reading the code is essential. Does someone have an approachable reference for the standard?


While I'm sure there are many other examples, in that particular case it's just the maximum value of an unsigned 8-bit integer, as defined above on line 1252.

It would be a bit better if it explicitly had a nice #define, but recognizing the values of common powers of two (minus one) is a useful code-reading skill.
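The power-of-two-minus-one pattern can be sketched as a mask macro (the name LOW_N_BITS_MASK is made up for illustration):

```c
#include <assert.h>

/* (1u << n) - 1 sets the low n bits: 15 is the 4-bit mask, 255 the
 * 8-bit mask, 65535 the 16-bit mask, and so on. Recognizing these
 * values on sight is the code-reading skill in question. */
#define LOW_N_BITS_MASK(n) ((1u << (n)) - 1u)
```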


I think you mean 4-bit.


Slight correction: the government went after them for anti-solicitation agreements[1], and then they were sued for salary-fixing by former employees[2].

[1] http://www.justice.gov/opa/pr/2010/September/10-at-1076.html

[2] http://www.macworld.com.au/news/apple-and-google-face-salary...


And then of course once they acquire those toys and have them sitting around for a while, they start to get itchy for an excuse to "use" them.


Yes, that guy was fake, but the protestors blocking the bus and preventing it from leaving were real.

