Hacker News | deadbadger's comments

Getting continuous errors on deploy during the bundle stage like so:

  /usr/lib/ruby/1.9.1/rubygems/remote_fetcher.rb:215:in `fetch_http': bad response Not Found 404 (http://bb-m.rubygems.org/quick/Marshal.4.8/activesupport-3.2.11.gemspec.rz)
Is this because rubygems.org is being nailed?


Seems to be - mine has been stuck at "getting metadata from rubygems.org" for 45 minutes. Is there an alternative server to fetch this from?


For a less hammered server, you can use http://bundler-api.herokuapp.com

I usually just use that one for my apps, much faster than rubygems.org
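If you want to point a whole app at it, a Gemfile sketch (assuming the mirror serves the standard gem index - check before relying on it):

  # Gemfile - fetch everything from the less-loaded server instead of rubygems.org
  source "http://bundler-api.herokuapp.com"

  gem "rails", "3.2.11"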


It's been an hour, but you could also fetch straight from github with this in your Gemfile:

   gem "rails", :github => "rails/rails", :tag => 'v3.2.11'


This is a mirror, so that's probably why it's out of date. Can you try just production.cf.rubygems.org? I have heard zero reports of downtime or other issues. :/

http://uptime.rubygems.org/131647


Hi - thanks for responding. My deploy's gone through now, but I'm curious how I ended up contacting a mirror - I was running the default bundler capistrano task, so (approx):

  $ cd <deploy-path> && bundle install --gemfile Gemfile --path <shared-path>/bundle --without development test
I haven't deliberately pointed anything at a mirror - do you have any idea how my install might have ended up doing so? This is with rubygems 1.8.11, bundler 1.0.21.


I'm getting exactly the same thing. Glad to know it's not just me.


For the time being I've rolled back the gem changes and applied the suggested hotfix (removing XML from the default params parsers). I've been trying for a solid hour to get a deploy out with the updated rails version, and it's just not having any of it.

Edit: finally got it out. This deploy model is completely screwed, though. It just shouldn't be normal to have a service like rubygems.org in the daily deploy loop. This is absolutely not a knock on the fantastic volunteers that run it - they simply shouldn't be dealing with this sort of load spike.


Registration is not a necessary condition for enforcement of trademark rights (although it obviously helps). In some jurisdictions, of which the USA is one, an established history of use can be sufficient to establish rights to a given mark.

So while iCloud Communications do not appear to have a registration for the marks, that doesn't mean they have no rights to them.

http://www.uspto.gov/faq/trademarks.jsp#_Toc275426680


My understanding was that registration was necessary if you wanted standing to defend your trademark in federal court. Yet here iCloud Communications is suing in US district court.


Maximise the value for whom? The data is being monopolistically exploited by a private entity, for private profit. Where is the value for the taxpayer there?

I think most people would agree that taxpayer value for public expenditure should be maximised, and were the money from the licences going directly into the public coffers, we could certainly have a discussion about whether taxpayer value is maximised by selling the data or giving it away for free.

That, however, is not what is happening here. As so frequently happens in the UK, the data is publicly funded, and the profits are privately realised.


The government pays the train operating companies to operate trains, not to feed data to iPhone apps, just as they're not paid to sell crisps and sodas from a small cart. This creates value (otherwise there would be no profit to extract), so why not?


The government pays the train companies to run a transport service; departure times are an essential part of that, in a way that crisps quite plainly (or quite saltedly, ha ha ha* ) are not.

I agree that large-scale API provision is added value, and I have no objection to a private company making a profit from that. What I object to is a private company being handed a monopoly on that publicly funded data, with no effective oversight of the terms under which they provide access to it.

ATOC, by dint of the exclusive licence granted to them by Network Rail, have complete control of the market for this publicly funded data. They are adding value by serving it as a reliable API, certainly; however, the price they are able to command has little relation to that value, because they have no competitors.

* sorry


But why should we have to pay to license data about our own trains that we pay for?


On the subject of try() chaining, there's always Ick, which implements (among other things) a kind of Maybe monad for Ruby. This acts as a self-propagating nil guard, so you're able to write things like:

  maybe(Person.find("geoff")) { |person| person.manager.authority_level.permissions }
without worrying about chaining things yourself. I've not used it in production code yet, as I haven't really had a chance to do due diligence on it - I'd be interested to hear if anyone else has used it in anger...

http://ick.rubyforge.org/
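For flavour, here's a hand-rolled sketch of that kind of self-propagating guard (this is not Ick's actual implementation, and the names are made up): wrap the value in a proxy that forwards each message, short-circuiting to nil as soon as any step in the chain is nil.

```ruby
# A message-forwarding proxy: every call in the chain goes through
# method_missing, and a nil at any step propagates to the end.
class Maybe < BasicObject
  def initialize(value)
    @value = value
  end

  def method_missing(name, *args, &block)
    result = @value.nil? ? nil : @value.public_send(name, *args, &block)
    ::Maybe.new(result)
  end

  # Unwrap the final result.
  def value
    @value
  end
end

def maybe(value)
  yield(::Maybe.new(value)).value
end

maybe(nil)   { |v| v.upcase.reverse }  # => nil, no NoMethodError
maybe("abc") { |v| v.upcase.reverse }  # => "CBA"
```

Note the block has to end with a call on the wrapped value so there's a proxy to unwrap; `public_send` needs Ruby 1.9+.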


Even better to avoid chaining altogether by using delegation (cf. Law of Demeter) and taking advantage of the :allow_nil option to Active Support's Module#delegate (http://api.rubyonrails.org/classes/Module.html#method-i-dele...).
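For anyone unfamiliar with the option, here's a rough, hand-written sketch of what a delegation like `delegate :authority_level, to: :manager, allow_nil: true` buys you (class names just follow the hypothetical example upthread):

```ruby
class Person
  attr_accessor :manager

  # Hand-written equivalent of
  #   delegate :authority_level, to: :manager, allow_nil: true
  def authority_level
    manager.nil? ? nil : manager.authority_level
  end
end

Manager = Struct.new(:authority_level)

person = Person.new
person.authority_level               # => nil rather than NoMethodError
person.manager = Manager.new(:admin)
person.authority_level               # => :admin
```

The chained lookup then collapses to a single `person.authority_level`, with nil safety handled where the association is declared rather than at every call site.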


There's a solution in Smalltalk that I'm surprised no one has brought to Ruby. In applications where this pattern is useful, there's a class, Null, with the singleton null (in contrast to UndefinedObject and its singleton nil). Sending a message to nil raises an exception, but null simply eats all messages sent to it, much like the maybe monad you listed.

This makes it really easy to mix and match the two paradigms in your code: in places where it's okay for the object to just eat messages, you return null. In places where no value is a real error, you return nil. Either value can be converted to the other with a one-line statement, and to existing code, null looks like nil when queried (i.e., isNil returns true, ifNil: and ifNotNil: and friends behave as if null were nil, etc.).

Seems as if writing a similar tool for Ruby would be trivial.
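Indeed - here's a minimal sketch of such a message-eating null in Ruby (NullObject and NULL are names I've made up for illustration):

```ruby
class NullObject
  # Swallow any message and return the null object again, so chains
  # like NULL.foo.bar never raise NoMethodError.
  def method_missing(*args, &block)
    self
  end

  def respond_to_missing?(*)
    true
  end

  # Let existing nil checks keep working, much as isNil does in the
  # Smalltalk version.
  def nil?
    true
  end
end

NULL = NullObject.new

NULL.foo.bar.baz    # => NULL, not an exception
```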


But then don't you have to make your choice about whether no-value is an error when you return the value, rather than when you make the call?

With #try et al it's made obvious at the point the message is sent whether it'll be swallowed by nil (and that you're fine with this). To me this seems safer than having two classes of entities knocking round, one swallowing messages and one not, and no way to tell at a glance which is which.

(I've not used Smalltalk btw, so apologies if I've misunderstood...)


You can use it either way. I usually employ it at return time, because there are often classes (lower-case c) of return types that I'm fine with being ignored. A list of objects that need to be signaled on an event, for example: it's fine with me if that's not initialized, so it's okay to return null there.

Sometimes, you want to do what you're saying, and have a brief snippet where you switch to message-eating null. That's easy enough using the built-in ifNil: message:

    (foo ifNil: [ null ]) baz quux frob: bar.
If this is common, it'd be easy to add a method "try" to Object that returned self, and one to UndefinedObject that returned null, at which point you could do the same thing as Ruby:

    foo try baz quux frob: bar.
So either way, really.
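The Ruby side of that already exists as Active Support's Object#try; a minimal hand-rolled sketch of the same idea (Active Support's real implementation differs):

```ruby
class Object
  def try(method, *args, &block)
    public_send(method, *args, &block)
  end
end

class NilClass
  # nil eats the message and stays nil instead of raising NoMethodError.
  def try(*)
    nil
  end
end

"hello".try(:upcase)    # => "HELLO"
nil.try(:upcase)        # => nil
```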


The Null Object pattern is used in RSpec and some other testing libraries for mocks.


A related effect, of course, is that it becomes practically impossible for politicians to plan for any sort of career, or parties to make personnel plans with more than a single-parliament horizon. Even a candidate with a hefty 60% support can expect only a single term at more than even odds.

Not only would this rob the legislature of experience, it would also mean politicians towards the end of each term would be focused on securing their next job, rather than performing their current one.


> Not only would this rob the legislature of experience,

I'm not sure it does, or at least not drastically: it robs the legislature of individual experience, but on average there's no reason to expect every politician to be serving for the first time. They could even run again later.

> it would also mean politicians towards the end of each term would be focused on securing their next job, rather than performing their current one.

This is a really easy problem to solve: You give a recently ousted MP a paid vacation after their term is up.

(This works better of course if you stagger elections rather than electing everyone at once, but it's fine without)


Better that than the current career plan for a politician:

- Oxford PPE

- Policy researcher

- Safe seat

- Minister

- Non-exec chairman/director


Well, possibly - but the point is that while ubiquitous safe seats are obviously detrimental, so is their elimination at all costs. The problem with safe seats isn't that popular politicians are consistently elected; it's that minority voices in those constituencies are marginalised. You can certainly address the latter by preventing the former, but it's a far from ideal way of doing so.

(Edited for ghastly sentence construction, sorry)


Och yes, I don't think I'd be quite so cavalier about my advocacy of this particular voting system if there was any chance of it actually being used.

(A bit like the one time I voted for the Scottish Socialist Party - I only did so for their entertainment value, not because they would have any likely control over policies).


I see from your profile you're based in Canada - here in the UK, I've found it's the regional redirects that are breaking things (because the regional sites aren't on the new setup yet). Assuming the same thing is affecting Canadian users, if you delete either the "ca." from the hostname or the #! from the path string, the link should work.


It seemingly applies everywhere except the US. Even though I now know the workaround, I still refuse to read their articles out of spite at their incompetence. It's all I can do to stop myself spitting at the screen.


The link works for me in Australia but the sidebar is positioned over the content.


Thank you for explaining why this is happening! I couldn't understand why literally none of the Gawker links work for me, on any computer... it seems unbelievable that they haven't fixed this yet.


I only wish I could up-vote this more. See also this earlier Language Log essay (http://itre.cis.upenn.edu/~myl/languagelog/archives/003366.h...), in which Pullum points out that passives occur in "Politics and the English Language" at a considerably higher rate than was typical in contemporary periodicals - Orwell couldn't follow his own advice, even as he was dispensing it.

The same goes for Strunk and White, who in the very passage instructing writers to abjure the passive, say:

"Many a tame sentence of description or exposition can _be made_ lively and emphatic by substituting a transitive in the active voice for some such perfunctory expression as /there is/ or /could be heard/."

Still, at least the guide linked in the OP has picked actual examples of the passive voice as illustrations, and concedes that there are times when it can be reasonably used (sorry, "writers can reasonably use it"). This makes it something of a rarity, even if the advice is still poor.


Certainly not when you're trying to judge energy payback horizons for a device incorporating silicon, nickel and cobalt, all of which come with significant costs of production. What if the degradation isn't linear, as seems very likely? Obviously it's a significant improvement, but 45 hours' operation tells us virtually nothing about the device's likely practicality.

The real problem here is that, as usual, we're forced to speculate about what should be straightforward facts, because the story in the OP is based on a dumbed-down press release relating to a talk we can't see, describing research we can't freely access (http://web.mit.edu/chemistry/dgn/www/pubs/publications_2011....), and as a result contains almost no actual information. So it's grandiose claims ("we could power the third world!") without any means of assessing whether they're even plausible.

None of this is to say the new cell isn't really interesting, of course. But it's just so typical of science churnalism.


It is indeed. Now if only Google Accounts would integrate with Google Accounts, everything would be spiffing.

</cheapshot>


Cheap but warranted. Because I use Google Analytics, my GAFYD account is in "migration limbo" and I can't use some newer Google services like the Android Web-Market-to-Phone installation feature, etc.


If you check out the relevant paragraph of the Directive itself (quoted below) you'll see that's just the BBC's rather odd interpretation. The Directive doesn't mention shopping cart cookies at all, but rather carves out an area of potential exemptions for information that is essential to the service the user is accessing.

The BBC's technology reporting veers wildly between the patronising, the embarrassing, and the just plain wrong. I think this is a stab at the former.

