A few days ago we repointed gems.rubyforge.org to the gemcutter.org box.
This means that Nick Quaranto’s excellent gemcutter app is now indexing
and serving all the gems - so rather than having two gem indexes, we now
have one. As a consequence of this, when you release files to
RubyForge, you will probably also want to do a “gem push” to get them up
into gemcutter and into the main gem index.
Sorry we didn’t post this here at the time. There was a certain amount
of tweeting and whatnot but this is the place where these things should
be announced.
Tom, does RubyForge have a mechanism for bulk emailing all accounts? I think there are a lot of Ruby hackers who don’t read ruby-talk these days.
Honestly, even if there was an announcement here, I might have missed it; I’m just lucky to have vigilant users on Prawn who noticed our gem hadn’t updated.
Yup, there’s a bulk email gizmo that can create a message and then a cron thingy to email everyone. Hm… looks like it has an option for only sending to project admins, which would limit the volume. I vaguely recall that it sends an email to the admin of each project, though - so if you have 10 projects you get 10 emails. Hm, let me see if I can do something with that…
I’m fine getting 10-20 emails on this if it is a one-time thing. Especially for something this big/important.
Yeah, in fact, that might be a ‘feature’ here.
Hm, good point. OK, I’ll do one to all project admins… Postfix,
prepare yourself…
I don’t think there was much of any discussion or announcement on
ruby-talk about moving gems off rubyforge and having gemcutter take
over.
No, there wasn’t; all the communication about the move and the decision happened in the background.
Just to summarize:
August 26, 2009:
Nick Quaranto posted his proposal to replace the canonical gem serving with Gemcutter.
October 26, 2009:
Nick Quaranto announced the conversation with Ruby Central about shutting down certain RubyForge services, with Gemcutter taking over some of them:
Of course, there was a lot of Twitter noise about it, and a lot of disruption around the gem service that mistakenly got reported to the RubyInstaller and RubyGems projects.
A trend or practice that seems to be emerging in Ruby-land:
Honestly, even if there was an announcement here, I might have missed it; I’m just lucky to have vigilant users on Prawn who noticed our gem hadn’t updated.
I think the hope was that for most people nothing would change, since gems.rubyforge.org was repointed to the new box. So “gem install rails”
keeps on working. Also, Ruby Central is running the gemcutter app
instance that’s serving everything, so that’s the same also. Many gem
publishers are affected by the move… although many had already moved
over to gemcutter (or first to github and then to gemcutter). Hopefully
everyone else will see the recent ruby-talk traffic and the mass email
to all RubyForge admins.
I guess the idea was that this was more of a service improvement that
most Ruby users wouldn’t notice, except for the improvements. Instead
of a static gem index with a 5 minute delay and no user interface,
there’s now an excellent site that can actually be searched effectively.
But I agree - more comms on ruby-talk would have been good.
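To make the “nothing changes for most users” point concrete: RubyGems installs from whatever URLs are in its configured source list, so repointing the old default host meant existing clients needed no configuration change at all. A tiny sketch (the exact output depends on the machine’s configuration):

```ruby
# Print the gem sources this RubyGems installation will query.
# After the DNS repoint, a client still listing the old default
# (http://gems.rubyforge.org) transparently reached the gemcutter box.
require 'rubygems'

sources = Gem.sources.to_a.map(&:to_s)
sources.each { |url| puts url }
```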
And, sent.
It’s funny, the mass mailer sends emails ordered by user_id ascending.
So I’m seeing emails go out to all the folks who signed up for accounts
back in 2003… Paul B., Curt H… anyhow, good times.
Yup, please see my other post about some of the thoughts behind that.
Tom, I’m not arguing about why the communication happened in the background; I’m cool with Gemcutter and really like it.
The issue, from my point of view, is a different one.
I’ve closed several bug reports and responded to several emails about 500 and 403 errors when installing gems.
It’s not that these users didn’t connect properly to the new server; they had an older version of RubyGems, combined with other issues that I couldn’t replicate.
A better heads-up, not just for Ruby gem authors but for the community as a whole, using channels like RubyForge, Ruby-Lang, and ruby-talk, would have been great.
Anyhow, I’ve canned a template answer with links to my own posts and the Gemcutter ones for the time being.
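For what it’s worth, a sketch of the kind of check behind those reports: clients running a RubyGems older than roughly 1.2 still rely on the legacy yaml index, which the new server didn’t serve. The 1.2.0 cutoff here is my assumption for illustration, not an official number:

```ruby
# Guess at the triage check: RubyGems clients older than ~1.2 still
# fetch the legacy yaml index, which the new server didn't provide.
# The 1.2.0 cutoff is an assumption for illustration, not an official one.
require 'rubygems'

def legacy_index_client?(rubygems_version)
  Gem::Version.new(rubygems_version) < Gem::Version.new('1.2.0')
end

puts legacy_index_client?('1.0.1')   # an old client
puts legacy_index_client?('1.3.5')   # current at the time
```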
Thank you for your time answering my comments.
Luis, you’re absolutely right, that’s a hole in the new system - it doesn’t generate yaml.Z and yaml, so old RubyGems clients can’t work with it. And of course you’re on the receiving end of the complaints about that… you’re right, that is a bummer. I wonder if there’s something we can do about it… maybe pull the gems off S3 and generate those old indexes once a week or something.
As you said, comms could have been better throughout this process. Something for us to improve in the future.
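A rough sketch of what such a weekly job would have to emit, assuming the legacy layout of a plain YAML index plus a zlib-deflated copy (the index contents here are placeholders, and the real indexer produces several more files than this):

```ruby
# Sketch of the legacy index pair old clients expect: a plain YAML
# index ('yaml') plus a zlib-deflated copy ('yaml.Z'). Index contents
# here are placeholders; the real indexer emits several more files.
require 'yaml'
require 'zlib'

index = { 'mygem' => ['1.0.0'] }.to_yaml   # placeholder index data

File.write('yaml', index)
File.write('yaml.Z', Zlib::Deflate.deflate(index))
```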
Nick Quaranto has done some work on gemcutter to support these older
indexes at least for “gem update --system”: