3D Printing of a Gun. How can you have gun control if you can download your gun like music?

January 1, 2013

– The 3-D Printed Handgun –

Here is what gets to the heart of the argument. My father brought up technology: what happens with 3-D printers? Will the government be able to stop the transfer of a blueprint for a gun over the Internet any better than it can stop the transfer of music and video?
And then there’s Defense Distributed, a.k.a. the Wiki Weapon Project, the initiative cooked up by a University of Texas Law student and some of his buddies to 3-D print a working firearm. The group’s Indiegogo funding campaign was shut down in the early going and 3-D printer maker Stratasys revoked the lease on Wiki Weapon’s fabricator at one point, but through Bitcoin and other technology providers they’ve managed to keep the project alive and funded.

Last we saw the Defense Distributed boys out on the range, they were firing an AR-15 rifle with a 3-D printed lower receiver, not of their own design but one already available out there on the Web. They managed to get six rounds off before the plastic component broke, but in the process they learned a bit about how recoil and stress affect 3-D printed plastic. These guys seem pretty serious about bringing their own freely distributed, publicly available printable firearm design into being relatively soon, which could make 2013 an interesting year for the ethics and legal frameworks scrambling to keep up with accelerating 3-D fabrication technologies.

Fox News Blames Shooting on ‘Online Activities,’ ‘Gaming’

December 16, 2012

(gawker.com) Fox News anchor Megyn Kelly and analyst Keith Ablow got together on the network today to chat about some of the factors in today's tragic shooting in Connecticut. "You mentioned earlier how people lose themselves in online activities, gaming and what have you," Kelly remarks. "Reality TV is no friend of preventing such things," Ablow responds. "Facebook is no friend of preventing such things." What fantastic, nuanced analysis. (I didn't shoot anyone.)

Just great. The UN wants to take over the Internet.

December 7, 2012

From Al Arabiya/AFP:

Telecom companies across the world may be given the opportunity to dig through data passed across the Internet more easily following a move to allow the United Nations new authority to regulate the Web.
At a conference in Dubai this week, members of the United Nations' International Telecommunications Union (ITU) agreed to work towards implementing a standard for the Internet that would allow for eavesdropping on a worldwide scale.
The ITU members decided to adopt the Y.2770 standard for deep packet inspection, a top-secret proposal by way of China that will allow telecom companies across the world to more easily dig through Web data, according to a report from Russia Today.
The gathering of the U.N.'s International Telecommunications Union, which opened this week in Dubai, has triggered fierce objections from Washington, and from Internet freedom activists who fear new rules that could end the freewheeling system of the Internet.
The U.S. House of Representatives voted unanimously on Wednesday to oppose any efforts to give the United Nations new authority to regulate the Internet.
The 397-0 vote, following a similar vote in the Senate, came as delegates were meeting in Dubai to revise a global telecom treaty, a gathering which some say could be used to impose new controls on the Internet.
Representative Greg Walden said ahead of the vote that lawmakers should “send a strong bipartisan, bicameral signal about America’s commitment to an unregulated Internet.”
He said Washington should not "stand idly by while countries like Russia and China seek to exert control over the Internet."
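For readers unfamiliar with the term, "deep packet inspection" can be illustrated with a toy sketch. To be clear, this is not the Y.2770 standard itself, and every packet field and payload below is invented for illustration. The point is the difference between a router looking only at a packet's addressing metadata and DPI reading the application-layer contents inside it:

```python
# Illustrative sketch only: shallow inspection reads addressing metadata,
# deep packet inspection (DPI) parses the payload itself. All values here
# are made up; this is not an implementation of ITU Y.2770.
packet = {
    "src": "203.0.113.5",
    "dst": "198.51.100.7",
    "protocol": "TCP",
    "dst_port": 80,
    "payload": b"GET /news/article-42 HTTP/1.1\r\nHost: example.org\r\n\r\n",
}

def shallow_inspect(pkt):
    # A traditional router needs only the addressing metadata to do its job.
    return {"src": pkt["src"], "dst": pkt["dst"], "port": pkt["dst_port"]}

def deep_inspect(pkt):
    # DPI opens the application-layer payload, e.g. to learn which URL
    # an HTTP request is fetching.
    request_line = pkt["payload"].split(b"\r\n")[0].decode()
    method, path, _version = request_line.split(" ")
    return {"method": method, "path": path}

print(shallow_inspect(packet))  # addressing metadata only
print(deep_inspect(packet))     # {'method': 'GET', 'path': '/news/article-42'}
```

The difference is the whole controversy: routing traffic requires only the shallow view, while the deep view reveals what a user is actually reading or saying.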

The Internet controlled by the same countries that control the UN?
Even though it is unlikely to happen anytime soon, in a decade or two this could become a major problem.
Just imagine what Muslim countries could band together to do to the Internet.

Your Facebook Comments, Coming Soon to a Google Search Near You

November 3, 2011
(h/t Republican Jewish Coalition / WebMonkey.com) Mind what you say in Facebook comments; Google will soon be indexing them and serving them up as part of the company's standard search results. Google's all-seeing search robots still can't find comments on private pages within Facebook, but now any time you use a Facebook comment form on other sites, or on a public page within Facebook, those comments will be indexed by Google.

The new indexing plan isn't just about Facebook comments; it applies to nearly any content that's previously been accessible only through an HTTP POST request. Google's goal is to include anything "hiding" behind a form: comment systems like Disqus or Facebook, and other JavaScript-based sites and forms. Typically when Google announces it's going to expand its search index in some way, everyone is happy: sites get more searchable content into Google, and users can find more of what they're looking for. That's not the case with the latest changes to Google's indexing policy. Developers are upset because Google is no longer the passive crawler it once was, and users will likely become upset once they realize that comments about drunken parties, embarrassing moments, or what they thought were private details are going to start showing up next to their names in Google's search results.

For now most of the ire seems limited to concerned web developers worried that Google's new indexing plan ignores the HTML specification and breaks the web's underlying architecture. To understand what Google is planning to do and why it breaks one of the fundamental gentleman's agreements of the web, you first have to understand how web requests work. There are two primary requests you can initiate on the web: GET and POST. In a nutshell, GET requests are intended for reading data, POST for changing or adding data. That's why search engine robots like Google's have always stuck to GET crawling.

There's no danger of the Googlebot altering a site's data with GET; it just reads the page, without ever touching the actual data. Now that Google is crawling POST pages, the Googlebot is no longer a passive observer: it's actually interacting with, and potentially altering, the websites it crawls. While it's unlikely that the new Googlebot will alter a site's data (as the Google Webmaster Blog writes, "Googlebot may now perform POST requests when we believe it's safe and appropriate"), it's certainly possible now, and that's what worries some developers. As any webmaster knows, mistakes happen, especially when robots are involved, and no one wants to wake up one day to discover that the Googlebot has wreaked havoc across their site.

If you'd like to stop the Googlebot from crawling your site's forms, Google suggests using the robots.txt file to disallow the Googlebot on any POST URLs your site might have. So long as you're surfacing your content in other ways (and you should be, provided you want it indexed), there shouldn't be any harm in blocking the Googlebot from POST requests. If, on the other hand, you'd like to stop the Googlebot from indexing any embarrassing comments you may have left on the web, well, you're out of luck.
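The GET/POST distinction the article leans on can be made concrete with a short sketch. This is an illustrative toy server, not Google's crawler or any real site: a GET request leaves the server's data untouched, while a POST adds to it, which is exactly why a crawler that starts issuing POSTs is no longer a passive observer.

```python
# Toy demonstration of why crawling GET is safe and crawling POST is not:
# GET only reads the server's state, POST mutates it. Everything here
# (the server, the "comments" store) is invented for illustration.
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

comments = []  # server-side state a crawler could accidentally change

class CommentHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Reading the comment list changes nothing on the server.
        body = json.dumps(comments).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def do_POST(self):
        # Submitting the form adds a comment: a robot doing this
        # would be altering the site's data.
        length = int(self.headers["Content-Length"])
        comments.append(self.rfile.read(length).decode())
        self.send_response(201)
        self.end_headers()

    def log_message(self, *args):
        pass  # keep the demo's output quiet

server = HTTPServer(("127.0.0.1", 0), CommentHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/"

urllib.request.urlopen(url)                         # GET: a passive read
print(len(comments))                                # 0 -- nothing changed
urllib.request.urlopen(url, data=b"first comment")  # POST: an active write
print(len(comments))                                # 1 -- the server's data changed
server.shutdown()
```

The robots.txt remedy the article mentions works the same way it always has: a `Disallow:` line for the form-handling URLs tells a well-behaved crawler to leave those endpoints alone.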

I’ve nothing to lose at this point, but the rest of you do