Hacker News from Y Combinator

Links for the intellectually curious, ranked by readers. // via fulltextrssfeed.com
Updated: 19 min 14 sec ago

FBI Case Against Aaron Swartz Declassified and Released for the First Time

19 min 14 sec ago

Aaron Hillel Swartz (November 8, 1986 – January 11, 2013) was an American computer programmer, writer, political organizer and Internet hacktivist.

Swartz was involved in the development of the web feed format RSS, the organization Creative Commons, the website framework web.py and the social news site Reddit, in which he became a partner after its merger with his company, Infogami. Swartz's later work focused on sociology, civic awareness and activism.

He helped launch the Progressive Change Campaign Committee in 2009 to learn more about effective online activism. In 2010 he became a research fellow at Harvard University's Safra Research Lab on Institutional Corruption, directed by Lawrence Lessig.

He founded the online group Demand Progress, known for its campaign against the Stop Online Piracy Act. On January 6, 2011, Swartz was arrested by MIT police on state breaking-and-entering charges, after systematically downloading academic journal articles from JSTOR. Federal prosecutors later charged him with two counts of wire fraud and 11 violations of the Computer Fraud and Abuse Act, carrying a cumulative maximum penalty of $1 million in fines, 35 years in prison, asset forfeiture, restitution and supervised release.

Two years later, two days after the prosecution denied his lawyer's second offer of a plea bargain, Swartz was found dead in his Brooklyn, New York apartment, where he had hanged himself. In June 2013, Swartz was posthumously inducted into the Internet Hall of Fame.

Originally, this page was set up to house the FBI File of Aaron Swartz; however, the Secret Service also released a cache of records pertaining to the internet activist.

Below, you will find the original story published by The Black Vault, along with the FBI File, and the records released by the Secret Service thus far.

FBI Records

FBI File of Aaron Swartz [ 25 Pages, 1.77MB ]

 FBI File on the case against Aaron Swartz (Case #288A-WF-238943) [ 87 Pages, 4.71MB ]

 FBI File on the PACER system being compromised, Case #288A-WF-238943 [ 221 Pages, 6.9MB ]

Secret Service Records

 Secret Service Response to Request for Aaron Swartz' Records Part 1 [ 104 Pages, 4.38MB ]

 Secret Service Response to Request for Aaron Swartz' Records Part 2 [ 26 Pages, 1.4MB ]

 Secret Service Response to Request for Aaron Swartz' Records Part 3 [ 379 Pages, 22.59MB ]

 Secret Service Response to Request for Aaron Swartz' Records Part 3 - Spreadsheet [ 4,067 Pages, 7.71MB ]

 Secret Service Response to Request for Aaron Swartz' Records Part 3 - Photographs [ 190 Pages, 29.81MB ]

 Secret Service Response to Request for Aaron Swartz' Records Part 4 [ 1 Page, 131kb ]

 Secret Service Response to Request for Aaron Swartz' Records Part 5 [ 1 Page, 121kb ]

 Secret Service Response to Request for Aaron Swartz' Records Part 6 [ 1 Page, 128kb ]

 Secret Service Response to Request for Aaron Swartz' Records Part 7 [ 7 Pages, 450kb ]

 Secret Service Response to Request for Aaron Swartz' Records Part 8 [ 237,397 Pages, 424.35MB ]

 Secret Service Response to Request for Aaron Swartz' Records Part 9 [ 90 Pages, 4.21MB ]

 Secret Service Response to Request for Aaron Swartz' Records Part 10 [ 259 Pages, 21.54MB ]

 Secret Service Response to Request for Aaron Swartz' Records Part 11 [ 17 Pages, 930kb ]

Secret Service Response to Request for Aaron Swartz' Records Part 12 (Not yet released by the Secret Service)

 Secret Service Response to Request for Aaron Swartz' Records Part 13 [ 254 Pages, 19.76MB ]

 Secret Service Response to Request for Aaron Swartz' Records Part 14 [ 9 Pages, 0.7MB ]

 Secret Service Response to Request for Aaron Swartz' Records Part 15 [ 13 Pages, 1.2MB ]

 Secret Service Response to Request for Aaron Swartz' Records Part 16 [ 547 Pages, 21.64MB ]

 Secret Service Response to Request for Aaron Swartz' Records Part 17 [ 403 Pages, 12.69MB ]

 Secret Service Response to Request for Aaron Swartz' Records Part 18 [ 2 Pages, 0.15MB ]

Tim Berners-Lee slams Internet fast lanes: ‘It’s bribery.’

19 min 14 sec ago
By Brian Fung September 19 at 10:51 AM

A quarter-century ago, Timothy Berners-Lee designed the world's first Web browser and server, kicking off a thing that people started calling the World Wide Web.

In a visit to The Washington Post on Thursday, Berners-Lee said that system is now in danger from Internet service providers (ISPs) that stand to amass too much power over what was intentionally built as a decentralized network — one where no single actor could dictate outcomes to everyone else.

Berners-Lee pushed back against opponents of net neutrality regulation who argue that applying new rules on ISPs is tantamount to regulating the Internet. But there's a difference between regulating providers of broadband and the services that run on top of it, said Berners-Lee. Strong net neutrality rules would help preserve that line dividing the two and limit the incentive of ISPs to meddle in the market for services.

"A lot of congressmen say, 'Well, sign up for the free market' and feel that it's just something you should leave to go by itself," said Berners-Lee. "Well yeah, the market works well so long as nobody prints money. So we have rules, okay? You don't steal stuff, for example. The U.S. dollar is something that everyone relies on. So the government keeps the dollar a stable thing, nobody steals stuff, and then you can rely on the free market."

When Berners-Lee built the Web, he took the telephone wire coming out of his wall, plugged it into his computer and could instantly connect to any other computer. He didn't have to ask his telephone company's permission to introduce a new feature, he said.

But the rules currently being deliberated by the Federal Communications Commission, which would tacitly allow ISPs to charge content companies for priority access to consumers, would change how easily inventors can spread their ideas. In such a future, Berners-Lee warned, new technologies and companies might crop up faster in other countries if services were forced to "bribe" their way to success.

"We need rules," said Berners-Lee. "If businesses are to move here and start here rather than start in Europe or Brazil or Australia — they're going to look around and make sure, 'Oh, does the power stay up?' And they'll look for other things. "Is the Internet open?' Will they have to effectively bribe their ISPs to start a new service? That's what it looks like from the outside. It's bribery."

To many consumers, "the Internet" is simply the collection of applications and services they use on a daily basis, not the technical equipment and business relationships that help shape the economics of the Web. And to Berners-Lee, that's a good thing: It lets people go on with their lives and keep the rest of the economy running.

"How the technical bit — all the deals about peering all that — is really complicated and difficult," said Berners-Lee. "That is something normal people in the street aren't going to understand — and they shouldn't have to! If you have to start understanding what's happening inside, then the Internet has failed already."

Brian Fung covers technology for The Washington Post.

Implementing XKCD #688

19 min 14 sec ago
readme.md

XKCD #688 is meta -- it includes information about itself. Why not try and recreate it?

Of course, there are certain inaccuracies. I didn't try and make the bars the exact same width and the like. The last panel doesn't accurately portray the comic, as the resolution isn't high enough. All that aside, this script works in general.

It turns out that this is an example of fixed-point iteration. In essence, that's just trying to estimate the original function (which just says this pixel is black, this pixel is white). The core of this algorithm is just a for-loop; the function always returns a correct value for the current estimate! As you might guess (I didn't), this estimate might become unstable.

According to Prof. Jarvis Haupt, this algorithm could possibly become unstable. Because the third panel has an infinite regression, it might be possible to set up a non-contractive map where the sequence grows too much and never converges. I couldn't get my comic to be unstable; I'm guessing I could have made long and painful modifications to the code to change it.

This is abstract and non-intuitive; you can't see it in the real world. So let's instead pretend we have a similar equation:
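x_{n+1} = r * x_n * (1 - x_n), with r = 3.7 (the logistic map)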

To implement this equation, we would only do

def f(x):
    return 3.7 * x * (1 - x)

x = 0.5
while True:
    x = f(x)

This loop doesn't appear to be unstable, but after you examine the first couple of values, you can see it clearly is. The first five values of x are approximately 0.925, 0.257, 0.706, 0.768 and 0.659 -- they don't seem to converge to any particular value.

This problem gets incredibly complex when you consider any r instead of just 3.7. For this particular and simple function, there's a graph of values of r and the final settling value:


"LogisticMap BifurcationDiagram" by PAR - Own work. Licensed under Public domain via Wikimedia Commons.

As you can see, for certain values of r this algorithm is unstable, and for other values it is stable. For some values of r (say r=3.2), the algorithm oscillates between two values. Of course, there's a whole field of study behind this: stability and chaos experts study similar problems in great depth. I don't know the theory well enough to explain the full stability picture, especially why there are stable regions surrounded by unstable regions.
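To see these regimes for yourself, here's a minimal sketch (mine, not part of the original script) that iterates past the transient for a few values of r and prints the set of values the sequence visits afterward:

def settle(r, x=0.5, warmup=200, keep=8):
    # Run past the transient, then record a handful of iterates.
    for _ in range(warmup):
        x = r * x * (1 - x)
    seen = set()
    for _ in range(keep):
        x = r * x * (1 - x)
        seen.add(round(x, 3))
    return sorted(seen)

for r in (2.8, 3.2, 3.5, 3.7):
    print(r, settle(r))
# 2.8 settles to a single value, 3.2 oscillates between two,
# 3.5 cycles through four, and 3.7 never settles down (chaos)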

This is a classic example of the confusion that is found in almost all of mathematics. It's a simple concept (just a for-loop!) that has very deep theory behind it (when is it chaotic?).

Biggest Mistakes in Web Design 1995-2015

19 min 14 sec ago

I've gathered what I think are the biggest web design mistakes committed during the period 1995 to 2015. Yes, it is a little facetious to say these mistakes will be made in the year 2015, but it's human nature to repeat your mistakes over and over <grin>.

I've added more material to the article—especially the sections on Contrast, Graphics, Flash, JavaScript and Text. I've added more videos and screenshots, including those of sites that have changed. I have proof of how bad they used to be.

Some mistakes I'll discuss aren't actually design mistakes in the classic sense—ugly graphics, bad navigation, etc.—but serious big picture problems.

1. Believing People Care About You And Your Website.

These women are laughing at you. Why? You designed your website for your needs, not theirs. It gets worse. After they stop laughing, they're going to one of your competitors' sites to buy something.

Here's an email I received about the topic:

Powerhouse is a UK electrical goods retail store. We knew they had a nice bread maker at an even nicer price, so (we) went to their website to see if we could buy it.

Because we use Firefox, we weren't allowed in. (Note: The site has disappeared. Did a Firefox-free environment cause this to happen? I don't know, but I'm sure you don't want to find out what happens if you keep potential customers out of your site. — vf)

Comet's website worked a treat and they have our money now! (Ironically they're gone, too, though I suspect not for web design reasons).

Write these two sentences where you can see them as you're working:

  1. The only reason my website exists is to solve my customers' problems.
  2. What problems does the page I'm looking at solve?
Nobody Cares About You Or Your Site.

Really. What visitors care about is solving their problems. Now. Most people visit a website to solve one or more of these four problems:

  1. They want/need information
  2. They want/need to make a purchase / donation.
  3. They want/need to be entertained.
  4. They want/need to be part of a community.

Too many organizations believe that a website is about opening a new marketing channel, getting donations, promoting a brand, or increasing company sales by 15%. No. It's about solving your customers' problems. Have I said that phrase enough?

If there ever was a site that was not about meeting their customers' needs, it has to be the Association of International Glaucoma Societies. They've "fixed" the site, but here is a video catching them in the act. The YouTube version is below.

2. A Man From Mars Can't Figure Out What Your Website Is About In Less Than Four Seconds.

You should be able to look at the home page of any site and figure out what the site is about in less than four seconds. If you can't, the site is a failure. A current example of a site that would confuse a man from Mars is Genicap.

People who make Mistake #1 often end up making Mistake #2. An example of a site that fails the Four-Second Test is an older version of the Mars Hill site (shown below).

Is this an African sewing cooperative? Are they trying to sell us something? What is this site about? Who knows? Who is going to care enough to stay around and find out?

The name of the organization (Mars Hill) and the "tag line" (we are beginning a new season of covenant) tell you nothing. Non-profit organizations are the worst offenders when it comes to names and tag lines. Here's a typical non-profit organization's name and tag line:

      Big Hands of Hope
      – It's all about compassion

No. It's all about solving your visitors' problems. Nothing in the name or tag line tells you this organization helps African children.

Here's an over-the-top example of a name and tag line that's better:

      Save the African Children
      – We keep them from dying a horrible death

Yes, you must tone down the tag line, but at least you understand the mission of the organization.

back to top

3. Contrast, Dammit.

According to Wikipedia: "Contrast is the difference in visual properties that makes an object (or its representation in an image) distinguishable from other objects and the background." According to Vincent Flanders: "Without proper contrast, visitors to your site can't read the text and if they can't read it, they will leave it." Here's a website that explains the need for contrast — and it's done visually.

Of the 16 mistakes, this one is the most mystifying. How is it possible not to notice that you can't read the text on a page?

The image below is a piece of the menu on one of the subpages at Tampax.com. If you click the image, you'll see a full-size version. (The file was saved at 99%, so there's no loss in quality.)

Can you read it? Not really.

Harmony Central had an excellent forum entry called Why Do Web Designers use Light Gray Type on a White Background? The second comment (by "Slight Return") hits the nail on the head—designers know their content so they can read it even if they can't really see it.

Many otherwise excellent web designers fall victim to Gray, Light, and Bad Type or "GLBT." One site, that will remain somewhat anonymous, had an article about an interesting use of PHP (he's a programmer, not a web designer). Right away, I knew I was looking at a GLBT site.

Click the image below to see what I saw.

I ran the page through AccessColor and, if you click the image below, you'll see how badly the site failed to have sufficient contrast.

73.93% of the text fails to meet one of the contrast standards.
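The arithmetic behind these checkers is simple. Here's a minimal Python sketch of the WCAG 2.0 contrast-ratio formula, the standard that tools like AccessColor automate (this is the published formula, not any tool's actual code):

def relative_luminance(rgb):
    # WCAG 2.0: linearize each sRGB channel, then weight and sum.
    def lin(c):
        c /= 255.0
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (lin(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    brighter, darker = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (brighter + 0.05) / (darker + 0.05)

# Light gray type on a white background, the classic mistake:
print(round(contrast_ratio((170, 170, 170), (255, 255, 255)), 2))
# prints 2.32, far below the 4.5:1 minimum WCAG requires for body text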

More Bad Contrast

Another example of bad contrast comes from The Wedding Lens (click picture for original size image).

Do you see the problem? If not, maybe this screen shot will help. If not, maybe a screenshot of the online Colour Contrast Analyser's results will help.

Learn About Contrast

There's no excuse for contrast problems. If you won't take my word for it, take A List Apart's. Between the two of us, you've got the omega and alpha of web design telling you to wo/man up and make your site readable. If you further doubt the importance of contrast, visit Contrast Rebellion.

For the final word, here's a very good article.

back to top

4. Using Web Design Elements That Get In The Way Of The Sale.

Would you do this?

You're a fundraiser getting ready to make the ask for a large sum of money. You're the best fundraiser on the planet because you have a pitch that can move the heavens.

You walk into the donor's office, introduce yourself and place an information packet in front of the donor.

As you start to make your big pitch, the donor reaches into the packet, extracts the pledge form you hope he'll sign and grabs a pen.

As the donor starts to sign the lucrative, long-term pledge, you reach across the table, grab the donor by the throat and yell, “Not so fast, jerkface! I haven't finished my pitch!!!”

You wouldn't do that, would you? Then why are you using design techniques that keep the visitor from getting to the sale? These design techniques are the web equivalent of grabbing someone by the throat because they violate the golden rule of doing business on the Web — "Don't do anything that gets in the way of the sale."

Here's an example of getting in the way of the sale — literally — as performed by Jakob Nielsen. Well, somebody at the Nielsen Norman Group. As the person who sent this mistake in to me said in his email:

So I get this link on a newsletter that informs me that Dr. Nielsen will be holding usability conferences around the globe - one in a city only a few hours drive from where I live. Clicking the link that's 2/3rds the way down the newsletter, I find myself on a nice, concise and easy to navigate page that provided me with all the information I need.

Sold on the idea of attending, I see and click one of the links on the left column menu under each conference city entitled "Registration" for the locale of my desire - only to find that I can't register.

When you click on the link, you're given the ubiquitous basic authentication username and password dialog with no instructions on how to enter said page. They want me to register for their conference, but they've thrown down a frustrating step into the process that forces a user to come back some other day.

I suspect this occurred by not testing all the links on a machine that wasn't already authenticated. Even if this encumbrance occurs for some other reason, it's not the best way to show one's expertise in the area of usability.

Some of the many, many other techniques that get in the way of the sale: Splash Pages, FlashSplash Pages (Video), animations, lack of focal point on the page, too much text, too little text, too many pictures, etc. See any of my books or any article on this site for more examples.

When people arrive at your site, it's because they've made a commitment. They've clicked a link or an ad and now they are at your site so you don't have to try to seduce them. Let them in your site.

On the other hand, seduction is necessary when you buy ads on other sites and search engines. You have to seduce people before they click.

back to top

5a. "My (Blog / Website / Facebook Fan Page / Twitter Account) Is Everything."

I was hanging out at a non-web design conference and two of my friends sat down and wanted to talk about their websites. One of them kept loudly saying, "My blog is everything" until he got tired of me trying to tell him otherwise.

It's possible that his blog "is everything," but you can only make that statement when you've tried a website, Facebook fan page, Twitter account, and whatever other marketing activities are appropriate. For all I know, his "everything" might actually be Google ads and a simple landing page. Believing there's only one, true path is dangerous.

5b. Thinking your website is your marketing strategy.

Unless you're an online shop selling t-shirts, cameras — you get the picture — your website is not your marketing strategy. Your website is part of your marketing strategy.

If you take orders over the phone, don't get rid of your phone banks. If you're successfully using direct mail, don't stop. Heck, if the Yellow Pages are working for you, continue to use them. The hard part is finding where your website fits in your marketing strategy.

Here's an email I received from a dear friend who was consulting for an organization that was going to put all of its eggs into one web basket:

 I have to tell you that I attended a board meeting today for the organization whose website you checked out for me.

The board consists of high-end people who had flown in from all over the country! When push came to shove, they asked me what I thought about their ability to raise money by driving people to the website.

I shared your response with them. Silence in the room. And then a couple of other board members acknowledged that it needed work, affirming that they had the same impression but didn't have the expertise to say anything.

You can't put all your eggs in one electronic basket.

back to top

6. Have You Ever Seen Another Website? Really? Doesn't Look Like It.

I usually don't let bad design affect me, but there's one mistake that really gets under my skin. I don't understand how it's possible to create web sites that look like car wrecks on the information highway.

Sites like Accept Jesus, Forever Forgiven puzzle me. Hasn't anyone at this organization seen another web site? Have they been to Amazon.com or the ASPCA or even to Catholic.net (which sucks, but is infinitely better than AJFF)?

I have a selection of these types of sites in an article called Over-the-top Websites.

7. Navigational Failure.

All web navigation must answer these questions:

Where am I?
Where have I been?
Where can I go next?
Where's the Home Page?
Where's the Home Home Page?

Navigation must be simple and consistent.

Common mistakes include different types of navigation on the same site, a link to the current page on the current page (home page link on home page), poorly worded links so the visitor doesn't know where he'll go if he clicks, no links back to the home page and confusing links to the home page.

A problem that isn't discussed much is the order of the navigational items. Many sites set the link order based on their own needs. In this example, the links make sense to this organization (Home, About Us) and they believe the graphics give a down-home feel. If the links were based on their users' needs, the links would be in a different order and the pictures would be replaced with something more helpful.

There are millions of ways to screw up navigation. This video of the Tampax web site shows you one unusual way to screw up navigation.

back to top

8. Using Mystery Meat Navigation.

While there are 10 million ways to screw up your navigation, the best way to screw it up is to use Mystery Meat Navigation (MMN). Here's my definition:

Mystery Meat Navigation occurs when, in order to find specific pages in a site, the user must mouse over unmarked navigational "buttons" — graphics that are usually blank or don't describe their function. JavaScript code then reveals what the real purpose of the button is and where it leads.

Wikipedia used to have a great definition of MMN:

Mystery meat navigation (also abbreviated MMN) is a term coined and popularized by author, web designer, and usability analyst Vincent Flanders to describe user interfaces (especially in web sites) in which it is inordinately difficult for users to discern the destinations of navigational hyperlinks—or, in severe cases, even to determine where the hyperlinks are. The typical form of MMN is represented by menus composed of unrevealing icons that are replaced with explicative text only when the mouse cursor hovers over them.

Flanders adopted the epithet mystery meat because, like the unidentifiable processed meat products historically served in many American public school cafeterias, MMN is unfathomable to the casual observer. Before conceiving the term mystery meat navigation, Flanders temporarily described the phenomenon as Saturnic navigation, a phrase named for the Saturn Corporation, whose web site formerly served as a high-profile example of this web usability problem.

Certain types of sites are allowed to use MMN: music, band, movie, art, experimental, fashion — sites where making an impression or being cool is mandatory.

Another exception is what I would call "cult sites" — sites that are so popular with a specific group that their audience automatically commits the icons to memory. The old version of Slashdot immediately comes to mind. The stupid Path app interface is another example.

The problem with MMN is it influences designers and companies who aren't smart enough to realize they're not in the music, art, movie, or fashion business. When a manufacturing company (video) starts using MMN, you know the apocalypse can't be too far off. The University of Calgary used MMN for unknown reasons. Of course, they changed their site — improving it in the process, but they can't hide their crimes. The video below shows you what the University of Calgary looked like when it used MMN.

It's not just small companies and universities that suddenly become brain damaged. Big corporations also act like lemmings. The example below is from an older version of Qualcomm. Yes, that Qualcomm. You know, the company with a market cap of 113 billion dollars.

back to top

9. Your Website Lacks Heroin Content.

In his classic book Naked Lunch, which I read when I was 15, William Burroughs described heroin as the ultimate product. Why? Because people would crawl through the sewers and beg to buy it. In the non-drug world, there are very few products that can be classified as having heroin's appeal.

How many web sites have Heroin Content?

Heroin Content's characteristics vary by type of site — but you'll know it when you see it. One global characteristic, though, is frequently updated content. The best way to get people to come back to your site again and again is by having content they need, and then updating this content on a regular basis.

How do you create Heroin Content? The answer depends on the likes and dislikes of your audience. Remember, it's what your audience wants that counts. What I consider Heroin Content is somebody else's Quinine Content.

Here are some thoughts about web content.

  • Does your content solve your customers' problems or does it create problems?
  • Does your content match your audience's expectations?
  • Have you determined the purpose of your site?
  • Do you know your target audience?
  • Ask yourself: "What content do I have that would cause anybody in their right mind to visit my site a second, third, or fourth time?" This is extremely important. You might con (seduce) someone to visit your site once, but why would they want to come back a second, third, or fourth time? If you can't answer this question, you really shouldn't have a web site.
  • Is the content technically correct?
  • Does your customer need to know the content you're presenting?
  • Is the content current and updated frequently?
  • Can people find the content they're looking for?
  • Does my site have Heroin Content?

I just got through reading that Bill Gates wants to start a blog. Why would anyone in their right mind want to read it? Do you think it will contain Heroin Content? As Seth Godin brilliantly points out, blogs only work when they meet four of the following five conditions:

  1. Candor
  2. Urgency
  3. Timeliness
  4. Pithiness
  5. Controversy

Content Trumps Design. PostSecret is poorly designed. You have purple links on a black background, small text that doesn't contrast well with the background, lime green headers, a page that goes on and on, content that only changes once a week and a poorly designed logo. The postcards are often hard to read, especially on a small monitor.

The content, however, is extraordinary and ever since I discovered the site, I've visited it every week. Well, every week but the week I had brain surgery <painful grin>. This site is proof that content is much more important than design. Yes, my comments about the design are accurate, but meaningless because the site has Heroin Content.

Before you start saying, "My site also has Heroin Content so I don't have to worry about the design," let me point out a small fact. Your site doesn't have Heroin Content. Digg.com has it, YouTube borrows a lot of it, and Google is another site that has Heroin Content.

back to top

10. Forgetting The Purpose Of Text.

After ten years, you'd think web designers would understand how to use text, but they don't. Here are some helpful hints.

Text is Text. Don't use graphics or Flash for text. First, it increases the size of the page; second, it isn't search engine friendly; third, the graphics are often of poor quality and aliased (jaggy); and fourth, mistakes are hard to correct, as this example (originally created in Flash) demonstrates.

Gimme contrast. Web designers have fallen in love with creating text that doesn't contrast with the background.

One of the strangest set of text contrasts I've ever seen was at the University of Idaho Children's Center (it's been fixed), with blue and black text on a green background. It certainly hurt the eyes. They also alternated the color of their links — now that's a bad design technique you've probably never seen before.

As I said, they fixed their page, but the video below shows you the bad version.

If you're not sure about the contrast between the text and background at your site, check out your text contrast at JuicyStudio or AccessColor.

Don't use small text. Designers are also fond of using small text (especially on Flash sites). Hey, we're all getting older and as I often say, "If people can't see it, they will flee it." Here's an architectural firm that uses small graphics as text (as saved by Archive.org).

Keep text simple. The next video explains this concept better than I can.

Text Mistakes

There are lots of ways to misuse text on your website. The following checklist lists some of the different ways. Oh. If you check any of the items…well, you know what it means.

back to top

11. Too Much Material On One Page.

Yes, it's called a web page, but that doesn't mean you have to cram all your material on one page — just like this page does <grin>.

It's very easy to keep adding material to your home page until it gets out of control.

With so much content vying for attention, it’s difficult for the eyes to find the focal point. People get confused and they leave. A long web page means you have failed to organize your site properly — probably a combination of not planning your site and poor navigation. An example of too much material on one page is Arngren.net.

Oh. Yes, this page is too long. You're learning good web design by looking at bad web design.

back to top

12. Confusing Web Design With A Magic Trick.

Web design is the reverse of a magic trick. In a magic trick, you show the audience your right hand and perform the trick with your left. In Web design, you tell them where you’re going first—and then go there. People have expectations about web sites and they don't like surprises. It will certainly confuse them and it could make them want to leave and find a site that's less confusing.

If you're a dentist, your visitors expect your web site to look like it belongs to a dentist — not to someone who is going to the opera.

Here are web site photos of two dentists. Only one looks like what you expect a dentist to look like.

Speaking of magic tricks, links should be clearly labeled so your visitors won't be surprised when they click. I made the mistake of not labeling a link on one of my pages and I received the following email:

Because I use your site for design considerations, I access your site at work. You probably know where I'm going with this, but…it might not be a bad idea to warn folks when the content is going to be "racy."

Don't get me wrong, Sports Illustrated's swimsuit issue is tame (and in my view, the models are gorgeous), but some of us work in an environment of hyper-sensitivity! It would be nice to know when my PC is going to be non-PC so I can at least glance over my shoulder. (I know I could look at the address on the status bar to get a clue, but…)

Here's a link to what he thought was racy.

If you use a vague link description, or just say "Click Here" and don't tell people where they'll end up, they could be horribly surprised (and/or shocked and/or disgusted) when they click the link.

back to top

13. Misusing Flash.

Just as a raincoat is a tool that can be used for good or evil purposes, Adobe Flash can also be used for good or evil. It all comes down to how it's used and who is using it.

Unfortunately, there's a tendency to misuse Flash and because of space I can't go into every detail. Here's a typical scenario:

You have to watch a boring, soundless, twenty-second Flash introduction with no option to skip it. If you're still around when the content loads, the pain doesn't stop. There is a lovely eight- or ten-second delay between when you click one of the navigation options and when the content actually arrives.

Hugh MacLeod brilliantly sums up everything that I feel is wrong with Flash in one simple cartoon that is Not Suitable For Most Workplaces.

Annoying Flash Techniques I Have Witnessed
  • Forgetting to put a “Skip Intro” button, forcing visitors to see your stupid FlashSplash page every time they visit. The problem could be “solved” by setting a cookie so visitors only see the animation once unless they click a button to “play it again, Sam.”
  • Putting a “Skip Intro” button on the page. Of course, we all realize that a “Skip Intro” button signifies that the content on the page is worthless. Good Web designers only put content that must be viewed on a page. By giving them the option to skip this material, you're saying it's not worth seeing. If it isn't worth seeing, why do you have it on your site in the first place?

    No, I'm not trying to have it both ways. An introductory Flash animation is a Splash page. Splash pages, as we learned long, long ago, are not necessary. If you must have a “Skip Intro” button, make it big enough so people can see it and have it available as soon as the animation starts. Don't wait ten seconds to load the button. Here's a useful Flash video about Skip Intro.

  • Making people listen to music. If you have (original) music in your Flash animation, give people the option to turn off the music.

    And if people turn off the music on one page, it means they don't want to hear it on any other page. There are dozens of sites where the programmer hasn't figured out how to make the music stop on all pages. They have a “Stop the Music” button on each page. Arrgh!!! A good example that may not be work appropriate (I warned you) is a fashion site where if you turn off the music on the FlashSplash page and click "Enter", the music automatically comes back on. The designer should be whipped (unless s/he likes to be whipped).

  • Creating a “non-Flash” version of a site that still includes some Flash animation. If you have an HTML version of your Flash site, make sure there's no Flash. There are few things more stupid than using Flash in a non-Flash Web site.
  • Using Mystery Meat Navigation with Flash—taking one bad technique and making it four times worse.
Proper Uses of Flash I Have Witnessed

There are many sites that understand how Flash should be used. The New York Times brilliantly used Flash and MMN (may be the only brilliant use of MMN) on their feature "A Look at 1,000 Who Died" (registration is free, but required).

You can view the casualty list by last name, branch, date of death, home town, home state, gender, age, type of death, and "other." On the "Other" page, they could use some contrast. Black type on brown background is hard to read.

Here's another great use of Flash. Once again, it's from the New York Times:

Budget Puzzle: You Fix the Budget

Today, you’re in charge of the nation’s finances. Some of your options have more short-term savings and some have more long-term savings. When you have closed the budget gaps for both 2015 and 2030, you are done.

When Flash works, the results are powerful.

Flash Today

I've been against Flash since before it was cool, but my complaints are about using it inappropriately. Today there is an ethnic cleansing-like movement to eliminate Flash from the face of the earth, and the movement's leader was Apple's Steve Jobs. There are many technical issues with Flash, like overheating CPUs, but the biggest force against Flash is HTML 5 and the web standards people who hate everything proprietary unless it comes from Apple. Going Straight: How To Ditch Flash and Embrace the Future of the Web has links to a lot of the background about Flash's weaknesses as well as tips on how to go Flash-free.

back to top

14. Misunderstanding The Use Of Graphics

Graphic mistakes make the list because they keep showing up again and again.

As with Flash, there are many ways to misuse graphics. I'm amazed by the number of sites with ugly graphics and the number that still use animated GIFs.

Speaking of animated GIFs, I think the winner of the "Holy Mother of God Can We Cram More Animated GIFs On A Web Page?" is the Haiti News Network or American Beauty Border Collies.

Graphical Misuse / Misunderstandings

There are only about a trillion ways to misuse graphics and images on your website. The following checklist lists some of the ways you can misuse graphics. Oh. If you check any of the items…well, you know what it means.

Our site changes the WIDTH and HEIGHT attributes of the IMG tag to a smaller size rather than scale the image down to the proper size.

In this YouTube video, Kishwaukee College demonstrates the problem.

This type of scaling is different from what's used in responsive web design, where the height and width attributes are eliminated so the image can scale up or down.
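The right fix is to scale the image file itself and serve it at its display size. A minimal sketch, assuming the Pillow imaging library (the file names and target size here are hypothetical):

from PIL import Image

# Resize the file itself instead of letting the browser shrink a
# full-size download with WIDTH and HEIGHT attributes.
img = Image.open("photo.jpg")
img.thumbnail((200, 150))   # scales down in place, preserving aspect ratio
img.save("photo-200x150.jpg", quality=85)

The page then references photo-200x150.jpg at its true dimensions, and visitors never download pixels they won't see.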

Graphics are only used when they are important. Remember: if visitors see a graphic, their tendency is to click it.

The most important fact I learned from using heatmaps.

If you have a graphic near your content, somebody will click the graphic.

This graphic from the old home page was problematic. As you can see, 123 folks thought this graphic was a link. It’s not, and I frustrated 123 visitors.

The graphic is a problem because it wasn't tied to a particular article on the site.

I assumed people wouldn't click, but they did click. I immediately went to the other graphics on the page and made them links. This fact was worth the price of the service (CrazyEgg).

The article Improving the User Experience (and conversion rate) with Heat Maps also reached similar conclusions.

We don't specify the exact size of a graphic using the HEIGHT and WIDTH attributes. For example, the code for the image above is:

<img src="http://cdn-webpagesthatsuck.com/does-my-web-site-suck-v2.jpg" alt="Does my web site suck" width="41" height="80" />

When you specify the height and width of the graphic, the browser can lay out the page before the image finishes downloading, so the page displays faster and doesn't reflow.

NOTE: If you're using responsive web design techniques, this dictum doesn't apply. You delete the height= and width= attributes because you want the image to scale.

Graphic symbols are not logical, don't look appropriate and are stupid like this animated shopping cart.

Note: It's preferable that a symbol have a text label to identify it, as this capture of Amazon.com in the United States demonstrates.

Also, make sure your shopping cart graphic is named correctly depending on the conventions of the country. In the UK, a shopping cart is called a "basket," as this screen capture of Amazon.co.uk illustrates.

back to top

15. Mystical belief in the power of web standards, usability, tableless CSS and HTML 5.

There is nothing wrong with Web Standards, usability, tableless CSS and HTML 5 except they're being touted by…guess who?…people who offer web design services specializing in…guess what?…Web Standards, usability, tableless CSS and HTML 5.

These are simply tools. Remember, nobody gets excited about the tools used to build a house ("Please tell me what brand of hammers you used!"). People get excited about how the house looks and performs.

Yes, Web Standards can make your site search engine friendlier, reduce bandwidth, etc.

The article 9 Ways to Misunderstand Web Standards provides interesting insights into the problem. Speaking of problems, the text doesn't show up in IE6 (this could also be caused by the ad-blocking software I'm using). Works fine in Firefox. Hmm.

back to top

16. JavaScript

This is the most controversial of my selections because JavaScript is more often used for good than evil.

The Number One problem with JS is security. Many of the browser exploits — especially with Internet Explorer — can be traced to JavaScript. Microsoft's solution to many security holes is to "turn off scripting" (JavaScript).

The second problem is that JS bloats your page. My home page goes from 31,803 bytes to 71,488 because of the JavaScript I've added to track visitors and what they do. Then there's the issue of all those ads on this site. They're run by JS. No ads, no money. These benefits are what make JavaScript so wonderful, but they come at a cost. I know, I know. I'm getting a lot of benefit out of JS, which brings me to problem three.

The third problem is that JavaScript, until recently, has been a browser resource hog that makes pages take longer to load. Fortunately, the browser makers are making inroads: Microsoft, Firefox, Google, Opera and Safari have all rewritten their JavaScript engines.

On the other hand, when it comes to apps — and mobile apps are the future — JavaScript sucks because it tends to make them run slowly. The article Why mobile web apps are slow is incredibly important to understanding the issues with JavaScript.

The fourth problem is all the widgets in use today. They all run on JavaScript, and most are poorly written and delay the loading of web pages.

The fifth problem is JS conflicts. So many JS programs insist on being the last item before the closing BODY tag. Who wins? Does it make a difference? Then there's the problem of other JS programs in the HEAD of the page. Most of us don't write JS, so we copy and paste snippets from scripting sites. I have had conflicts, and it takes forever to find the problem when you're not the programmer.

back to top

JavaScript is in fact a trademark owned by Oracle

19 min 14 sec ago

Ex-Employees Say Home Depot Left Data Vulnerable

19 min 14 sec ago

Photo: The data breach at Home Depot compromised the credit cards of 56 million of its customers. Above, a store in Seminole, Fla. Credit: Joe Raedle/Getty Images

The risks were clear to computer experts inside Home Depot: The home improvement chain, they warned for years, might be easy prey for hackers.

But despite alarms as far back as 2008, Home Depot was slow to raise its defenses, according to former employees. On Thursday, the company confirmed what many had feared: The biggest data breach in retailing history had compromised 56 million of its customers’ credit cards. The data has popped up on black markets and, by one estimate, could be used to make $3 billion in illegal purchases.

Yet long before the attack came to light this month, Home Depot’s handling of its computer security was a record of missteps, the former employees said. Interviews with former members of the company’s cybersecurity team — who spoke on the condition they not be named, because they still work in the industry — suggest the company was slow to respond to early threats and only belatedly took action.

In recent years, Home Depot relied on outdated software to protect its network and scanned systems that handled customer information irregularly, those people said. Some members of its security team left as managers dismissed their concerns. Others wondered how Home Depot met industry standards for protecting customer data. One went so far as to warn friends to use cash, rather than credit cards, at the company’s stores.

Then, in 2012, Home Depot hired a computer engineer to help oversee security at its 2,200 stores. But this year, as hacks struck other retailers, that engineer was sentenced to four years in prison for deliberately disabling computers at the company where he previously worked.

Company officials said the malware used against Home Depot had not been seen before and would have been difficult to detect. Home Depot said on Thursday that it had patched the holes and that it was now safe for customers to shop there.

Any card used there between April and Sept. 2 might be vulnerable to being used fraudulently. Stephen Holmes, a Home Depot spokesman, said the company improved its security this year by encrypting register systems and switching to a new smart-chip-based payment standard in all stores.

“Our guiding principle is to do what’s right by our customers,” Mr. Holmes said. The company maintains “robust security systems,” he said.

Thefts like the one that hit Home Depot — and an ever-growing list of merchants including Albertsons, UPS, Goodwill Industries and Neiman Marcus — are the “new normal,” according to security experts. These people say retailers have not only been complacent about security, they have also been reluctant to share information with one another.

Government officials estimate that as many as 1,000 retailers have been infiltrated by variations of the malware that first struck another big retailer, Target, late last year, and then Home Depot this year. They say many companies do not even know they have been breached.

“This is happening to so many companies now, it is getting hard to keep track,” said Paul Kocher, the founder and chief scientist at the Cryptography Research division of Rambus.

Still, security experts were flabbergasted that Home Depot, one of the world’s largest retailers, was caught so flat-footed after the breach at Target, which resulted in the theft of data on more than 40 million cards before the holiday season.

After the Target theft, Home Depot’s chief executive, Frank Blake, assembled a team to determine how to protect the company’s network from a similar attack, said one person briefed on the project. In January, Home Depot brought in experts from Voltage Security, a data security company in California, these people said. By April, the company had started introducing enhanced encryption in some of its stores, scrambling payment information the moment a card was swiped.

But criminals were already deep in Home Depot’s systems. By the time the company learned on Sept. 2 from banks and law enforcement that it had been breached, hackers had been stealing millions of customers’ card information, unnoticed for months. The rollout of the company’s new encryption was not completed until last week.

The retail industry is rushing to respond by forming threat-sharing associations — Home Depot was a founding member of one such group created earlier this year — and adopting new encryption and payment systems they hope will thwart hackers.

But getting those efforts up and running could take months.

Several people who have worked in Home Depot’s security group in recent years said managers failed to take such threats as seriously as they should have. They said managers relied on outdated Symantec antivirus software from 2007 and did not continuously monitor the network for unusual behavior, such as a strange server talking to its checkout registers.

Also, the company performed vulnerability scans irregularly on the dozen or so computer systems inside its stores and often scanned only a small number of stores. Credit card industry security rules require large retailers like Home Depot to conduct such scans at least once a quarter, using technologies approved by the Payment Card Industry Security Standards Council, which develops technical requirements for its members’ data security programs. The P.C.I. Council requires that approved, third-party quality security assessors perform routine tests to ensure that merchants are compliant.

And yet, two former employees said, while Home Depot data centers in Austin, Tex., and Atlanta were scanned, more than a dozen systems handling customer information were not assessed and were off limits to much of the security staff. A spokeswoman for the P.C.I. Council in Wakefield, Mass., declined to comment on Home Depot specifically.

“Scanning is the easiest part of compliance,” said Avivah Litan, a cybersecurity analyst at Gartner, a research firm. “There are a lot of services that do this. They hardly cost any money. And they can be run cheaply from the cloud.”

Home Depot said the industry standards included an exception from scanning store systems that are separated from larger corporate networks, and it said the company had complied with P.C.I. standards since 2009.

In 2012, Home Depot hired Ricky Joe Mitchell, a security engineer, who was swiftly promoted under Jeff Mitchell, a senior director of information technology security, to a job in which he oversaw security systems at Home Depot’s stores. (The men are not related.)

But Ricky Joe Mitchell did not last long at Home Depot. Before joining the company, he had been fired by EnerVest Operating, an oil and gas company, and on his way out he disabled EnerVest’s computers for a month. In April, he was sentenced to four years in federal prison.

Several former Home Depot employees said they were not surprised the company had been hacked. They said that over the years, when they sought new software and training, managers came back with the same response: “We sell hammers.”

A version of this article appears in print on September 20, 2014, on page A1 of the New York edition with the headline: Ex-Employees Say Home Depot Left Data Vulnerable.


Keyless SSL: The Nitty Gritty Technical Details

19 September 2014 - 7:00am

Microsoft OCR Library for Windows Runtime

19 September 2014 - 7:00am

This blog post was written by Jelena Mojasevic, a Program Manager at Microsoft.

We are pleased to announce that the Microsoft OCR Library for Windows Runtime has been released as a NuGet package. The library empowers you to easily add text recognition capabilities to your Windows Phone 8/8.1 and Windows 8.1 Store apps.

OCR technology enables various scenarios: copying text from images, text search in images, translation, and more. The library was designed with flexibility and performance in mind, allowing OCR over a wide variety of image types and incorporating numerous performance optimizations. It runs entirely on the client, supports 21 languages and can process images from various sources: camera, local file or network.

To get started, open your app project in Visual Studio and select PROJECT | Manage NuGet Packages, then search for Microsoft OCR. The documentation is available at MSDN in the same format as for the Platform API. The library is free, and there will be no fees for runtime licenses of commercial applications developed with it.

The OCR Library extracts text and layout information from the image. When you add the OCR Library to an application, you control how your application interprets the returned text, for example you can recognize patterns such as email addresses, phone numbers, and URLs, and your app can launch common actions such as sending an email, making a phone call, or visiting a web site.

It’s really simple to run OCR on an image. The RecognizeAsync method’s arguments are the image dimensions and a byte array of image pixels in BGRA format, and the extracted text and layout info are contained within the returned OcrResult.
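The post’s original code snippets were embedded as images that did not survive syndication, so here is a rough sketch of the calls described above. It assumes the WindowsPreview.Media.Ocr namespace from the NuGet package and a WriteableBitmap named bitmap as the image source, and should be read as illustrative rather than as the exact sample code:

// Sketch only; assumes the WindowsPreview.Media.Ocr namespace from the
// Microsoft OCR NuGet package, running inside an async method.
using System.Linq;
using System.Runtime.InteropServices.WindowsRuntime; // for PixelBuffer.ToArray()
using WindowsPreview.Media.Ocr;

OcrEngine ocrEngine = new OcrEngine(OcrLanguage.English);

// bitmap is a WriteableBitmap loaded earlier from camera, file or network.
byte[] pixels = bitmap.PixelBuffer.ToArray(); // BGRA pixel bytes

// RecognizeAsync takes the image height, width and the BGRA byte array.
OcrResult result = await ocrEngine.RecognizeAsync(
    (uint)bitmap.PixelHeight, (uint)bitmap.PixelWidth, pixels);

// OcrResult exposes recognized lines and words along with their layout.
foreach (OcrLine line in result.Lines)
{
    string lineText = string.Join(" ", line.Words.Select(word => word.Text));
    // Each word also carries a bounding box, useful for layout-aware features.
}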

For more info about the OCR Library, visit the MSDN page and download the OCR Library sample app. See the NuGet documentation for all the ways you can download and install the NuGet package in your project.

If you’d like to share your feedback with the Microsoft OCR team, please send an email.

To Get More Out of Science, Show the Rejected Research

19 September 2014 - 7:00am


Photo credit: Carl Wiens

In 2013, the federal government spent over $30 billion to support basic scientific research. These funds help create knowledge and stimulate greater productivity and commercial activity, but could we get an even better return on our investment?

The problem is that the research conducted using federal funds is driven — and distorted — by the academic publishing model. The intense competition for space in top journals creates strong pressures for novel, statistically significant effects. As a result, studies that do not turn out as planned or find no evidence of effects claimed in previous research often go unpublished, even though their findings can be important and informative.

For instance, a top psychology journal refused to consider studies that failed to replicate a disputed publication claiming to find evidence of extrasensory perception. In addition, the findings that do get published in these journals often just barely reach the statistical significance thresholds required for publication — a pattern that suggests selective reporting and publishing of results. Not surprisingly, other scientists often cannot reproduce published findings, which undermines trust in research and wastes huge amounts of time and money. These practices also create a shaky knowledge base for science, preventing scholars from effectively building on prior research.

This pattern of publication bias and failed replications, which is drawing attention in fields from psychology to medicine, has prompted great alarm within the scientific community. Now there are signs that these concerns have spread to policy makers. The Obama administration has asked for public comment on how the federal government can “leverage its role as a significant funder of scientific research to most effectively address” the replication crisis in science — a question that should be carefully considered given the evidence that current policies are not working.

One approach is to require researchers to share data, especially from studies conducted with public support. For instance, the National Institutes of Health and the National Science Foundation already require grantees to share data from their research. These sorts of requirements encourage transparency but, even if widely adopted, are unlikely to appreciably reduce bias in which studies are actually published.


Others advocate requiring the registration of trials before data has been collected. For instance, some social scientists have voluntarily begun to preregister analysis plans for experiments to minimize concerns about selective reporting. Unfortunately, the demand for statistically significant results is still likely to create publication bias. For example, federal law and journal policies now require registration of clinical trials, but publishing of trial results has been found to be selective, to frequently deviate from protocols and to emphasize significant results. Access to trial data could be increased, but such a step is again unlikely to change which studies are published in the most influential journals.

Instead, my colleagues and I propose a radically different publishing model: Ask journal editors and scientific peers to review study designs and analysis plans and commit to publish the results if the study is conducted and reported in a professional manner (which will be ensured by a second round of peer review).

This procedure encourages authors and reviewers to develop the strongest possible designs — including those that replicate previously published studies — and eliminates perverse incentives to find or emphasize significant results after the fact. A new scientific format called Registered Reports using this approach has already been adopted at numerous journals across the social and natural sciences.

In a new white paper, I propose that the American Political Science Association offer options for articles in a Registered Reports-style format. Researchers in other academic disciplines and scientific associations are starting to do the same.

Unfortunately, overcoming the inertia of the current system will be difficult, which is why altering the incentives created by federal science policy is so important.

Scientists would change their ways much more rapidly if federal funding encouraged publishing in journals that used Registered Reports or other formats intended to minimize publication bias. Conversely, journals would be more likely to change their policies if it would help them attract research from top scientists. Appropriately enough, the best way to encourage scientific innovation might be to rethink how we organize the scientific enterprise itself.

Brendan Nyhan is an assistant professor of government at Dartmouth College. Follow him on Twitter at @BrendanNyhan.

The Upshot provides news, analysis and graphics about politics, policy and everyday life. Follow us on Facebook and Twitter.


Particle detector finds hints of dark matter in space

19 September 2014 - 7:00am

Researchers at MIT’s Laboratory for Nuclear Science have released new measurements that promise to shed light on the origin of dark matter.

The MIT group leads an international collaboration of scientists that analyzed two and a half years’ worth of data taken by the Alpha Magnetic Spectrometer (AMS) — a large particle detector mounted on the exterior of the International Space Station — that captures incoming cosmic rays from all over the galaxy.

Among 41 billion cosmic ray events — instances of cosmic particles entering the detector — the researchers identified 10 million electrons and positrons (the positron is the electron’s stable antiparticle). Positrons exist in relatively small numbers within the cosmic ray flux.

An excess of these particles has been observed by previous experiments — suggesting that they may not originate from cosmic rays, but come instead from a new source. In 2013, the AMS collaboration, for the first time, accurately measured the onset of this excess.

The new AMS results may ultimately help scientists home in on the origin and features of dark matter — whose collisions may give rise to positrons.

The team reports the observed positron fraction — the ratio of the number of positrons to the combined number of positrons and electrons — within a wider energy range than previously reported. From the data, the researchers observed that this positron fraction increases quickly at low energies, after which it slows and eventually levels off at much higher energies.

The team reports that this is the first experimental observation of the positron fraction maximum — at 243 to 307 gigaelectronvolts (GeV) — after half a century of cosmic ray experiments.

“The new AMS results show unambiguously that a new source of positrons is active in the galaxy,” says Paolo Zuccon, an assistant professor of physics at MIT. “We do not know yet if these positrons are coming from dark matter collisions, or from astrophysical sources such as pulsars. But measurements are underway by AMS that may discriminate between the two hypotheses.”

The new measurements, Zuccon adds, are compatible with a dark matter particle with mass on the order of 1 teraelectronvolt (TeV) — about 1,000 times the mass of a proton.

Zuccon and his colleagues, including AMS’s principal investigator, Samuel Ting, the Thomas D. Cabot Professor of Physics at MIT, detail their results in two papers published today in the journal Physical Review Letters and in a third, forthcoming publication.

Catching a galactic stream

Nearly 85 percent of the matter in the universe is dark matter — matter that somehow does not emit or reflect light, and is therefore invisible to modern telescopes. For decades, astronomers have observed only the effects of dark matter, in the form of mysterious gravitational forces that seem to hold together clusters of galaxies that would otherwise fly apart. Such observations eventually led to the theory of an invisible, stabilizing source of gravitational mass, or dark matter.

The AMS experiment aboard the International Space Station aims to identify the origins of dark matter. The detector takes in a constant flux of cosmic rays, which Zuccon describes as “streams of the universe that bring with them everything they can catch around the galaxy.”

Presumably, this cosmic stream includes leftovers from the violent collisions between dark matter particles.

According to theoretical predictions, when two dark matter particles collide, they annihilate, releasing a certain amount of energy that depends on the mass of the original particles. When the particles annihilate, they produce ordinary particles that eventually decay into stable particles, including electrons, protons, antiprotons, and positrons.

As the visible matter in the universe consists of protons and electrons, the researchers reasoned that the contribution of these same particles from dark matter collisions would be negligible. However, positrons and antiprotons are much rarer in the universe; any detection of these particles above the very small expected background would likely come from a new source. The features of this excess — and in particular its onset, maximum position, and offset — will help scientists determine whether positrons arise from astrophysical sources such as pulsars, or from dark matter.

Collecting data continuously since 2011, the AMS team has analyzed 41 billion incoming particles and identified 10 million positrons and electrons with energies ranging from 0.5 to 500 GeV — a wider energy range than previously measured.

The researchers studied the positron fraction versus energy, and found an excess of positrons starting at lower energies (8 GeV), suggesting a source for the particles other than the cosmic rays themselves. The positron fraction then slowed and peaked at 275 GeV, indicating that the data may be compatible with a dark matter source of positrons.

“Dark matter is there,” Zuccon says. “We just don’t know what it is. AMS has the possibility to shine a light on its features. We see some hint now, and it is within our possibility to say if that hint is true.”

If it turns out that the AMS results are due to dark matter, the experiment could establish that dark matter is a new kind of particle, says Barry Barish, a professor emeritus of physics and high-energy physics at the California Institute of Technology.

“The new phenomena could be evidence for the long-sought dark matter in the universe, or it could be due to some other equally exciting new science,” says Barish, who was not involved in the experiments. “In either case, the observation in itself is what is exciting; the scientific explanation will come with further experimentation.”

This research was funded in part by the U.S. Department of Energy.

A Long, Ugly Year of Depression That’s Finally Fading

19 September 2014 - 7:00am

Date / sep 19, 2014 / Category / Moz, Personal, Psychology, Startups

Yesterday morning I woke up early to speak at the Business of Software conference in Boston. It was my first time there, and it’s an exceptional group. Then, after some meetings, I spoke in the afternoon at Hubspot’s Inbound conference (thankfully just across a long skybridge that connects Boston’s World Trade Center from its Convention Center). This morning, I woke up even earlier and caught a plane to Denver, where I presented on a panel with Moz’s investor, Brad Feld as well as Ben Huh of Cheezburger and Bart Lorang of FullContact at Denver’s Startup Week. Tomorrow, I’ll be reprising my role of organizer for Foundry’s CEO Summit (despite no longer being CEO).

Right now, I’m supposed to be at a dinner for Denver Startup Week, but I’m all people’d out. I need to be an introvert. And I need to get back to blogging. And I really need to tell this story.


via Maria Louiza Biri on Behance

These past few days, I’ve met dozens of new people and talked to lots of old friends and colleagues, too. The exchanges always start the same: “How are you doing?” I get a pang of dread every time I hear it.

That’s because, when I’m asked, I try, most of the time, to be honest. I shrug my shoulders, look at the ground, and give a half-hearted smile alongside a verbal response like “not too bad,” or “could be worse,” or “I’ve been better,” or “eh, kinda sucky.” Compared to the 13 months between May of 2013 and June of 2014, I’m a lot better. I was seriously depressed for a long time, and a lot of people close to me knew it even when I didn’t know myself. I wrote about that some in my post Can’t Sleep; Caught in the Loop and I talked about it a little in my Dichotomy of 2013 post, too, but let’s call a spade a spade. Rand had depression.

Depression is why, in the last 18 months, I blogged the least I have since starting my career. It’s almost certainly part of why my engagement in all sorts of things has suffered. It’s why I could hardly bear to be away from Geraldine for more than a day or two at a time. And, it’s a big part of why I stepped down as Moz’s CEO.

That depression, I believe, stems from shame. I was, and remain, ashamed of myself due to a series of wholly avoidable missteps I made at Moz. Brad likes to say that every company he and the Foundry team have ever invested in goes through tough times, and most have near-death experiences before they emerge into their mature form. Even with my darkest lenses on, I can’t say that Moz was ever “near-death” in the past couple of years. But I can say that we – that I – made dumb move after dumb move, and when those moves revealed themselves to be dumb, I pushed us to double down on them, compounding our problems. By sharing that story I hope to, as I’ve done in posts like this before, offload my pain through transparency and, hopefully, help save a few of you reading this post from making the same mistakes.

A Wrong Assumption

In 2011, Adam Feldstein, Moz’s Chief Product Officer, and I got together in a room and whiteboarded a series of goals and wireframes that we collectively dubbed “Moz Analytics.” It was to be the next step in the evolution of Moz’s software suite, and a successor to our old “SEOmoz Pro” campaign product that tracked keyword rankings, crawl stats, links, and competitive data. When we sat down with Brad and Foundry in 2012, Moz Analytics was already underway, though we knew the project was understaffed given the seemingly massive task at hand. Our pitch was simple – over the next 5 years, we felt that marketers from many different backgrounds would be sharing the workload around efforts like SEO, social media, content marketing, online branding, competitive tracking, link building, etc. Basically – the disciplines that fall under “inbound marketing” and sit at the top of the customer acquisition funnel would all come together and be run by the same person (or people).

These marketers, we reasoned, would need a comprehensive suite in which they could manage all their SEO and social and content and brand marketing efforts in one place. We wanted to raise money to build that suite and grow our customer base beyond just “SEO” to these other disciplines, too.

My memory’s imperfect, but as I recall, in those early days, nearly everyone at Moz and at Foundry was on the same page – the idea made sense, and Moz was a company that could get this done.

The Worst Possible Way to Build Software

When the Foundry investment closed, we redoubled our efforts to build Moz Analytics. We hired more aggressively (and briefly had a $12,000 referral bonus for engineers that ended up bringing in mostly the wrong kinds of candidates, along with creating some internal culture issues), and spent months planning the fine details of the product. That product planning led to an immense series of wireframes and comps (visual designs of what the product would look like and how it would function) that numbered into the hundreds of screens. Our engineers estimated 6-9 months of work, with a launch date in July of 2012.

That July date came and went pretty fast. I was frustrated to find out that, based on our speed, October was more realistic, but I figured we had time, and we could accelerate some of our innovation through acquisition. We spent a bit of our new VC money on Followerwonk and, later, GetListed (now Moz Local), along with AudienceWise.

Guess what? The date slipped again. And again. And again. From October to November to January (2013) to March (2013) to May (2013) and finally to October 2013 when the actual release occurred.

Many times, someone from our engineering team sat down with me during a 1:1 and told me that we were building software in the worst possible way, and that big bang releases are incredibly tricky to pull off, and always take much longer than expected, especially when you have an existing userbase to migrate. Anthony (our CTO) suggested numerous times that we instead build up to the features we eventually wanted on the existing SEOmoz Pro platform, but I felt that the marketing launch could never be as successful and couldn’t coincide with our rebrand – something that had been part of the plan all along. Moz Analytics and the shift from SEOmoz to Moz were always meant to go together, I reasoned, and while the delay hurt us, we were still growing, and tons of folks were still enjoying and getting value from our existing software. So long as we could release this amazing behemoth in the next 60 days (and I always thought we could until the last date push from May to October), I figured it was worth the wait.

What made me so foolish? Why was I so bullheaded? The answer’s obvious — I was arrogant. We had pulled off big releases in the past that everyone doubted – first with Linkscape (our web index, that no one believed we could build in 9 months on a $1mm budget), then with Open Site Explorer (which took 9 months as well, but only launched 1 month late), and then with SEOmoz Pro (which also launched a month late and with some serious bugginess for the first few months thereafter). I thought this would be another launch like those – a little late, but worth the wait.

In April of 2013, we had a very rough board meeting and a series of rough executive team discussions. We ended up replacing the engineering lead for Moz Analytics (who, humbly and awesomely, stayed with Moz another full year working as an individual contributor on the team) and after 2 weeks, Dudley, who took over, informed us that the earliest he thought the team could deliver the completed Moz Analytics would be in October of that year, moving out the expected timeframe from what we thought was 30-60 days away, to almost 6 more months. Worse still, we could only hit that date if we removed almost a dozen big features, including the entire “Content” section of the app – one of the most important, requested, and valuable pieces.

I could barely believe it, but had to accept reality. The big bang strategy had failed. The new, more iterative approach on the work to date would have to suffice. Building on top of the old codebase was no longer even an option. It was a shitty, shitty time, and when we announced the new schedule to the Moz team, I think morale at the company hit an all-time low.

Below is an email I sent to the team in May, just before we’d been scheduled to launch Moz Analytics and Moz.com together. It’s overly dramatic by a long shot (the risk of things going bad for the company was massively overstated, both in my own head and in this email), but it may lend a sense of how I felt after more than a year of thinking launch was happening any day now (please also forgive the bad Lord of the Rings analogy, based on some prior emails about the launch)…

BTW – we never came close to having to worry about a layoffs situation, though even bringing up the word in an email (even just to say “we’re doing stuff to prevent having to worry about that”) created a lot of fear. It would take months of repeating “no layoffs” before I undid the harm that word in this email caused. Hopefully that can be one lesson you take away – “layoffs” is a Pandora’s Box-type word at a startup. Don’t use it unless you’re really being transparent (and not just fearful and overly panicked as I was).

The Rebrand that Worked

One of the decisions we made in May of 2013, after Dudley informed us of the new launch date for Moz Analytics, was to decouple the rebrand from the product launch. That was probably a smart move (and one we likely should have done much earlier). Our marketing and inbound engineering team (who handled the entire redesign of the website and the new signup funnel) executed brilliantly. I’ve still never seen a more exciting week for Moz than the week we announced the rebrand and started gathering signups for the new product. The positivity, joy, and outpouring of support from our community gave us all a renewed energy. I recall collecting the team all in one very full room (the lobby of the old Mozplex) and announcing metrics from the day’s work. Tens of thousands of new folks had already signed up to get beta access to Moz Analytics. Our site had its highest visit day ever. We received more press than we’ve ever gotten. I just looked – I had more than 350 messages of congratulations and outreach in my inbox in a 24 hour period.

Over the next few months, the rebrand continued to perform mostly positively. One downside was a slight loss in search traffic while the engines caught up with all the redirects; another was how we’d chosen to manage our conversion funnel. Because we wanted our audience focused on the new product and not too attached to the old SEOmoz Pro, we first showed off Moz Analytics, collected an email address for the beta list signup and only then let folks sign up for the old product if they still wanted it. That meant slightly less growth and slightly higher churn from those cohorts (likely because they anticipated something bigger and better to come).

At the end of September, more than 90,000 people had expressed interest in Moz Analytics. Even though we were tired, frustrated by our past mistakes, and reticent to get too excited, we felt a bit hopeful. Even with low conversion rates (we estimated as little as 5% of an opt-in list that specifically signed up for a beta product), it could be a huge bump for the business after a slow summer.

The Software Launch That Didn’t Work

When Moz Analytics finally did launch, it wasn’t pretty. Dudley and the engineers had told us to expect a buggy, rushed product, and unfortunately, that’s what happened. In addition to the bugs, we got a lot of complaints about UI & UX, about missing features, and about the relatively low value add that all the new things we’d built into the platform provided. This wasn’t from everyone – there was plenty of praise too – but it was heavy enough that, in the months that followed, we’d struggle to stay at a flat trajectory (not where a startup that’s raised lots of VC money should be).

Below is a slide deck I shared at the last All-hands company meeting of my CEO tenure at Moz:

Historical Slide Deck from Moz’s Nov. 2013 Allhands Meeting from Rand Fishkin

I thought it would be a transparent, authentic way to share how things were going at the time and the journey we had ahead. Unfortunately, it didn’t go over that way and I received a lot of concerned feedback, in particular from Matt Brown and Sarah Bird, that the tone was too negative, too harsh, and, while it might be representative of my truth, wasn’t representative of reality. The depression was playing a part in that, but I didn’t understand it, yet. That “Loop” that was playing in my head over and over every night, costing me sleep, eating at my happiness and resolve, was also warping reality. I couldn’t see the forest or the trees – just the swamp.

Distractions that Took Us Off Course

Moz Analytics wasn’t merely a hard, painful project that had a lot of early stumbling points; it was also a distraction from a lot of the important work we could have been doing instead to help our customers. The Mozbar languished. Open Site Explorer languished. Mozscape languished (despite a tremendous amount of effort that, hopefully, will be paying off soon and has been built much more iteratively). Our other research tools languished. Without attention from our team, some of these products came to be eclipsed by competitors who focused exclusively on one particular piece of our subscription. It was heartbreaking to hear over and over again that one of my SEO friends had stopped using some particular Moz tool because they’d found tool X to take its place.

Had we been listening to our customers, iterating on the projects and products that mattered to them, and not consuming all of our development time and energy on a long-delayed, poorly launched megasuite that did lots of things they didn’t need, we’d have been in a much different place at the end of 2013.

Learning our Lessons

Today, 9.5 months into 2014, we’ve been making those investments, and they’ve paid off. August was our best month since June of 2013. Churn rate is down. Monthly signups are up. The new Mozbar is beloved. Fresh Web Explorer’s alerts have been a hit. Keyword Difficulty & SERPs Analysis got some upgrades.  The new features in Moz Analytics like better onboarding, estimates for keyword referral data and the new action dashboard have increased rates of usage, vesting, and overall customer satisfaction. Due to the nature of SaaS businesses, 2014 will be a rough-looking year. We’ll probably grow 10% or less in overall revenue. But 2015 looks promising, even to a skeptic like me. A recent analysis by Scott Krager on SERPs showed us comparing favorably on some metrics to Hubspot, who just filed to go public.

That said, I am far from satisfied. The content section that was supposed to come out with the first release of Moz Analytics still hasn’t launched – in fact, dev work on it still hasn’t begun (that’s how much catch-up work we’ve needed to do). Years of effort into a new version of the Mozscape index are still a quarter or two away. There’s a new Open Site Explorer coming soon, but it’s not out yet. But, I think our biggest lag has been one of leadership — and one that’s now rectified.

Not being CEO has some good things about it, but it has some big frustrations too. I’ll share those in a future post. I don’t believe we would have made this change when we did if it weren’t necessary. I really wish that the CEO change could have come at a different time – a time more like now rather than like how things were in January. But I think maybe it took new leadership to help us turn things around (and that’s very hard for me to say, because I desperately want to believe that I could have led us here, too).

A Hopeful, But Different Future

People say it takes a big person to admit mistakes, and admit that they don’t have what it takes to lead. I don’t feel very big. In fact, I feel like what my Dad always told me I was – a high potential, low achiever kinda kid. High potential – I could have made those right decisions, or at least not invested for so long when the writing was on the wall. But instead, I’ve caused us to underachieve against our possibilities. And despite that, Mozzers have continued to support me. Our community has continued to support me. Brad and Michelle have continued to support me.

In July of 2013, I held my performance review with Sarah. It’s a tradition we’ve had, to review one another, and in thinking about the past and writing this post, I dug through my inbox to see what big things were happening that frustrating summer. I feel like this review (which I’ve excerpted just a few bits from) is telling:

We hadn’t yet talked about a potential move for me out of the CEO role, but there, at the end of one of the sections, in my own writing, are the words “If this company is to survive and grow, it needs a better leader.”

I feel hopeful because today, we have a better leader. I feel hopeful because she’s not an outsider with a “great resume,” installed by investors, but someone who’s been through all the same pains and struggles and hard lessons-learned that the rest of us have, and unlike me, she doesn’t overly internalize or overly dramatize bad results. As she’s expressed to me many times – hard problems get her out of bed in the morning. She loves working on them. I wish that I could have been the person to get us through some of these hard problems, and make the road, especially the first 6 months of this year, a little less hard for Sarah.

But, most of all, more than our investors, more than myself or my family, more so than even the Moz team, I felt like I let down the tens of thousands of marketers who believed in Moz. In the depths of my depression, it wasn’t Foundry or Ignition or Sarah or any other Mozzer that I was hitting my head against the wall over – it was our community. I feel like I made a promise that I didn’t live up to these last few years. I’m so, so sorry. I promise to try harder and smarter.

Depression

Depressed Rand is weird. Don’t get me wrong, regular Rand is weird, too. But depressed Rand magnifies the bad 10X and minimizes the good. He refuses to even acknowledge good news and, because he’s a pretty smart guy, he can usually argue for why that good news is actually just temporary and will turn to shit any minute. The weird part is, I think depressed Rand is actually a very authentic version of myself. When I felt depressed, I upheld TAGFEE – particularly the values of transparency and authenticity – as the reasons why I could and should be such a raging, all-consuming, negative naysayer.

For months, I worked with CEO coaches, and then therapists, to try to disconnect my personal happiness from Moz’s company performance. I thought that was a critical goal I needed to achieve in order to get back to a better place. Eventually, this spring, I gave it up. I decided that in order to be happy, Moz had to do better, and I put more of my mental and emotional energy there, rather than lamenting a rewiring I couldn’t make my head perform.

The moment of change happened after a doctor’s visit, in a very weird, inexplicable way. A few months prior to the doctor’s visit, I had tried a chocolate truffle laced with THC (don’t panic, personal Marijuana use is legal in Washington State). I’ve had sciatica in my left leg for years and if you’ve hung out with me, you’ve probably seen me stand up uncomfortably or use a scarf or sweater as lumbar support to sit down. For 6 hours after I tried the pot truffle, I felt horrible – crazy paranoid and high and awful. Then, for the next 3 days, my leg didn’t hurt at all. For the first time in six years. I told my doctor about this, and she said “the funny thing is, Marijuana doesn’t have any pain-killing properties. It just lessens tension, anxiety, and stress for some people.”

Bam. Moment of epiphany. My pain is tied to anxiety and stress (probably) and since I’m not particularly excited about trying pot truffles again after those first 6 hours of hell, I need to work on those things in order to fix my leg. And not just my leg. There’s likely all sorts of maladies, mental and physical, that I might suffer in years ahead if I can’t get this under control. That night, I felt different. I looked at our daily numbers email in the morning, and I didn’t come up with some reason in my head why they were bad even though they looked good. Like the corn in Hyperbole & a Half’s famous comic, suddenly everything wasn’t hopeless bullshit.

Don’t worry if this doesn’t make sense to you. It doesn’t make sense to me, either. I know I fucked up. I know I let down a lot of people. I know I was depressed. And I know that right now, I’m OK. I could be worse. Some days are still kinda sucky, but not mega-sucky anymore. The future feels like it’s back to being a possibility, rather than just an area of thought I need to avoid if I want to sleep tonight. In fact, I think I’m going to sleep pretty well tonight.


Flip – A ReactJS Game

19 September 2014 - 7:00am

Building upon my experience with the ReactJS Slide Puzzle, I thought I'd take the concept of a React-based game one step further. Flip is not very practical, and uses a lot of potentially buggy CSS3 and ES5 methods, but I mean, it's just a game. Besides, it shows some of what's possible using React. React has been a real pleasure to use, despite some minor workflow issues related to JSX (as it matures, I imagine it will become easier to use).

If you're not using IE (sorry IE), try clicking on the game board to zoom in. The effect is thanks to Hakim El Hattab's zoom.js.

The full source is available on my GitHub. Click below to play!

Debian Security Advisory: DSA-3025-1 apt

17 September 2014 - 1:00pm

It was discovered that APT, the high level package manager, does not properly invalidate unauthenticated data (CVE-2014-0488), performs incorrect verification of 304 replies (CVE-2014-0487), does not perform the checksum check when the Acquire::GzipIndexes option is used (CVE-2014-0489) and does not properly perform validation for binary packages downloaded by the apt-get download command (CVE-2014-0490).

For the stable distribution (wheezy), these problems have been fixed in version 0.9.7.9+deb7u3.

For the unstable distribution (sid), these problems have been fixed in version 1.0.9.

We recommend that you upgrade your apt packages.
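For reference, the fix arrives through the normal package upgrade path. On a wheezy system, something like the following, run as root, should pull in the patched version; this is the generic procedure rather than anything specific to this advisory:

# Refresh the package lists, then upgrade apt itself to the fixed
# version (0.9.7.9+deb7u3 on wheezy):
apt-get update
apt-get install apt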

iOS 8, thoroughly reviewed

17 September 2014 - 1:00pm
iOS 8 is here, and it's a big deal. (Photo: Andrew Cunningham)

"Huge for developers. Massive for everyone else."

That was Apple's tagline for iOS 8 when the software was announced at the company's Worldwide Developers Conference back in June. Overuse of superlatives is a pet peeve of mine, but after using iOS 8 for a couple of months, I have to say that they're warranted in this case. iOS 7 was a comprehensive makeover for an operating system that needed to reclaim visual focus and consistency. iOS 7.1 improved stability and speed while addressing the new design's worst shortcomings and most egregious excesses. And iOS 8 is the update that turns its attention from the way everything looks to the way it works.

Just as iOS 6's look had begun to grow stale by the time 2013 rolled around (six years is a pretty good run, though), iOS' restrictions on third-party applications and UI customization now feel outdated. Sure, back in 2007, slow processors and small RAM banks required a strict, Spartan approach to what apps could do and the ways they could interact. But now, our smartphones and tablets have become powerful mini-computers in their own right. Competing platforms like Android, Windows, and Windows Phone have all demonstrated that it's possible to make these little gadgets more computer-y without tanking performance or battery life.

Apple still holds the keys to many aspects of the iPhone and iPad user experience, but compared to past versions of the software, iOS 8 represents an opening of the floodgates. Don't like Apple's software keyboard? Replace it. Want sports scores and updates on your eBay auctions in your Notification Center? Here's a widget, throw 'em in there. Want to use a social network or a cloud storage service that Apple hasn't explicitly blessed and baked into the OS? Cool. Here are some APIs for that.
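To give a flavor of how lightweight those extension points are, here is an illustrative Swift sketch (not Apple's sample code) of a bare-bones iOS 8 Today widget: a view controller in a widget extension that adopts the NCWidgetProviding protocol. The label and its contents are hypothetical placeholders.

import UIKit
import NotificationCenter

// Minimal sketch of a Today widget's view controller; the score text
// and update logic are made-up examples, not real API output.
class TodayViewController: UIViewController, NCWidgetProviding {

    let scoreLabel = UILabel()

    override func viewDidLoad() {
        super.viewDidLoad()
        scoreLabel.frame = view.bounds
        view.addSubview(scoreLabel)
    }

    func widgetPerformUpdateWithCompletionHandler(completionHandler: ((NCUpdateResult) -> Void)!) {
        // Refresh whatever the widget displays, then report back to the system.
        scoreLabel.text = "Latest score goes here" // hypothetical content
        completionHandler(NCUpdateResult.NewData)
    }
}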

We're going to give you a thorough rundown of iOS 8's new features today, but the most important thing to know about the software is that you're now in the driver's seat. Well, maybe not quite the driver's seat; there's still quite a bit you can't customize or change. But Apple is definitely letting you reach over and steer the car.

In this review, we'll be talking mostly about features available to all iOS 8 devices, and to hardware that you already have in your hands right now. Several software features in the new operating system are exclusive to the new iPhone 6 and iPhone 6 Plus, and discussion of those features will wait until we review those devices. Outlets that received pre-release hardware from Apple have already run their iPhone 6 reviews, but we'll be picking them up at retail along with everybody else—look for ours to go up sometime early next week.

Stuff Goes Bad: Erlang in Anger

17 September 2014 - 1:00pm
September 17, 2014

The Heroku Routing team does a lot of work with Erlang, both in terms of development and maintenance, to make sure the platform scales smoothly as it continues to grow.

Over time we've learned some hard-earned lessons about making systems that can scale with some amounts of reliability (or rather, we've definitely learned what doesn't work), and about what kind of operational work we may expect to have to do in anger.

This kind of knowledge usually remains embedded within the teams that develop it, and tends to die when individuals leave or change roles. When new members join the team, it gets transmitted informally, over incident simulations, code reviews, and other similar practices, but never in a really persistent manner.

For the past year or so, bit by bit, I've tried to capture the broad lines of this knowledge and put it into a manual, which we're proud to release today.

From the introduction:

This book intends to be a little guide about how to be the Erlang medic in a time of war. It is first and foremost a collection of tips and tricks to help understand where failures come from, and a dictionary of different code snippets and practices that helped developers debug production systems that were built in Erlang.

This is our attempt at bridging the gap between most tutorials, books, training sessions, and actually being able to operate, diagnose, and debug running systems once they've made it to production.

This manual adds to the Routing team's efforts to interact with the Erlang (and polyglot) community at large, sharing knowledge with teams from all over the place. It is available in PDF for free, under a Creative Commons License, at erlang-in-anger.com

It comes just in time for the Chicago Erlang conference, dedicated to real world applications in Erlang, where you'll be able to talk to a few members of Heroku's Routing team, and a bunch of regulars from the Erlang community.

We hope this will prove useful to the community!


Dremel Releases a Mass-Market 3D Printer

17 September 2014 - 1:00pm

With a combination of accessible features, smart packaging, and a $999 price point, it’s obvious that the Dremel 3D Idea Builder is a machine aimed squarely at the mass market.

The printer, announced today at MakerCon in New York City, is the first 3D printer to be released by a major tool manufacturer, and represents further maturation of at-home additive manufacturing. With initial sales being handled by traditional tool-sales outlets Home Depot, Amazon, and Canadian Tire, it promises to help expose 3D printing to a new range of users.

The Idea Builder itself is a single-extruder printer in a self-contained package. Lightweight with a solid feel, it has a sleek plastic exterior, a detachable blue lid, and two removable panels on each side (presumably in case things get too toasty inside). Its build area measures 230mm x 150mm x 140mm (9” x 5.9” x 5.5”), with a clever removable bed to simplify model extraction. There’s no heated platform, so this machine is PLA only.

The molded interior of the machine includes a recessed semi-cylindrical filament spool holder on the left side of the build platform. Placing the filament inside the machine reduces the footprint and gives the overall impression of fewer moving parts. The removable lid keeps loose hair and jewelry out of the build area during printing, but allows for easy access during filament loading, print removal and maintenance. The front of the machine has a clear Plexiglas door that snaps shut with a couple of tiny magnets.

The Idea Builder is a polished consumer product from the first touch, and the Dremel design team has clearly thought through the unboxing experience. Inside the packaging is a full color, easy-to-follow quick-start guide, two sheets of Dremel-branded BuildTak, and a black and white printed instruction manual. As you would expect from a company like Dremel, the manual is quite comprehensive and well organized. It’s especially gratifying to see a glossary of terms to explain 3D printing to a new audience of makers.

Dremel has a long history of working with partners around the world to manufacture their tools and products, so it’s no surprise that the Idea Builder was conceived in partnership with Chinese manufacturer Flashforge, makers of the popular “Creator” Replicator clones. The Idea Builder is based on the Flashforge Dreamer, whose electronics utilize an ARM Cortex-M4 processor instead of the ATmega chips used in the Flashforge Creator line. The company is also working with Autodesk to create design software options for their customers.

Dremel’s 3D printing software is similar to MakerWare or Cura in interface and features, and runs on both Mac and Windows. It displays a 3D rendering of the build volume, has options to move, rotate and scale parts, and will highlight areas that will need support material. The software lacks options to change print temperatures, infill percentages, add rafts, or use custom gcode files — handicaps Dremel purposefully implemented to help simplify the printing experience for new users. However, it’s been mentioned that these missing features may be added in future iterations of the software.

The slicing engine is fast and offers high, medium, and low-resolution settings that translate to 0.1mm, 0.2mm and 0.3mm layer heights and pre-assigned infill levels.

Although the Idea Builder software lacks advanced settings, it has succeeded in creating a streamlined, consumer-ready user experience and ecosystem. Dremel is tapping into its established Racine, Wisconsin-based customer service network for free customer support from “Dremel Experts” via a variety of platforms, including phone, Skype and email. And it is using its distribution clout to get the product into new channels.

With Dremel’s entry into 3D printing, it’s clear that the race to mainstream 3D printing is heating up. The Idea Builder’s attractive price point and features make it worth considering for a variety of user types. So how did this machine perform in our serious testing? We’ll be posting our full review in the next issue of Make: magazine, along with 25 other new 3D printers. And look for the release of our test models later this week so you can play along at home.


The Traveling Salesman with Simulated Annealing, R, and Shiny

17 September 2014 - 1:00pm

I built an interactive Shiny application that uses simulated annealing to solve the famous traveling salesman problem. You can play around with it to create and solve your own tours at the bottom of this post. Here’s an animation of the annealing process finding the shortest path through the 48 state capitals of the contiguous United States:

How does the simulated annealing process work?

We start by picking an arbitrary initial tour from the set of all valid tours. From that initial tour we “move around” and check random neighboring tours to see how good they are. There are so many valid tours that we won’t be able to test every possible solution, but a well-designed annealing process eventually reaches a solution that, if it is not the global optimum, is at least good enough. Here’s a step-by-step guide:

  1. Start with a random tour through the selected cities. Note that it’s probably a very inefficient tour!
  2. Pick a new candidate tour at random from all neighbors of the existing tour. This candidate tour might be better or worse compared to the existing tour (i.e. shorter or longer)
  3. If the candidate tour is better than the existing tour, accept it as the new tour
  4. If the candidate tour is worse than the existing tour, still maybe accept it, according to some probability. The probability of accepting an inferior tour is a function of how much longer the candidate is compared to the current tour, and the temperature of the annealing process. A higher temperature makes you more likely to accept an inferior tour
  5. Go back to step 2 and repeat many times, lowering the temperature a bit at each iteration, until you get to a low temperature and arrive at your (hopefully global, possibly local) minimum. If you’re not sufficiently satisfied with the result, try the process again, perhaps with a different temperature cooling schedule

The key to the simulated annealing method is in step 4: even if we’re considering a tour that is worse than the tour we already have, we still sometimes accept the worse tour temporarily, because it might be the stepping stone that gets us out of a local minimum and ultimately closer to the global minimum. The temperature is usually pretty high at the beginning of the annealing process, so that initially we’ll accept more tours, even the bad ones. Over time, though, we lower the temperature until we’re only accepting new tours that improve upon our solution.
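To make the proposal and acceptance logic concrete, here is a compact R sketch; the function names and the two-city swap neighborhood are illustrative choices, not code taken from the shiny-salesman source:

# Probability of accepting a candidate tour, given the current and
# candidate tour distances and the current temperature.
acceptance_prob <- function(curr_dist, cand_dist, temp) {
  if (cand_dist < curr_dist) return(1)   # step 3: always accept a shorter tour
  exp((curr_dist - cand_dist) / temp)    # step 4: sometimes accept a longer one
}

# One annealing iteration: propose a neighbor by swapping two random
# cities, then accept or reject it per the rule above.
annealing_step <- function(tour, dist_fn, temp) {
  cand <- tour
  idx <- sample(length(tour), 2)
  cand[idx] <- cand[rev(idx)]
  if (runif(1) < acceptance_prob(dist_fn(tour), dist_fn(cand), temp)) cand else tour
}

Calling annealing_step repeatedly while gradually lowering temp implements the cooling schedule in step 5.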

If you look at the bottom 2 graphs of the earlier USA animation, you can see that at the beginning the “Current Tour Distance” jumps all over the place while the temperature is high. As we turn the temperature down, we accept fewer longer tours and eventually we converge on the globally optimal tour.

What’s the point?

That’s all well and good, but why do we need the annealing step at all? Why not do the same process with 0 temperature, i.e. accept the new tour if and only if it’s better than the existing tour? It turns out if we follow this naive “hill climbing” strategy, we’re far more likely to get stuck in a local minimum. Histograms of the results for 1,000 trials of the traveling salesman through the state capitals show that simulated annealing fares significantly better than hill climbing:

Simulated annealing doesn’t guarantee that we’ll reach the global optimum every time, but it does produce significantly better solutions than the naive hill climbing method. The results via simulated annealing have a mean of 10,690 miles with standard deviation of 60 miles, whereas the naive method has mean 11,200 miles and standard deviation 240 miles.

And so, while you might not think that Nikolay Chernyshevsky, college football coaches, and Chief Wiggum would be the best people to offer an intuition behind simulated annealing, it turns out that these characters, along with cliche-spewers everywhere, understand the simple truth behind simulated annealing: sometimes things really do have to get worse before they can get better.

Make your own tour with the interactive Shiny app

Here’s the Shiny app that lets you pick up to 30 cities on the map, set some parameters of the annealing schedule, then run the actual simulated annealing process (or just click ‘solve’ if you’re lazy). Give it a shot below! Bonus points if you recognize where the default list of cities comes from…

The app is hosted at ShinyApps.io, which is currently in alpha testing, so I’m not entirely sure how reliable it will be. If you want to run the app on your local machine, it’s very easy: all you need to do is paste the following into your R console:

install.packages(c("shiny", "maps", "geosphere"), repos="http://cran.rstudio.com/")
library(shiny)
runGitHub("shiny-salesman", "toddwschneider")

Code on GitHub

The full code is available at https://github.com/toddwschneider/shiny-salesman

Around the World in 80,000 Miles

Here’s another animated gif using a bunch of world capitals. The “solution” here is almost certainly not the global optimum, but it’s still fun to watch!

The Design and Implementation of the FreeBSD Operating System, 2nd ed.

17 September 2014 - 1:00pm

The most complete, authoritative technical guide to the FreeBSD kernel’s internal structure has now been extensively updated to cover all major improvements between Versions 5 and 11. Approximately one-third of this edition’s content is completely new, and another one-third has been extensively rewritten.

Three long-time FreeBSD project leaders begin with a concise overview of the FreeBSD kernel’s current design and implementation. Next, they cover the FreeBSD kernel from the system-call level down–from the interface to the kernel to the hardware. Explaining key design decisions, they detail the concepts, data structures, and algorithms used in implementing each significant system facility, including process management, security, virtual memory, the I/O system, filesystems, socket IPC, and networking.

This Second Edition

• Explains highly scalable and lightweight virtualization using FreeBSD jails, and virtual-machine acceleration with Xen and Virtio device paravirtualization

• Describes new security features such as Capsicum sandboxing and GELI cryptographic disk protection

• Fully covers NFSv4 and Open Solaris ZFS support

• Introduces FreeBSD’s enhanced volume management and new journaled soft updates

• Explains DTrace’s fine-grained process debugging/profiling

• Reflects major improvements to networking, wireless, and USB support

Readers can use this guide as both a working reference and an in-depth study of a leading contemporary, portable, open source operating system. Technical and sales support professionals will discover both FreeBSD’s capabilities and its limitations. Applications developers will learn how to effectively and efficiently interface with it; system administrators will learn how to maintain, tune, and configure it; and systems programmers will learn how to extend, enhance, and interface with it.

Marshall Kirk McKusick writes, consults, and teaches classes on UNIX- and BSD-related subjects. While at the University of California, Berkeley, he implemented the 4.2BSD fast filesystem. He was research computer scientist at the Berkeley Computer Systems Research Group (CSRG), overseeing development and release of 4.3BSD and 4.4BSD. He is a FreeBSD Foundation board member and a long-time FreeBSD committer. Twice president of the Usenix Association, he is also a member of ACM, IEEE, and AAAS.

George V. Neville-Neil hacks, writes, teaches, and consults on security, networking, and operating systems. A FreeBSD Foundation board member, he served on the FreeBSD Core Team for four years. Since 2004, he has written the “Kode Vicious” column for Queue and Communications of the ACM. He is vice chair of ACM’s Practitioner Board and a member of Usenix Association, ACM, IEEE, and AAAS.

Robert N.M. Watson is a University Lecturer in systems, security, and architecture in the Security Research Group at the University of Cambridge Computer Laboratory. He supervises advanced research in computer architecture, compilers, program analysis, operating systems, networking, and security. A FreeBSD Foundation board member, he served on the Core Team for ten years and has been a committer for fifteen years. He is a member of Usenix Association and ACM.

Commander Keen source code released

16 September 2014 - 7:00pm
README.md

This repository contains the source for Commander Keen in Keen Dreams. It is released under the GNU GPLv2+. See LICENSE for more details.

The release of the source code does not affect the licensing of the game data files, which you must still legally acquire. This includes the static data included in this repository for your convenience. However, you are permitted to link and distribute that data for the purposes of compatibility with the original game.

This release was made possible by a crowdfunding effort. It is brought to you by Javier M. Chavez and Chuck Naaden with additional support from:

  • Dave Allen
  • Kirill Illenseer
  • Michael Jurich
  • Tom Laermans
  • Jeremy Newman
  • Braden Obrzut
  • Evan Ramos
  • Sam Schultz
  • Matt Stath
  • Ian Williams
  • Steven Zakulec
  • et al

Boeing-SpaceX Team Split Space Taxi Award

16 September 2014 - 7:00pm

Boeing Co. (BA) and Elon Musk’s Space Exploration Technologies Corp. will share a multibillion-dollar federal contract to help restart U.S. manned spaceflights and reduce reliance on Russian rockets, a congressional leader said.

The two companies will split the award being unveiled by the National Aeronautics and Space Administration later today, said Representative Eddie Bernice Johnson of Texas, the senior Democrat on the U.S. House Science Committee. NASA is planning an announcement on the program at 4 p.m. in Washington.

The funding plan caps a competition for the right to build the first U.S. manned craft since NASA retired the space shuttle fleet in 2011. The agency now uses Russia’s Soyuz rockets to get people to the International Space Station, an arrangement that costs about $70 million a seat and is entangled in tensions with President Vladimir Putin over the crisis in Ukraine.

Boeing, SpaceX and a third contender, closely held Sierra Nevada Corp., declined to comment before NASA’s announcement, spokesmen said.

NASA is charting a new direction 45 years after sending humans to the moon, looking to private industry to take over human missions near Earth with reusable craft. Commercial operators would develop space tourism while the space agency focuses on far-off missions to Mars or asteroids.

Spending on the Commercial Crew program may reach $3.42 billion through the end of the decade, according to budget documents posted on the agency’s website. NASA sought $848.3 million for the commercial spaceflight program in fiscal 2015.

The contract advances Musk’s ambitions for his Hawthorne, California-based SpaceX, the first private company to deliver cargo to the space station, to become a force in the global aerospace industry. Musk, 43, who also leads electric-car maker Tesla Motors Inc. (TSLA), has set an ultimate goal of sending astronauts to Mars.

SpaceX’s Dragon v2 capsule, which seats seven, was designed with an eye to interplanetary travel, able to land vertically anywhere on Earth “with the precision of a helicopter,” according to the company’s website, instead of parachuting into the ocean like early U.S. spacecraft in the 1960s and ’70s.

Boeing’s seven-passenger CST-100 has roots in the Apollo lunar-missions era, and its return to Earth would be cushioned by air bags and parachutes, according to the company’s website. Chicago-based Boeing is the only competitor to complete all of NASA’s design milestones on time.

To contact the reporters on this story: Julie Johnsson in Chicago at jjohnsson@bloomberg.net; James Rowley in Washington at jarowley@bloomberg.net

To contact the editors responsible for this story: Ed Dufner at edufner@bloomberg.net; Katherine Rizzo at krizzo5@bloomberg.net; John Lear
