Perhaps he should examine some of the Facebook publicly available scans that have been done by "academic researchers", e.g., in South Korea. It is not rocket science to crawl Facebook.
Facebook's data is accessible, as is any website's back-end data. You are kidding yourself if, like the author, you think you can put something like Facebook on the public internet and have it be "private". If people with the skills -- and there are plenty of them; surprise, they don't all work at Facebook -- put in the effort, they can get the data. All they need is a reason or motivation to do so.
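To illustrate the crawling claim: the core of any such crawl is just a breadth-first traversal of the public friend graph. The sketch below is a toy, not Facebook's actual API; the fetch function and the graph data are entirely invented, and a real crawler would do HTTP requests and HTML parsing where the stand-in dict lookup is.

```python
from collections import deque

# Stand-in for fetching a public profile page; the graph below is
# invented purely for illustration.
FAKE_PAGES = {
    "alice": ["bob", "carol"],
    "bob": ["alice", "dave"],
    "carol": ["alice"],
    "dave": ["bob"],
}

def fetch_friend_list(user):
    """In a real crawler: an HTTP GET plus parsing the friend list."""
    return FAKE_PAGES.get(user, [])

def crawl(seed):
    """Breadth-first traversal: visit every profile reachable from seed."""
    seen = {seed}
    queue = deque([seed])
    while queue:
        user = queue.popleft()
        for friend in fetch_friend_list(user):
            if friend not in seen:
                seen.add(friend)
                queue.append(friend)
    return seen

print(sorted(crawl("alice")))  # every reachable profile
```

Rate limiting, session cookies, and parsing are the tedious parts; the algorithm itself is a first-year exercise, which is the point.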
But let's have some more embarrassing "tech journalism" (uninformed pontification) from a once-respected journal. Put more nails in the coffin of the notion of "professional journalism".
Google is really good at seamlessly integrating the companies it acquires, giving users the impression that Google develops far more in-house than it actually does. The Google brand becomes synonymous with work that Google itself did not do.
I think what he means is that getting funded has been commoditized.
But that will not continue indefinitely. And that is because eventually someone will want to know how the company plans to make money. When the company cannot tell them, they stop investing. And it spreads back up the chain. The whole thing collapses.
I'll bet she was thinking of the data mining and analysis possibilities opened up by allowing users to input information in the form of "notes". Of course, that's just a guess.
Let's look at this in simple terms. Let's look for evidence of a "business model".
Website becomes popular.
Website gets personal details from users: email, maybe photos, address, etc., plus it gets a list of "friends" for each user.
Website shows display ads.
(Website gives birth to games company. Games company goes public then loses half its value in two months.)
Does website have a "business model"?
Wait, we're not done.
Website monitors everything users do on the website as well as, to the extent they can, the other websites users visit, using web beacons scattered across the web (Like buttons, aka "Facebook Connect").
Website shows display ads.
Do we have a business model yet?
But wait, there's more.
Website goes public and raises a heap of easy cash.
Website acquires a web browser, produces a mobile phone and begins monitoring everything users do on the web and every conversation they have with their friends.
Website shows display ads.
Do we have a business model yet?
Or is this just lots and lots of spying, information collection and dreaming that this is somehow useful for business?
Display ads are not a business model that will grow a business, unless the business is itself a display ad company.
(And Google already acquired those guys years ago.)
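The "web beacon" monitoring step above amounts to something very simple: joining a cookie ID to every page that embeds the beacon, click or no click. A minimal sketch, with an invented hit log (the cookie IDs and URLs below are made up for illustration):

```python
from collections import defaultdict

# Hypothetical beacon log: (cookie_id, page_url) pairs -- the kind of
# record a third-party "Like" button can emit on every page load,
# whether or not the visitor ever clicks it.
hits = [
    ("u1", "news.example/article"),
    ("u1", "shop.example/shoes"),
    ("u2", "news.example/article"),
    ("u1", "shop.example/shoes"),
]

def browsing_profiles(hits):
    """Aggregate beacon hits into a per-cookie set of visited pages."""
    profiles = defaultdict(set)
    for cookie, url in hits:
        profiles[cookie].add(url)
    return dict(profiles)

profiles = browsing_profiles(hits)
```

That's the whole trick: the collection is trivial; the open question in this thread is whether the resulting pile of profiles is actually a business.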
They made money in the past, as more and more users kept signing up and using the site (more new eyeballs on the display ads). We're looking to the future now. What happens when the number of new users starts to slow down? As a public company, there is increased pressure to perform and to grow. Increasing the enormous net worth of the CEO and some insiders and then gradually fizzling out is not, one would think, the objective of a corporation that goes public. The usual reason companies raise money through IPOs is to grow the company.
The ideal to strive for is what we call the "Goldilocks effect".
Not knowing too little. Not knowing too much. Knowing just enough to get the job done.
In computing there will always be multiple solutions, and tradeoffs. What you want to aim for is "The simplest solution possible."
In my experience, this is really hard for most developers.
Maybe the smart thing to do is find someone who makes it look easy and trust their judgment. Again, my experience is that most developers are reluctant to do this.
The question is not how much someone knows, it's whether they know "enough" to get the job done. Goldilocks.
Are we sure that "=" and not "==" is the correct operator?
If "==" was appropriate then there would be no problem of ambiguity. And there would be no need for the clarification.
Because meanings could never change with context. There could be no "misinterpretation". Only the truth table result of "false".
Which is more important in human communication: case-sensitivity or context-sensitivity?
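For what it's worth, the distinction the thread is circling reads cleanly in code: in Python, for instance, "=" is a statement that binds a name, while "==" is an expression with a truth value. A trivial illustration:

```python
x = 5               # "=" binds a name: a statement, it yields no value
is_five = (x == 5)  # "==" compares: an expression, True or False

# Same glyph family, different meaning -- and it is the surrounding
# grammar (the context), not the symbol alone, that disambiguates.
print(is_five)
```

Which rather supports the comment above: even in a formal language, context decides what the symbol means.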
It's a little like advertising perhaps. Advertising has become more of an art form than a tool. Awards given for achievement in creating ads are based on the perceived artistic value of an ad, not on its market effectiveness.
Web development is viewed as an art form by web developers. The web is not a tool to them. It's a canvas.
But the reality is that for many users in many cases, the web is a tool. They just want to accomplish some task, and they are not going to pay attention to artistic value.
Maybe a good example is Amazon. Many web developers criticise the site's design. But Amazon is doing just fine. Because users do not visit Amazon for an "experience". They visit it to buy things.
Maybe there should be two versions of every website: 1. an artistic one aimed at "user experience", where the developer could display their design skills, and 2. another aimed at getting some task(s) done, quickly and efficiently. The latter might follow some universal standard. No thinking involved in its "design", just following a spec.
The user could choose. The problem for the author of this blog post was she had no choice.
At the level that Camper (or Amazon) is operating on, the marketing department holds all the cards when it comes to the web site. The web devs mostly decide how to implement, but they're operating at the behest of marketing.
In general, do marketing people know how to create websites?
If not, how can they even know what is possible to create using HTML, CSS, etc.?
If the answer is "they look at what the competition is doing," then how did the marketers at the competitor know what is possible?
It has to start somewhere. Who was behind the web back in 1993? Marketing departments? Are marketers the ones who know what can be done with HTML, etc., and what cannot?
If a marketing department asks a web developer to implement something that the developer knows will be an annoyance to end users, and then he decides to tell them it is not possible, does the marketing department not accept this answer? "Look, we know how to make websites, we know this can be done and we'd do it ourselves if we had the time, but we're busy doing marketing. Either you do your job and build this site as we ask, or we'll find someone else."
So, at some stage, some web developer somewhere makes a decision.
I remember reading the confession of a talented developer who wrote, using a mini Scheme interpreter, stealth malware to serve pop-ups. His skills were so good that he could disable all competing malware; the competition was helpless. The NY Attorney General later shut down his employer on consumer-protection grounds. The developer was not a typical malware author, and he knew what he was doing was wrong, but his excuse for working for this outfit was that he needed a job.
Without that developer making a choice, the malware company would never have known that what he accomplished was possible. The mini Scheme interpreter, the self-modifying code, and the disabling of all competing malware were not in his "job description". He showed them what was possible. And surely they loved him for it. But how about the users infected with the malware, who had to see his employer's pop-ups every day with no way to "turn it off"? What would they think of his work?
> how can they even know what is possible to create using HTML, CSS, etc.?
Anything is possible. It doesn't mean a particular idea is good (see: the topic website), but anything is certainly possible.
> It has to start somewhere. Who was behind the web back in 1993? Marketing departments?
For big companies? Yes.
> Are marketers the ones who know what can be done with HTML, etc., and what cannot?
Implementation is beside the point. Even if the Camper "experience" in the original link loaded quickly and was implemented perfectly, it would be bad.
> If a marketing department asks a web developer to implement something that the developer knows will be an annoyance to end users, and then he decides to tell them it is not possible, does the marketing department not accept this answer?
This is the difference between a good marketing department and a bad one. The good ones will take the feedback and the bad ones won't. It's also the difference between a good organization and a bad one -- if the org makes it a habit not to talk to engineers until the idea has gone through revision after revision, UX, etc, then there's too much inertia to overcome (say, 3 months of designing, UX development, intended to be launched in tandem with a meatspace campaign, as an example).
For giant companies, the web site is a piece of their action, and often not the largest piece. The web team (the ones who implement) are pinned to the timelines of other rollouts (in-store campaigns, billboards, magazine ads, TV ads, and so forth). So while a certain idea might not be best, there may not be time to change it -- or (consider this) the web experience might not be the most important to a company that does 80% of its volume in meatspace.
Thinking that the web site & web team should be the gatekeepers of customer experience in a multichannel business that isn't focused online is a myopic view. In spirit I'm right there with you dude, but in practice (can you tell I've worked at giant companies?) it doesn't work that way.
If you're with me in spirit, I take that to mean I'm not "wrong", I'm just unrealistic, a dreamer, etc.
I think web developers have a lot of power over how the web operates. Much more than marketing departments.
In the spirit of making money, I'm right there with you. Web developers have to eat.
But the idea that the web's usability, or unusability (what the blog post described), is out of developers' hands, and solely in the hands of marketing departments -- I don't buy it. Marketing has the budget; it does not have the skills, or even the knowledge.
I see numerous examples year after year that show that both large and small companies do not have the first clue how stuff works or what the implications are on end users. Developers present them with a proposition and the company writes a check. When some egregious practice comes to the attention of the press, the companies often have no idea what they were even paying for -- they do not understand what was being done.
One need only look at SEO and the types of websites it produces. It's quite a stretch to try to hold marketing departments responsible for this state of affairs.
What's the sense in spewing out endless chunks of code when no one (even the authors later on) can take the time to read it and understand it?
Why not review the code that's already been written (like the open source code Apple and other walled gardens rely on to build their systems)? This is the sort of tinkering that will help us develop alternatives.
To find the doors, and build new ones, we have to read old code, not simply write new code.