Will Content Ever be Profitable?

THE CURRENT WORRIES

1. Content Suppliers

The Ethos of Free Content

Content suppliers are the underprivileged sector of the Internet. They all lose money (even sites which offer basic, standardized goods – books, CDs), with the exception of sites proffering sex or tourism. No user seems to be grateful for the effort and resources invested in creating and distributing content. The recent breakdown of traditional roles (between publisher and author, record company and singer, etc.) and the direct access the creative artist is gaining to his paying public may change this attitude of ingratitude – but hitherto there are scarce signs of that. Moreover, the choice is stark: either quality of presentation (which only a publisher can afford) or ownership of the content, with its (often shoddy) dissemination, by the author. A truly high-quality, fully commerce-enabled site costs up to 5,000,000 USD, excluding site maintenance and customer and visitor services. Despite these heavy outlays, site designers are constantly criticized for lack of creativity – or for too much creativity. More and more is asked of content purveyors and creators. They are exploited by intermediaries, hitchhikers and other parasites. This is all an off-shoot of the ethos of the Internet as a free content area.

Most users like to surf (browse, visit sites) the net with no reason or goal in mind. This makes it difficult to apply traditional marketing techniques to the web.

What is the meaning of “targeted audiences” or “market shares” in this context? If a surfer visits sites dealing with aberrant sex and nuclear physics in the same session – what is one to make of it?

Moreover, the public and legislative backlash against the gathering of surfers’ data by Internet ad agencies and other web sites has led to growing ignorance regarding the profile of Internet users – their demography, habits, preferences and dislikes.

“Free” is a key word on the Internet: the network used to belong to the US Government and to a bunch of universities. Users like information, with emphasis on news and data about new products. But they do not like to shop on the net – yet. Only 38% of all surfers made a purchase during 1998.

It would seem that users will not pay for content unless it is unavailable elsewhere or qualitatively rare or made rare. One way to “rarefy” content is to review and rate it.

2. Quality-Rated Content

There is a long-term trend of clutter-breaking website rating and critique. It may have a limited influence on the consumption decisions of some users and on their willingness to pay for content. Browsers already sport “What’s New” and “What’s Hot” buttons. Most search engines and directories recommend specific sites. But users are still cautious. Studies have discovered that no user, no matter how heavy, has consistently re-visited more than 200 sites – a minuscule number. Recommendation services often produce random – at times, wrong – selections for their users. There are also concerns regarding privacy issues. The backlash against Amazon’s “readers circles” is an example. Web critics, who work today mainly for the printed press, publish their wares on the net and collaborate with intelligent software which hyperlinks to web sites, recommends them and refers users to them. Some web critics (guides) became identified with specific applications – really, expert systems – which incorporate their knowledge and experience. Most volunteer-based directories (such as the “Open Directory” and the late “Go” directory) work this way.

The flip side of the coin of content consumption is investment in content creation, marketing, distribution and maintenance.

3. The Money

Where is the capital needed to finance content likely to come from?

Again, there are two schools:

According to the first, sites will be financed through advertising – and so will search engines and other applications accessed by users.

Certain ASPs (Application Service Providers which rent out access to application software which resides on their servers) are considering this model.

The recent collapse in online advertising rates and click-through rates has raised serious doubts regarding the validity and viability of this model. Marketing gurus, such as Seth Godin, went as far as declaring “interruption marketing” (that is, ads and banners) dead.

The second approach is simpler and allows for the existence of non-commercial content.

It proposes to collect negligible sums (cents or fractions of cents) from every user for every visit (“micro-payments”). These accumulated cents will enable site owners to update and maintain their sites and will encourage entrepreneurs to develop new content and invest in it. Certain content aggregators (especially of digital textbooks) have adopted this model (Questia, Fathom).
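To make the micro-payment arithmetic concrete, here is a minimal sketch in Python; the per-visit fee and the traffic figure are hypothetical, chosen only to show how fractions of a cent compound under heavy traffic:

    # Hypothetical illustration of the micro-payment model: negligible
    # per-visit fees accumulate into meaningful revenue at high volumes.
    fee_per_visit = 0.002      # 0.2 of a cent, in USD (assumed figure)
    visits_per_day = 250_000   # assumed traffic of a popular site

    daily_revenue = fee_per_visit * visits_per_day   # 500.0
    annual_revenue = daily_revenue * 365             # 182,500.0

    print(f"Daily revenue:  ${daily_revenue:,.2f}")
    print(f"Annual revenue: ${annual_revenue:,.2f}")

At these assumed rates, a fee too small for any user to notice still yields some 180,000 USD a year – the force of the argument lies entirely in the volume.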

The adherents of the first school point to the 5 million USD invested in advertising during 1995 and to the 60 million or so invested during 1996.

Its opponents point to exactly the same numbers: ridiculously small when contrasted with more conventional advertising modes. “The potential of advertising on the net is limited to 1.5 billion USD annually”, thundered the pessimists in 1998. The actual figure was double that prediction – but still woefully small and inadequate to support the Internet’s content development. Compare these figures to the sale of Internet software (4 billion USD), Internet hardware (3 billion USD), and Internet access provision (4.2 billion USD in 1995 alone!).

Even if online advertising were restored to its erstwhile glory, other bottlenecks would remain. Advertising encourages the consumer to interact and to initiate the delivery of a product to him. This – the delivery phase – is a slow and enervating epilogue to the exciting affair of ordering online. Too many consumers still complain of late delivery of the wrong or defective products.

The solution may lie in the integration of advertising and content. The late PointCast, for instance, integrated advertising into its news broadcasts, which were continuously streamed to the user’s screen even when the machine was idle (an active screen saver and ticker built on “push technology”). Downloading of digital music, video and text (e-books) leads to the immediate gratification of consumers and increases the efficacy of advertising.

Whatever the case may be, a uniform, agreed-upon system of rating as a basis for charging advertisers is sorely needed. There is also the question of what, exactly, the advertiser pays for. The rates of many advertisers (Procter & Gamble, for instance) are based not on the number of hits or impressions (entries, visits to a site) – but on the number of times their advertisement was viewed (page views) or clicked through.
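To illustrate the difference between the two charging bases, a minimal sketch; the rates and counts are hypothetical assumptions (not actual Procter & Gamble figures), chosen only to show how the basis of payment changes the bill:

    # Hypothetical comparison of impression-based vs click-based pricing.
    impressions = 1_000_000      # assumed page views carrying the ad
    click_through_rate = 0.005   # assumed: 0.5% of impressions are clicked

    cpm = 5.00    # assumed cost per 1,000 impressions, in USD
    cpc = 0.50    # assumed cost per click-through, in USD

    cost_by_impression = (impressions / 1000) * cpm   # 5,000.0
    clicks = impressions * click_through_rate         # 5,000 clicks
    cost_by_click = clicks * cpc                      # 2,500.0

    print(f"Charged per impression: ${cost_by_impression:,.2f}")
    print(f"Charged per click:      ${cost_by_click:,.2f}")

Under the click-based model the advertiser pays only for demonstrated interest – which is precisely why falling click-through rates hurt this model first.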

Finally, there is the paid subscription model – a flop, to judge by the meagre number of venerable and leading newspapers whose sites operate on a subscription basis: Dow Jones (The Wall Street Journal) and The Economist. Only two.

All this is not very promising. But one should never forget that the Internet is probably the closest thing we have to an efficient market. As consumers refuse to pay for content, investment will dry up and content will become scarce (through closures of web sites). As scarcity sets in, consumers may reconsider.

This article deals with the future of the Internet as a medium. Will it be able to support its content creation and distribution operations economically?

If the Internet is a budding medium – then we should derive great benefit from a study of the history of its predecessors.

The Future History of the Internet as a Medium

The Internet is simply the latest in a series of networks which revolutionized our lives. A century before the Internet, the telegraph, the railways, the radio and the telephone were similarly heralded as “global” and transforming. Every medium of communications goes through the same evolutionary cycle:

Anarchy

The Public Phase

At this stage, the medium and the resources attached to it are very cheap, accessible, and under no regulatory constraints. The public sector steps in: higher education institutions, religious institutions, government, not-for-profit organizations, non-governmental organizations (NGOs), trade unions, etc. Bedeviled by limited financial resources, they regard the new medium as a cost-effective way of disseminating their messages.

The Internet was not exempt from this phase, which ended only a few years ago. It started with complete computer anarchy manifested in ad hoc networks, local networks, and networks of organizations (mainly universities and organs of the government, such as DARPA, a part of the defence establishment in the USA). Non-commercial entities jumped on the bandwagon and started sewing these networks together (an activity fully subsidized by government funds). The result was a globe-encompassing network of academic institutions. The American Pentagon established the network of all networks, the ARPANET. Other government departments joined the fray, headed by the National Science Foundation (NSF), which withdrew from the Internet only lately.

The Internet (with a different name) became semi-public property – with access granted to the chosen few.

Radio took precisely this course. Radio transmissions started in the USA in 1920. Those were anarchic broadcasts with no discernible regularity. Non-commercial and not-for-profit organizations began their own broadcasts and even created radio broadcasting infrastructure (albeit of the cheap and local kind) dedicated to their audiences. Trade unions, certain educational institutions and religious groups commenced “public radio” broadcasts.

The Commercial Phase

When the users (e.g., listeners in the case of the radio, or owners of PCs and modems in the case of the Internet) reach a critical mass – the business sector is alerted. In the name of capitalist ideology (another religion, really) it demands the “privatization” of the medium. This harps on very sensitive strings in every Western soul: the efficient allocation of resources which is the result of competition. Corruption and inefficiency are intuitively associated with the public sector (“Other People’s Money” – OPM). This, together with the ulterior motives of members of the ruling political echelons (the infamous American paranoia), a lack of variety and of catering to the tastes and interests of certain audiences, and the automatic equation of private enterprise with democracy, leads to the privatization of the young medium.

The end result is the same: the private sector takes over the medium from “below” (makes offers to the owners or operators of the medium that they cannot possibly refuse) – or from “above” (successful lobbying in the corridors of power leads to the appropriate legislation and the medium is “privatized”). Every privatization – especially that of a medium – provokes public opposition. There are (usually well-founded) suspicions that the interests of the public are compromised and sacrificed on the altar of commercialization and ratings. Fears of monopolization and cartelization of the medium are evoked – and proven correct in due course – as are fears of the concentration of control of the medium in a few hands. All these things do happen – but the pace is so slow that the initial fears are forgotten and public attention reverts to fresher issues.

A new Communications Act was enacted in the USA in 1934. It was meant to transform radio frequencies into a national resource, to be sold to the private sector, which was supposed to use them to transmit radio signals to receivers. In other words: the radio was passed into private and commercial hands. Public radio was doomed to be marginalized.

The American administration withdrew from its last major involvement in the Internet in April 1995, when the NSF ceased to finance some of the networks and, thus, privatized its hitherto heavy involvement in the net.

A new Communications Act was enacted in 1996. It permitted “organized anarchy”: it allowed media operators to invade each other’s territories. Phone companies were allowed to transmit video, and cable companies were allowed to transmit telephony, for instance. This was all phased in over a long period of time – still, it was a revolution whose magnitude is difficult to gauge and whose consequences defy imagination. It carries an equally momentous price tag – official censorship. “Voluntary censorship”, to be sure, with somewhat toothless standardization and enforcement authorities, to be sure – still, a censorship with its own institutions to boot. The private sector reacted by threatening litigation – but, beneath the surface, it is caving in to pressure and temptation, constructing its own censorship codes both in the cable and in the Internet media.

Institutionalization

This phase is the next in the Internet’s history, though, it seems, few realize it.

It is characterized by enhanced legislative activity. Legislators, on all levels, discover the medium and lurch at it passionately. Resources which were considered “free” are suddenly transformed into “national treasures not to be dispensed with cheaply, casually and with frivolity”.

It is conceivable that certain parts of the Internet will be “nationalized” (for instance, in the form of a licensing requirement) and tendered to the private sector. Legislation will be enacted to deal with permitted and disallowed content (obscenity? incitement? racial or gender bias?). No medium in the USA (not to mention the wider world) has eschewed such legislation. There are sure to be demands to allocate time (or space, or software, or content, or hardware) to “minorities”, to “public affairs”, to “community business”. This is a tax that the business sector will have to pay to fend off the eager legislator and his nuisance value.

All this is bound to lead to a monopolization of hosts and servers. The important broadcast channels will diminish in number and be subjected to severe content restrictions. Sites which refuse to succumb to these requirements will be deleted or neutralized. Content guidelines (a euphemism for censorship) exist, even as we write, in all major content providers (CompuServe, AOL, Yahoo!-Geocities, Tripod, Prodigy).

The Bloodbath

This is the phase of consolidation. The number of players is severely reduced. The number of browser types will settle on 2-3 (Netscape, Microsoft and Opera?). Networks will merge to form privately owned mega-networks. Servers will merge to form hyper-servers run on supercomputers in “server farms”. The number of ISPs will be considerably cut. 50 companies ruled the greater part of the media markets in the USA in 1983. The number in 1995 was 18. At the end of the century they will number 6.

This is the stage when companies – fighting for financial survival – strive to acquire as many users/listeners/viewers as possible. The programming is shallowed to the lowest (and widest) common denominator. Shallow programming dominates as long as the bloodbath proceeds.

From Rags to Riches

Tough competition produces four processes:

1. A Major Drop in Hardware Prices

This happens in every medium but it doubly applies to a computer-dependent medium, such as the Internet.

Computer technology seems to abide by “Moore’s Law”, which says that the number of transistors which can be put on a chip doubles every 18 months. As a result of this miniaturization, computing power quadruples every 18 months and an exponential series ensues. Organic-biological DNA computers, quantum computers, chaos computers – prompted by vast profits and spawned by inventive genius – will ensure the continued applicability of Moore’s Law.
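Formalizing the text’s own figures (the quadrupling of computing power is the author’s formulation; the more common corollary of Moore’s Law has power merely doubling), with t measured in months:

    T(t) = T_0 \cdot 2^{t/18}    (transistors per chip)
    P(t) = P_0 \cdot 4^{t/18}    (computing power, per the text)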

The Internet is also subject to “Metcalfe’s Law”.

It says that when we connect N computers to a network, we get an increase of N to the second power in its processing power. And these N computers become more powerful every year, according to Moore’s Law. The growth of computing power in networks is thus a multiple of the effects of the two laws: more and more computers, with ever-increasing computing power, get connected and create an exponential, sixteenfold growth in the network’s computing power every 18 months.
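One way to reconstruct the sixteenfold figure – assuming (an assumption not explicit in the text) that the number of connected machines N doubles every 18 months while each machine’s power p quadruples, and taking network power to scale as P_{net} = N^2 \cdot p per the text’s reading of Metcalfe’s Law:

    \frac{P_{net}(t+18)}{P_{net}(t)} = \left(\frac{2N}{N}\right)^2 \cdot \frac{4p}{p} = 4 \times 4 = 16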

2. Content Related Fees

Free content was prevalent on the Net until recently. Even potentially commercial software can still be downloaded for free. In many countries television viewers still pay for television broadcasts – but in the USA and many other Western countries, the basic package of television channels comes free of charge.

As users/consumers form a habit of using (or consuming) the software, it is commercialized and begins to carry a price tag. This is what happened with the advent of cable television: content is sold for subscription or per-usage (Pay Per View – PPV) fees.

Gradually, this is what will happen to most of the sites and software on the Net. Those which survive will begin to collect usage fees, access fees, subscription fees, downloading fees and other, appropriately named, fees. These fees are bound to be low – but it is the principle that counts. Even a few cents per transaction may accumulate to hefty sums, given the traffic which characterizes some web sites (or, at least, the Net’s more popular locales).

3. Increased User Friendliness

As long as the computer is less user-friendly and less reliable (predictable) than television – less of a black box – its potential (and its future) is limited. Television attracts 3.5 billion users daily. The Internet stands to attract – under the most exuberant scenario – less than one tenth of that number. The only reasons for this disparity are (the lack of) user-friendliness and reliability. Even browsers – among the most user-friendly applications ever – are not sufficiently so. The user still needs to know how to use a keyboard and must possess some basic acquaintance with the operating system. The more mature the medium becomes, the friendlier it gets. Finally, it will be operated using speech or common language. There will be room left for user “hunches” and built-in flexible responses.

4. Social Taxes

Sooner or later, the business sector has to mollify the god of public opinion with offerings of a political and social nature. The Internet is an affluent, educated, yuppie medium. It requires literacy and numeracy, a live interest in information and its various uses (scientific, commercial, other), and a lot of resources (free time, money to invest in hardware, software and connect time). It empowers – and thus deepens the divide between the haves and have-nots, the developed and the developing world, the knowing and the ignorant, the computer literate and the computer illiterate.

In short: the Internet is an elitist medium. Publicly, this is an unhealthy posture. “Internetophobia” is already discernible. People (and politicians) talk about how unsafe the Internet is and about its possible uses for racist, sexist and pornographic purposes. The wider public is in a state of awe.

So, site builders and owners will do well to begin to improve their image: provide free access to schools and community centres, bankroll Internet literacy classes, freely distribute content and software to educational institutions, and collaborate with researchers, social scientists and engineers. In short: encourage the view that the Internet is a medium catering to the needs of the community and the underprivileged, a mostly altruistic endeavour. This also happens to make good business sense, by educating and conditioning a future generation of users. He who visited a site free of charge as a student will pay to do so when he becomes an executive. Such a user will also pass on the information within and without his organization. This is called media exposure. The future will, no doubt, witness public Internet terminals, subsidized ISP accounts, free Internet classes and an alternative “non-commercial, public” approach to the Net. This may prove to be one more source of revenue for content creators and distributors.
