Friday, June 6, 2014

The History of HTML

June 06, 2014
A markup language combines text with coded instructions on how to format that text. The term "markup" originates from the traditional practice of 'marking up' the margins of a paper manuscript with printer's instructions. Nowadays, however, if you mention the term 'markup' to any knowledgeable web author, the first thing they are likely to think of is 'HTML'.
 
HTML —which is short for HyperText Markup Language— is the official language of the World Wide Web and was first conceived in 1990. HTML is a product of SGML (Standard Generalized Markup Language) which is a complex, technical specification describing markup languages, especially those used in electronic document exchange, document management, and document publishing. HTML was originally created to allow those who were not specialized in SGML to publish and exchange scientific and other technical documents. HTML especially facilitated this exchange by incorporating the ability to link documents electronically using hyperlinks. Thus the name Hypertext Markup Language.
However, those outside the discipline of scientific documentation quickly realized that HTML was relatively easy to learn, was self-contained and lent itself to a number of other applications. With the evolution of the World Wide Web, HTML began to proliferate and quickly spilled over into the mainstream.



Soon, companies began creating browsers, the software required to view an HTML document (i.e., a web page), and as the web gained popularity, competing browsers appeared. It may surprise some that back in late 1995, Netscape, which now runs a distant second to the King Kong of browsers, Internet Explorer, was the dominant browser on the market. In fact, Netscape was the first browser to support JavaScript, animated GIFs and HTML frames.
Thus began the so-called 'browser wars' and, along with seeing who could implement more 'bells and whistles' than the other guy, browser makers also began inventing proprietary HTML elements that only worked with their browsers. Some examples are the <marquee>...</marquee> tags (scrolling text), which originally worked only in Internet Explorer, and the <blink>...</blink> tags (blinking text), which still work only in Gecko-based browsers such as Firefox.

A side effect of all this competition was that HTML became fragmented and web authors soon found that their web pages looked fine in one browser but not in another. Hence it became increasingly difficult and time consuming to create a web page that would display uniformly across a number of different browsers. (This phenomenon remains to some extent to this very day.)



Meanwhile, an organization known as the World Wide Web Consortium (W3C for short) was working steadily along in the background to standardize HTML. Several recommendations were published by the W3C during the late 1990s which represented the official versions of HTML and provided an ongoing comprehensive reference for web authors. Thus the birth of HTML 2.0 in September 1995, HTML 3.2 in January 1997 and HTML 4.01 in December 1999.
By now, Internet Explorer (IE) had eclipsed Netscape Navigator as the browser of choice for surfing the net, partly due to its superior capabilities but largely because IE came bundled with the Windows operating system. Essentially, when people bought computers running Windows, the machines came with the 'internet installed'. This suited people just fine, since the typical newcomer to computers was tentatively taking on an intimidating new-fangled technology crammed to the rafters with indecipherable acronyms, help files that made no sense and buggy programs. Hence, the more 'instant' solutions this new technology offered, the better.


As the World Wide Web approached adulthood, hosting a wide variety of would-be and professional web page authors, it became increasingly apparent that cyberspace was filling up with a lot of badly written HTML.

This was due partly to laziness and inexperience, but it was also the product of another instant solution: web authoring tools, most particularly WYSIWYG editors, which tended to produce bloated and messy source code. As the browser wars continued (although by now it was pretty much a massacre), the lead browser had developed capabilities akin to a junkyard dog that could gobble up any half-baked web page it came across. This was all very fine and well, but the resources (program source code, RAM on the user's computer, etcetera) required to run a browser that can consume just about anything were exorbitant compared to what they could be. And as the market dictated the shape of things to come, future browsers were bound to follow the lead dog, thus encouraging more junk code to fill up the web.

To remedy this situation, the W3C came up with a more regimented form of HTML, with the intention of creating a rigid standard to which web authors were encouraged to conform. This supported an effort to eventually 'clean up' or streamline the World Wide Web and ultimately replace presentational elements such as <font> with a separate document structure known as Cascading Style Sheets (CSS). In theory, once this transformation occurred, the web would place less demand on the next generation of web browsers, and in particular it would accommodate the low processing power of new portable devices such as PDAs. Hence the birth of the next generation of HTML, called XHTML, the 'X' signifying that this version of HTML was based on XML (eXtensible Markup Language) instead of SGML.

The History of JavaScript

June 06, 2014

JavaScript, not to be confused with Java, was created in 10 days in May 1995 by Brendan Eich, then working at Netscape and now at Mozilla. JavaScript was not always known as JavaScript: the original name was Mocha, a name chosen by Marc Andreessen, co-founder of Netscape. In September of 1995 the name was changed to LiveScript; then in December of the same year, upon receiving a trademark license from Sun, the name JavaScript was adopted. This was somewhat of a marketing move at the time, with Java being very popular around then.
In 1996 - 1997 JavaScript was taken to ECMA to carve out a standard specification, which other browser vendors could then implement based on the work done at Netscape. The work done over this period of time eventually led to the official release of ECMA-262 Ed.1: ECMAScript is the name of the official standard, with JavaScript being the most well known of the implementations. ActionScript 3 is another well-known implementation of ECMAScript, with extensions (see below).
The standards process continued in cycles, with releases of ECMAScript 2 in 1998 and ECMAScript 3 in 1999, which is the baseline for modern day JavaScript. The "JS2" or "original ES4" work led by Waldemar Horwat (then of Netscape, now at Google) started in 2000 and at first, Microsoft seemed to participate and even implemented some of the proposals in their JScript.net language.
Over time it was clear though that Microsoft had no intention of cooperating or implementing proper JS in IE, even though they had no competing proposal and they had a partial (and diverged at this point) implementation on the .NET server side. So by 2003 the JS2/original-ES4 work was mothballed.
The next major event was in 2005, with two major happenings in JavaScript’s history. First, Brendan Eich and Mozilla rejoined Ecma as a not-for-profit member and work started on E4X, ECMA-357, which came from ex-Microsoft employees at BEA (originally acquired as Crossgain). This led to working jointly with Macromedia, who were implementing E4X in ActionScript 3 (ActionScript 3 was a fork of Waldemar's JS2/original-ES4 work).
So, along with Macromedia (later acquired by Adobe), work restarted on ECMAScript 4 with the goal of standardizing what was in AS3 and implementing it in SpiderMonkey. To this end, Adobe released the "AVM2", code named Tamarin, as an open source project. But Tamarin and AS3 were too different from web JavaScript to converge, as was realized by the parties in 2007 and 2008.
Alas, there was still turmoil between the various players; Doug Crockford — then at Yahoo! — joined forces with Microsoft in 2007 to oppose ECMAScript 4, which led to the ECMAScript 3.1 effort.
While all of this was happening, the open source and developer communities set to work to revolutionize what could be done with JavaScript. This community effort was sparked in 2005 when Jesse James Garrett released a white paper in which he coined the term Ajax and described a set of technologies, with JavaScript as the backbone, used to create web applications where data can be loaded in the background, avoiding the need for full page reloads and resulting in more dynamic applications. This resulted in a renaissance period of JavaScript usage spearheaded by open source libraries and the communities that formed around them, with libraries such as Prototype, jQuery, Dojo and MooTools being released.
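The core of the Ajax pattern Garrett described can be sketched in a few lines of JavaScript. The endpoint URL, the element id and the response shape below are hypothetical, invented for illustration; the response-handling logic is pulled out into a plain function:

```javascript
// The callback's core job: turn the server's JSON response into data the
// page can use. Kept separate from any browser API so it is easy to reuse.
function parseUpdate(jsonText) {
  return JSON.parse(jsonText).items;
}

// The classic pre-fetch() Ajax request, guarded so the snippet also loads
// outside a browser (XMLHttpRequest is a browser API).
if (typeof XMLHttpRequest !== "undefined") {
  var xhr = new XMLHttpRequest();
  xhr.open("GET", "/api/latest-items", true); // true = asynchronous
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      // Update one part of the page without a full reload.
      document.getElementById("item-list").textContent =
        parseUpdate(xhr.responseText).join(", ");
    }
  };
  xhr.send();
}
```

The point is the division of labour: the browser keeps rendering the current page while the request runs in the background, and only the callback touches the document.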
In July of 2008 the disparate parties on either side came together in Oslo. This led to the eventual agreement in early 2009 to rename ECMAScript 3.1 to ECMAScript 5 and drive the language forward using an agenda that is known as Harmony.
All of this then brings us to today, with JavaScript entering a completely new and exciting cycle of evolution, innovation and standardisation: new developments such as the Node.js platform allow us to use JavaScript on the server side, while HTML5 APIs let us control user media, open web sockets for always-on communication, get geographical location data and access device features such as the accelerometer. It is an exciting time to learn JavaScript.

The History of the Mughal Empire

June 06, 2014
After the great period of the Gupta Empire and the reign of the Sultanate of Delhi, India saw the emergence of its largest empire yet with the rise of Mughal rule in the country. The founder of this new state in India was Zahir-ud-din Muhammad Babur, a descendant of Genghis Khan and Timur the Lame. Babur had earlier been driven out of Central Asia by the Uzbeks, but he managed to gain control of Afghan territories and then set his eyes on India, the conquest of which would make him more powerful and richer.
In 1518 and 1524 he attacked India, and in 1525 he led a well-organized army towards Delhi. In the battle of Panipat, in 1526, he defeated Ibrahim Lodi, the last of the Delhi Sultans. The next year he defeated the Rajput ruler Rana Sanga at the battle of Khanwa.
In the Mughal dynasty he founded, six emperors were famous – Babur (1526–1530), Humayun (1530–1556), Akbar (1556–1605), Jehangir (1605–1627), Shah Jehan (1627–1658), and Aurangzeb (1658–1707). Of these, Akbar and Shah Jehan were two of the most important emperors in the history of India.

The History of Nokia

June 06, 2014

Nokia was originally founded as a paper manufacturer by Fredrik Idestam in 1865. After establishing a groundwood pulp mill in south-western Finland, Idestam in 1868 constructed a second mill in the nearby town of Nokia, which offered better resources for hydropower generation. In 1871 Idestam, along with close friend Leo Mechelin, transformed the firm into a share company, thereby founding the Nokia Company.
In the late 19th century Nokia added electricity generation to its business activities. After the establishment of the Finnish Cable Works in 1912, Nokia began to branch out into electronics in the 1960s. Having developed its first electronic device in 1962 (a pulse analyser for use in nuclear power plants), Nokia began development of radio telephones in 1963 for the army and emergency services, and by 1987 Nokia had become the third largest TV manufacturer in Europe.
In 1979 the company established the radio telephone company Mobira Oy as a joint venture with the Finnish TV maker Salora. Having established a firm business footing, Nokia helped launch the Nordic Mobile Telephone (NMT) service, the world’s first international cellular network. As the mobile phone industry expanded in the early 1980s, Nokia introduced its first car phone, the Mobira Senator, in 1982.
In 1987 Nokia introduced the Mobira Cityman, the first handheld mobile phone for NMT networks. Although the phone was both heavy at 800g and expensive at €4560, it was well received – and is now considered a classic – in large part thanks to Mikhail Gorbachev, who was photographed using one. Having established itself as a major player in the mobile phone industry, Nokia was well placed when GSM was adopted as the European standard for digital mobile technology.
Nokia launched the 2100 series in 1994, the first to feature the Nokia Tune ringtone, which went on to become one of the most frequently played and widely recognised pieces of music in the world. The 2100 series itself sold 20 million phones worldwide (Nokia’s target had been 400,000). In addition to the ringtone, Nokia in 1997 introduced the game Snake, which has since shipped on hundreds of millions of phones.
By 1998 Nokia had established itself as the world leader in mobile phone sales. Between 1996 and 2001 Nokia’s turnover increased almost fivefold, from €6.5bn to €31bn. The exploding worldwide demand for mobile phones through the 90s caused a major logistics crisis for many manufacturers; Nokia, however, became – and remains – renowned as the best in the industry at handling such logistics.
In 1999 Nokia released the Nokia 7110, capable of rudimentary web-based functions, including email. Further developments in mobile technology meant that in 2001 Nokia launched its first phone with a built-in camera (the Nokia 7650) and in 2002 its first video capture phone (the Nokia 3650). But it was in 2002, with Nokia’s first 3G phone (the Nokia 6650), that mobile technology experienced a radical shift. From here on, phones were able to browse the web, download music, watch TV and provide countless other services.

Nokia sold its billionth phone in 2005, the same period in which worldwide mobile phone subscriptions surpassed 2bn. In 2007 Nokia was internationally recognised as the fifth most valued brand in the world.
In both 2009 and 2010 the Dow Jones Indexes ranked Nokia as the world’s most sustainable technology company as it set about developing its business methods and strategies in accordance with new environmental standards.
In October 2009 Nokia posted its first quarterly loss in more than a decade, largely thought to be a repercussion of HTC releasing the first phone to use Google’s Android operating system, the HTC Dream (as of today 60 percent of mobile phones are powered by Android). After a year of struggling to keep pace with iPhone and Android devices, Nokia hired former Microsoft executive Stephen Elop as chief executive in September 2010.

In October of 2010 Elop outlined plans to make 1800 job cuts and to streamline Nokia’s smartphone operations. After conceding that Symbian could no longer compete, Nokia moved away from it and established a partnership with Microsoft to adopt its operating system.
Having spent 2010 onwards making thousands of job cuts and enduring the disappointing sales of its Lumia 800, Nokia was superseded by Samsung as the largest producer of mobile phones.
Nokia has more recently announced the new Lumia 920 as the flagship for Microsoft’s new operating system, and has signed a deal to sell and lease back what had been its headquarters for the past 16 years.

The History of Windows

June 06, 2014
It’s the 1970s. At work, we rely on typewriters. If we need to copy a document, we likely use a mimeograph or carbon paper. Few have heard of microcomputers, but two young computer enthusiasts, Bill Gates and Paul Allen, see that personal computing is a path to the future.
In 1975, Gates and Allen form a partnership called Microsoft. Like most start-ups, Microsoft begins small, but has a huge vision: a computer on every desktop and in every home. During the next years, Microsoft begins to change the ways we work.


The dawn of MS‑DOS

In June 1980, Gates and Allen hire Gates’ former Harvard classmate Steve Ballmer to help run the company. The next month, IBM approaches Microsoft about a project code-named "Chess." In response, Microsoft focuses on a new operating system—the software that manages, or runs, the computer hardware and also serves to bridge the gap between the computer hardware and programs, such as a word processor. It’s the foundation on which computer programs can run. They name their new operating system "MS‑DOS."
When the IBM PC running MS‑DOS ships in 1981, it introduces a whole new language to the general public. Typing “C:” and various cryptic commands gradually becomes part of daily work. People discover the backslash (\) key.
MS‑DOS, which stands for Microsoft Disk Operating System, is effective, but also proves difficult for many people to understand. There has to be a better way to build an operating system.
