Wednesday, September 9, 2009

Why typewriters beat computers

By Neil Hallows
They're clunky, dirty and can't access the internet, yet every year thousands of people buy typewriters when they could probably afford a computer. Why?
When asked how he writes, Frederick Forsyth has a simple answer. "With a typewriter."
He admits this is to avoid the more difficult business of describing his creative process, but it also means he can celebrate old friends.
There was the steel-cased portable he used as a foreign correspondent in the 1960s. "It had a crease across the lid which was done by a bullet in Biafra. It just kept tapping away. It didn't need power, it didn't need batteries, it didn't need recharging. One ribbon went back and forward and back until it was a rag, almost, and out came the dispatches."
"Then the blooming thing blew up and they told me that it was my fault, and it wasn't, it just burnt out" - Maureen Huggins

And after 50 years and a dozen novels including The Day of the Jackal, why change now, he asks.
"I have never had an accident where I have pressed a button and accidentally sent seven chapters into cyberspace, never to be seen again. And have you ever tried to hack into my typewriter? It is very secure."
Although he laughs as he says it, Mr Forsyth identifies the continuing attraction of a typewriter for thousands of people. They find a computer distracting, unreliable or just plain terrifying, and they have a love for the tangible. As he puts it, "I like to see black words on white paper rolling up in front of my gaze".
Mr Forsyth's novels are so popular that he could write them in the sand and publishers would still queue up for his business. But who else is still pounding rather than pressing their keyboard?
Just as quick
The Japanese multinational Brother sold 12,000 electronic typewriters last year in the UK, which is its biggest market in Europe.

Unruffled by the threat of disk defragmentation errors
Brother UK sales director Phil Jones says customers are generally older people, although his company also sells a number to students. He says typewriters remain a cheap way to develop keyboard skills. The most basic model costs around £80.
"Typewriters are much more straightforward to use than computers as they only have one function - typing," says Mr Jones.
And typing is the only thing that Maureen Huggins wants her machinery to do.
A reporter for 52 years, she uses a manual typewriter for her work at the Norfolk Courts Press Agency. Stories are then faxed to newspapers and broadcasters.
Mrs Huggins tried using a computer about 15 years ago and the memory is still raw. "I had four pages of instructions I had to learn, to send [my previous employers] the stories. Then the blooming thing blew up and they told me that it was my fault, and it wasn't, it just burnt out."
She says she can produce her stories at least as quickly as her rivals, because the risk of technical failure is virtually nil - she keeps a spare typewriter at hand - and because the typewriter encourages her to get the story right first time.
This may sound like an impossibly Spartan ideal, where cut and paste is done with scissors and glue, and deleted words remain on the page as angry little blobs. But for some left jaded and distracted by their smarty-pants computers, it is tempting.
The writer Will Self is a convert. He went back to using a manual typewriter several years ago. "I think the computer user does their thinking on the screen, and the non-computer user is compelled, because he or she has to retype a whole text, to do a lot more thinking in the head," he said in a recent interview.
That doesn't necessarily mean that Royals and Underwoods are elbowing PCs and Macs off the desk with their jabby little carriage return levers. But even for the technologically savvy, they have their uses.
Green screen style
Richard Polt, a philosophy professor at Xavier University in Cincinnati, Ohio, collects old typewriters but is sufficiently computer literate to run an attractive website devoted to them. He remembers struggling over a book he was writing.

The green screen look - almost as retro, and available to download
"There are so many distractions with the internet, it is also so easy to change and delete what you have written. It is too easy to dither."
So he turned to one of his 175 old typewriters.
"I didn't compose most of the book on a typewriter, but every once in a while I would put out a few pages on a typewriter, a first draft, and it was kind of refreshing."
He knows of others, especially in the early stages of creative writing, who have benefited in a similar way.
This can be achieved without a typewriter. The right software can turn the flashiest computer display into a technological boot camp - an early 80s green screen stripped of dancing paper clips and easy escape routes to the internet.
Useful, perhaps, but not beautiful. Turning a computer into a prop from Ashes to Ashes will never have the aesthetic charm of a Remington Noiseless. It is this historical, emotional pull which draws a particular kind of student or aspiring writer to the typewriter.
I don't know why, but they usually seem to be men, and their heroes are hard, brilliant men from the last century. Posing on their blogs with an antique machine, all that separates them from Hemingway are two dozen cocktails and his ability to write.
Car boot sales
If there really is a move back to typewriters, it probably won't come in time to save what is left of the market.
Brother UK's Mr Jones admits he is "surprised" that people are still buying typewriters, and "amazed" his company sells a handful for more than £500, which would buy a laptop. Typewriter sales are falling 10% a year at the company, which is better known for its printers, faxes and sewing equipment.

Often overlooked - the typewriter's place in women's liberation
Perhaps more surprisingly, demand in developing countries is also falling sharply. Godrej Industries, an engineering and consumer products conglomerate, owns the last manual typewriter factory in India.
Senior general manager Sorab Barekh says two-thirds are exported, with various typefaces, to Africa and South East Asia. They tend to be used by remote government outposts which have a poor electricity supply. Sales are falling so fast that more than half a century of production might cease within three years, he says.
But for a long time yet, anyone who wants a typewriter will be able to pick one up at a car boot sale for roughly the price of a replacement ribbon.
And even when the last carriage return rings, their legacy will go on. The "qwerty" keyboard was devised as a means of keeping commonly recurring sequences of letters like "th" and "an" apart so typewriter keys did not stick.
There is argument about whether a different configuration of letters would enable faster typing, but an educationalist who tried to change it in the last century likened the challenge to reversing the Ten Commandments.
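The separation idea is easy to poke at with a few lines of code. This sketch (an illustration only - jamming actually depended on the arrangement of type bars inside the machine, for which keyboard rows are just a rough proxy) checks whether the letters of some common English bigrams sit on the same QWERTY row:

```python
# Where do the letters of common English bigrams sit on the QWERTY layout?
# Note: jamming was really about neighbouring type bars, so keyboard rows
# are only a rough proxy for how "separated" a pair of letters is.

ROWS = ["qwertyuiop", "asdfghjkl", "zxcvbnm"]

def position(letter):
    """Return (row, column) of a letter on the QWERTY letter rows."""
    for row, keys in enumerate(ROWS):
        if letter in keys:
            return row, keys.index(letter)
    raise ValueError(letter)

for bigram in ["th", "an", "he", "in", "er"]:
    (r1, _), (r2, _) = position(bigram[0]), position(bigram[1])
    print(bigram, "same row" if r1 == r2 else "different rows")
```

Most of the frequent pairs do land on different rows, though "er", one of the commonest English bigrams, sits on two adjacent top-row keys - an often-cited counterexample in arguments over how deliberate the layout really was.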
The biggest inheritance from the typewriter, however, is the fact that lots of people reading this at work will be women. Typing classes mushroomed at the end of the 19th Century, and this helped many women to enter paid work for the first time. By 1901, Britain had 166,000 female clerks, up from 2,000 half a century before.
It was a limited emancipation. The new employees (often called "type-writers" themselves) were accused of stealing jobs from men, depressing wages and sexually tempting the boss, and their chance of career progression was often nil. But for women to have any job outside the home was revolutionary.
So while the pen may be mightier than the sword, the typewriter was once mightier than both.

NOD32 Keys


September 15, 2009

UserName:EAV-21141770
Password:u8jst8evdn

UserName:EAV-21141788
Password:ubdke4vbj5

UserName:EAV-21114882
Password:s6rttmstuu

Username:EAV-20650833
Password:ntxhmrjsch

Username:EAV-20650854
Password:253f8d3d8d

Username:EAV-18065729
Password:nh522sudks


September 09, 2009

Username:EAV-20831172
Password:aea6jnxu8j
Username:EAV-20806371
Password:53epdv4mtx
Username:EAV-21247677
Password:hrvfkxk3sd
Username:EAV-20830846
Password:p45sbe3ax3


September 08, 2009

Username:EAV-21247677
Password:hrvfkxk3sd
Username:EAV-20830846
Password:p45sbe3ax3
UserName:EAV-21097603
Password:bc7j4n2rj5
UserName:EAV-20977108
Password:uetk8tmx7e

Visit daily for daily new keys.

Monday, August 25, 2008

The first hard disk - 50 years ago

In 1956 IBM invented the first computer disk storage system, the 305 RAMAC (Random Access Method of Accounting and Control). It stored 5 MB across fifty disks, each 61 cm (24″) in diameter.
It wasn’t necessarily a very portable device, so maybe we should stop complaining about those ‘only 120 GB’ laptop drives.
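The comparison is easy to make concrete. A quick back-of-the-envelope calculation using the figures above (and taking 1 GB as 1024 MB):

```python
# How many RAMACs' worth of storage fits on a 120 GB laptop drive?
ramac_mb = 5            # total capacity of the 305 RAMAC
ramac_platters = 50     # number of 24-inch disks

laptop_mb = 120 * 1024  # 120 GB laptop drive, in MB

print(laptop_mb / ramac_mb)        # RAMAC-equivalents on one laptop drive
print(ramac_mb / ramac_platters)   # MB stored per 24-inch platter
```

That is roughly 24,576 RAMACs in one laptop drive, with each of the RAMAC's 24-inch platters holding just a tenth of a megabyte.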

When was the first computer invented?

Unfortunately this question has no easy answer, because of all the different classifications and types of computers. This document therefore lists each of the first computers, starting with the first programmable computer and leading up to the computers of today. Keep in mind that early inventions such as the abacus, mechanical calculators, tabulating machines, the difference engine and the like are not covered in this document.
First programmable computer
The Z1, created by Germany's Konrad Zuse in his parents' living room between 1936 and 1938, is considered to be the first electromechanical binary programmable computer.
See our Z1 dictionary definition for additional information about this computer.
The first digital computer
Short for Atanasoff-Berry Computer, the ABC was developed by Professor John Vincent Atanasoff and graduate student Cliff Berry at Iowa State College (now Iowa State University) from 1937 until 1942. On October 19, 1973, US Federal Judge Earl R. Larson signed his decision that the ENIAC patent of Eckert and Mauchly was invalid, and named Atanasoff the inventor of the electronic digital computer.
See our ABC dictionary definition for additional information about this computer.
The ENIAC was built by J. Presper Eckert and John Mauchly at the University of Pennsylvania; construction began in 1943 and was not completed until 1946. It occupied about 1,800 square feet, used about 18,000 vacuum tubes and weighed almost 50 tons. Although the judge ruled that the ABC was the first digital computer, many still consider the ENIAC to hold that title.
See our ENIAC dictionary definition for additional information about this computer.
Because of the judge's ruling, and because the case was never appealed, we, like most, consider the ABC to be the first digital computer. However, because the ABC was never fully functional, we consider the first functional digital computer to be the ENIAC.
The first stored program computer
The early British computer known as the EDSAC is considered to be the first stored-program electronic computer. It performed its first calculation on May 6, 1949, and was the computer that ran the first graphical computer game.
See our EDSAC dictionary definition for additional information about this computer.
The first personal computer
In 1975 Ed Roberts coined the term "personal computer" when he introduced the Altair 8800, although the first personal computer is generally considered to be the Kenbak-1, first introduced for $750 in 1971. The Kenbak-1 relied on a series of switches for inputting data and output its data by turning a series of lights on and off.
The Micral is considered to be the first commercial non-assembly computer. It used the Intel 8008 processor and sold for $1,750 in 1973.
The first workstation
Although it was never sold, the first workstation is considered to be the Xerox Alto, introduced in 1974. The Alto was revolutionary for its time: it combined a fully functional computer with a display and a mouse, and it operated much like many computers today, using windows, menus and icons as the interface to its operating system.
The first laptop or portable computer
The first portable computer or laptop is considered to be the Osborne I, developed by Adam Osborne and released in 1981. It weighed 24 pounds and had a 5-inch display, 64 KB of memory, two 5 1/4" floppy drives and a modem.
IBM PCD later released the IBM Portable in 1984, its first portable computer, which weighed in at 30 pounds. In 1986 IBM PCD announced its first laptop computer, the PC Convertible, weighing 12 pounds, and in 1994 it introduced the IBM ThinkPad 775CD, the first notebook with an integrated CD-ROM.
The first PC (IBM compatible) computer
In 1953 IBM shipped its first electronic computer, the 701. Later, in 1981, IBM introduced its first personal computer, the "IBM PC". The computer was code-named the "Acorn" (a name still sometimes used); it had an 8088 processor and 16 KB of memory, expandable to 256 KB, and ran MS-DOS.
The first PC clone
The first PC clone was developed by Compaq: the "Compaq Portable", released in March 1983, was 100% compatible with IBM computers and the software that ran on them.
See the "Other major computer company firsts" section below for other IBM-compatible computers.
The first Apple computer
Steve Wozniak designed the first Apple computer, known as the Apple I, in 1976.
The first computer company
The first computer company was the Electronic Controls Company and was founded in 1949 by J. Presper Eckert and John Mauchly, the same individuals who helped create the ENIAC computer. The company was later renamed to EMCC or Eckert-Mauchly Computer Corporation and released a series of mainframe computers under the UNIVAC name.
The first multimedia computer
In 1992 Tandy Radio Shack became one of the first companies to release a computer based on the MPC standard, with its introduction of the M2500 XL/2 and M4020 SX computers.
Other major computer company firsts
Below is a listing of some major computer companies' first computers.
Compaq - In March 1983 Compaq released its first computer, which was also the first 100% IBM-compatible computer, the "Compaq Portable."
Digital - In 1960 Digital Equipment Corporation released the first of its many PDP computers, the "PDP-1."
Dell - In 1985 Dell introduced its first computer, the "Turbo PC."
Hewlett Packard - In 1966 Hewlett Packard released its first general computer, the "HP-2115."
NEC - In 1958 NEC built its first computer, the "NEAC 1101."
Toshiba - In 1954 Toshiba introduced its first computer, the "TAC" digital computer.
Additional information:
See our computer dictionary definition for additional information about computers as well as related links and information.

The First Generation Computers

Do you remember this computer?
It is the Bendix G-15 General Purpose Digital Computer, a First Generation computer introduced in 1956.
Why this interest in the Bendix G-15?
Against the odds, the Western Australian branch of The Australian Computer Museum Inc has rescued one from the scrap heap. That's it, over on the right.
It is in pretty good condition, considering its age, and we hope one day we can get it working again. We also have various programming, operating and technical manuals, and schematics. They have been scanned and you can download them here.
This web site started life in 1998 as a sort of begging letter, seeking more information about the maintenance procedures. We have since been told that there was no formal maintenance manual and that our documentation is complete so far as maintaining the machine is concerned. Still, if you can help with some of the other items we are missing, or can add anything at all to our store of knowledge about the Bendix G-15, we would be delighted to hear from you.
First Generation Computers.
The first generation of computers is said by some to have started in 1946 with ENIAC, the first 'computer' to use electronic valves (i.e. vacuum tubes). Others would say it started in May 1949 with the introduction of EDSAC, the first stored program computer. Either way, the distinguishing feature of the first generation computers was the use of electronic valves.
My personal take on this is that ENIAC was the World's first electronic calculator and that the era of the first generation computers began in 1946 because that was the year when people consciously set out to build stored program computers (many won't agree, and I don't intend to debate it). The first past the post, as it were, was the EDSAC in 1949. The period closed about 1958 with the introduction of transistors and the general adoption of ferrite core memories.
OECD figures indicate that by the end of 1958 about 2,500 first generation computers were installed world-wide. (Compare this with the number of PCs shipped world-wide in 1997, quoted as 82 million by Dataquest).
Two key events took place in the summer of 1946 at the Moore School of Electrical Engineering at the University of Pennsylvania. One was the completion of the ENIAC. The other was the delivery of a course of lectures on "The Theory and Techniques of Electronic Digital Computers". In particular, they described the need to store the instructions to manipulate data in the computer along with the data. The design features worked out by John von Neumann and his colleagues and described in these lectures laid the foundation for the development of the first generation of computers. That just left the technical problems!
One of the projects to commence in 1946 was the construction of the IAS computer at the Institute for Advanced Study at Princeton. The IAS computer used a random access electrostatic storage system and parallel binary arithmetic. It was very fast when compared with the delay line computers, with their sequential memories and serial arithmetic.
The Princeton group was liberal with information about their computer and before long many universities around the world were building their own, close copies. One of these was the SILLIAC at Sydney University in Australia.
I have written an emulator for SILLIAC. You can find it here, along with a link to a copy of the SILLIAC Programming Manual.
First Generation Technologies
In 1946 there was no 'best' way of storing instructions and data in a computer memory. There were four competing technologies for providing computer memory: electrostatic storage tubes, acoustic delay lines (mercury or nickel), magnetic drums (and disks?), and magnetic core storage.
A high-speed electrostatic store was the heart of several early computers, including the computer at the Institute for Advanced Study in Princeton. Professor F. C. Williams and Dr. T. Kilburn, who invented this type of store, described it in Proc.I.E.E. 96, Pt.III, 40 (March, 1949). A simple account of the Williams tube is given here.
The great advantage of this type of "memory" is that, by suitably controlling the deflector plates of the cathode ray tube, it is possible to redirect the beam almost instantaneously to any part of the screen: random access memory.
Acoustic delay lines are based on the principle that electricity travels at the speed of light while mechanical vibrations travel at about the speed of sound. So data can be stored as a string of mechanical pulses circulating in a loop, through a delay line with its output connected electrically back to its input. Of course, converting electric pulses to mechanical pulses and back again uses up energy, and travel through the delay line distorts the pulses, so the output has to be amplified and reshaped before it is fed back to the start of the tube.
The sequence of bits flowing through the delay line is just a continuously repeating stream of pulses and spaces, so a separate source of regular clock pulses is needed to determine the boundaries between words in the stream and to regulate the use of the stream.
Delay lines have some obvious drawbacks. One is that the match between their length and the speed of the pulses is critical, yet both are dependent on temperature. This required precision engineering on the one hand and careful temperature control on the other. Another is a programming consideration. The data is available only at the instant it leaves the delay line. If it is not used then, it is not available again until all the other pulses have made their way through the line. This made for very entertaining programming!
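The "available only at the instant it leaves the line" behaviour is easy to model. This toy sketch (a hypothetical simulation, not any real machine's timing) treats the delay line as a fixed-length loop of words that recirculates one word per clock tick:

```python
from collections import deque

class DelayLine:
    """Toy model of an acoustic delay line: a fixed loop of circulating words."""

    def __init__(self, contents):
        self.line = deque(contents)

    def tick(self):
        """One word period: the word emerging from the line is readable now,
        then is reshaped and fed back to the start."""
        word = self.line.popleft()
        self.line.append(word)
        return word

line = DelayLine([10, 20, 30, 40, 50])

# To read the word 40 we must wait for it to emerge from the line.
waited = 0
while line.tick() != 40:
    waited += 1
print(waited)  # words that streamed past before the one we wanted
```

A programmer who used a word the moment it emerged paid no waiting time; miss it, and the whole loop had to come round again - exactly the programming chore described above.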
A mercury delay line is a tube filled with mercury, with a piezo-electric crystal at each end. Piezo-electric crystals, such as quartz, have the special property that they expand or contract when the electrical voltage across the crystal faces is changed. Conversely, they generate a change in electrical voltage when they are deformed. So when a series of electrical pulses representing binary data is applied to the transmitting crystal at one end of the mercury tube, it is transformed into corresponding mechanical pressure waves. The waves travel through the mercury until they hit the receiving crystal at the far end of the tube, where the crystal transforms the mechanical vibrations back into the original electrical pulses.
Mercury delay lines had been developed for data storage in radar applications. Although far from ideal, they were an available form of computer memory around which a computer could be designed. Computers using mercury delay lines included the ACE computer developed at the National Physical Laboratory, Teddington, and its successor, the English Electric DEUCE.
A good deal of information about DEUCE (manuals, operating instructions, program and subroutine codes and so on) is available on the Web and you can find links to it here.
Nickel delay lines take the form of a nickel wire. Pulses of current representing bits of data are passed through a coil surrounding one end of the wire. They set up pulses of mechanical stress due to the 'magnetostrictive' effect. A receiving coil at the other end of the wire is used to convert these pressure waves back into electrical pulses. The Elliott 400 series, including the 401, 402 and 403, used nickel delay lines. Much later, in 1966, the Olivetti Programma 101 desktop calculator also used nickel delay lines.
The magnetic drum is a more familiar technology, comparable with modern magnetic discs. It consisted of a non-magnetic cylinder coated with a magnetic material, and an array of read/write heads to provide a set of parallel tracks of data round the circumference of the cylinder as it rotated. Drums had the same program optimisation problem as delay lines.
Two of the most (commercially) successful computers of the time, the IBM 650 and the Bendix G-15, used magnetic drums as their main memory.
The Massachusetts Institute of Technology Whirlwind I was another early computer; building started in 1947. However, the most important contribution made by the MIT group was the development of the magnetic core memory, which they later installed in Whirlwind. The MIT group made their core memory designs available to the computer industry, and core memories rapidly superseded the other three memory technologies.
Where Does the Bendix G-15 Fit In?
Table 1 shows, in chronological order between 1950 and 1958, the initial operating date of computing systems in the USA. This is not to suggest that all of these computers were first generation computers, or that no first generation computers were made after 1958. It does give a rough guide to the number of first generation computers made.
Bendix introduced their G-15 in 1956. It was not the first Bendix computing machine. They introduced a model named the D-12, in 1954. However, the D-12 was a digital differential analyser and not a general purpose computer.
We don't know when the last Bendix G-15 was built, but about three hundred of the computers were ultimately installed in the USA. Three found their way to Australia. The one we have was purchased by the Department of Main Roads in Perth in 1962. It was used in the design of the Mitchell Freeway, the main road connecting the Northern suburbs to the city.
The G-15 was superseded by the second generation (transistorised) Bendix G-20.
Table 2 shows the computers installed or on order, in Australia, about December 1962. The three Bendix G-15s were in Perth (Department of Main Roads), Sydney (A.W.A. Service Bureau) and Melbourne (E.D.P Pty Ltd).
Overview of the G-15
The Bendix G-15 was a fairly sophisticated, medium size computer for its day. It used a magnetic drum for internal memory storage and had 180 tube packages and 300 germanium diode packages for logical circuitry. Cooling was by internal forced air.
Storage on the Magnetic Drum comprised 2160 words in twenty channels of 108 words each. Average access time was 4.5 milliseconds. In addition, there were 16 words of fast-access storage in four channels of 4 words each, with average access time of 0.54 milliseconds; and eight words in registers consisting of 1 one-word command register, 1 one-word arithmetic register, and 3 two-word arithmetic registers for double-precision operations.
A 108-word buffer channel on the magnetic drum allowed input-output to proceed simultaneously with computation.
Word size was 29 bits, allowing single-precision numbers of seven decimal digits plus sign during input-output (twenty-nine binary digits internally), and double-precision numbers of fourteen decimal digits plus sign during input-output (fifty-eight binary digits internally).
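Those decimal and binary figures are mutually consistent, which a couple of lines can confirm: seven decimal digits need at most 24 bits of magnitude and fourteen need 47, both fitting comfortably once one bit of each word is reserved for the sign.

```python
import math

def bits_needed(decimal_digits):
    """Bits required to hold any unsigned number with this many decimal digits."""
    return math.ceil(math.log2(10 ** decimal_digits))

print(bits_needed(7))    # -> 24, fits the 28 magnitude bits of a single word
print(bits_needed(14))   # -> 47, fits the 57 magnitude bits of a word pair
```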
Each machine language instruction specified the address of the operand and the address of the next instruction. Double-length arithmetic registers permitted the programming of double-precision operations with the same ease as single-precision ones.
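This "one-plus-one" address scheme, where every instruction names both its operand and its successor, let the programmer scatter code around the drum so that the next instruction arrived under the read head just as it was needed. A minimal sketch of the idea (the instruction format here is invented for illustration, not the real G-15 encoding):

```python
# Each instruction: (operation, operand address, address of next instruction).
memory = {0: 10, 1: 32}          # data words on the drum

program = {
    100: ("load", 0, 107),       # acc <- mem[0]; next instruction at 107
    107: ("add", 1, 103),        # acc <- acc + mem[1]; next at 103
    103: ("halt", None, None),
}

acc, pc = 0, 100
while True:
    op, operand, nxt = program[pc]
    if op == "load":
        acc = memory[operand]
    elif op == "add":
        acc += memory[operand]
    else:                        # halt
        break
    pc = nxt

print(acc)  # -> 42
```

The non-sequential addresses (100, 107, 103) are the point: on a drum, "next" is wherever a word will pass under the head soonest, not the following location.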
An interpreter called Intercom 1000 and a compiler called Algo provided simpler alternatives to machine language programming. Algo followed the principles set forth in the international algorithmic language, Algol, and permitted the programmer to state a problem in algebraic form. The Bendix Corporation claimed to be the first manufacturer to introduce a programming system patterned on Algol.
The basic computation times, in milliseconds, were as follows (including the time required for the computer to read the command prior to its execution). The time range for multiplication and division represents the range between single decimal digit precision and maximum precision.
Operation                     Single-Precision    Double-Precision
Addition or Subtraction       0.54                0.81
Multiplication or Division    2.43 to 16.7        2.43 to 33.1
External storage was provided on searchable paper tape (2,500 words per magazine) and, optionally, on one to four magnetic tape units with 300,000 words per tape reel.
More detail about the Bendix G-15 General Purpose Digital Computer.

AVIDAC, Argonne's first digital computer, began operation in January 1953. It was built by the Physics Division for $250,000. Pictured is pioneer Argonne computer scientist Jean F. Hall.

Generations of Computers

The history of computer development is often referred to in reference to the different generations of computing devices. Each generation of computer is characterized by a major technological development that fundamentally changed the way computers operate, resulting in increasingly smaller, cheaper, more powerful and more efficient and reliable devices. Read about each generation and the developments that led to the current devices that we use today.
First Generation - 1940-1956: Vacuum Tubes
The first computers used vacuum tubes for circuitry and magnetic drums for memory, and were often enormous, taking up entire rooms. They were very expensive to operate and in addition to using a great deal of electricity, generated a lot of heat, which was often the cause of malfunctions. First generation computers relied on machine language to perform operations, and they could only solve one problem at a time. Input was based on punched cards and paper tape, and output was displayed on printouts.
The UNIVAC and ENIAC computers are examples of first-generation computing devices. The UNIVAC was the first commercial computer; it was delivered to its first customer, the U.S. Census Bureau, in 1951.
Second Generation - 1956-1963: Transistors
Transistors replaced vacuum tubes and ushered in the second generation of computers. The transistor was invented in 1947 but did not see widespread use in computers until the late 50s. The transistor was far superior to the vacuum tube, allowing computers to become smaller, faster, cheaper, more energy-efficient and more reliable than their first-generation predecessors. Though the transistor still generated a great deal of heat that subjected the computer to damage, it was a vast improvement over the vacuum tube. Second-generation computers still relied on punched cards for input and printouts for output.
Second-generation computers moved from cryptic binary machine language to symbolic, or assembly, languages, which allowed programmers to specify instructions in words. High-level programming languages were also being developed at this time, such as early versions of COBOL and FORTRAN. These were also the first computers that stored their instructions in their memory, which moved from a magnetic drum to magnetic core technology.
The first computers of this generation were developed for the atomic energy industry.
Third Generation - 1964-1971: Integrated Circuits
The development of the integrated circuit was the hallmark of the third generation of computers. Transistors were miniaturized and placed on silicon chips (a semiconductor material), which drastically increased the speed and efficiency of computers.
Instead of punched cards and printouts, users interacted with third generation computers through keyboards and monitors and interfaced with an operating system, which allowed the device to run many different applications at one time with a central program that monitored the memory. Computers for the first time became accessible to a mass audience because they were smaller and cheaper than their predecessors.
Fourth Generation - 1971-Present: Microprocessors
The microprocessor brought the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. What in the first generation filled an entire room could now fit in the palm of the hand. The Intel 4004 chip, developed in 1971, located all the components of the computer - from the central processing unit and memory to input/output controls - on a single chip.
In 1981 IBM introduced its first computer for the home user, and in 1984 Apple introduced the Macintosh. Microprocessors also moved out of the realm of desktop computers and into many areas of life as more and more everyday products began to use microprocessors.
As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet. Fourth generation computers also saw the development of GUIs, the mouse and handheld devices.
Fifth Generation - Present and Beyond: Artificial Intelligence
Fifth generation computing devices, based on artificial intelligence, are still in development, though there are some applications, such as voice recognition, that are being used today. The use of parallel processing and superconductors is helping to make artificial intelligence a reality. Quantum computation and molecular and nanotechnology will radically change the face of computers in years to come. The goal of fifth-generation computing is to develop devices that respond to natural language input and are capable of learning and self-organization.

Thursday, August 21, 2008

Greetings

I have come to you with this new blog. I hope you will support it with the same love and affection you gave my other blog. On this blog I will try to present new and old facts related to computers. Thank you.