
An Introduction to Computer Electronic Data Processing

Computer electronic data processing uses computers in place of manual labour to handle an organisation's routine data and to produce the reports that support its day-to-day operations. The following introduction to computer electronic data processing is offered in the hope that it will be helpful.

  History

The first commercial business computer was developed in the United Kingdom in 1951 by the Joe Lyons catering organisation. It was known as the 'Lyons Electronic Office', or LEO for short, and was developed further and used widely during the 1960s and early 1970s. (Lyons formed a separate company to develop the LEO computers; this subsequently merged to form English Electric Leo Marconi and later International Computers Ltd.)

Early commercial systems were installed exclusively by large organisations. These could afford to invest the time and capital necessary to purchase hardware, hire specialist staff to develop bespoke software and work through the consequent (and often unexpected) organisational and cultural changes.

At first, individual organisations developed their own software, including data management utilities. Different products might also each have their own 'one-off' bespoke software. This fragmented approach duplicated effort, and producing management information required manual work.

High hardware costs and relatively slow processing speeds forced developers to use resources 'efficiently'. Data storage formats were heavily compacted; a common example is the removal of the century from dates, which eventually led to the 'millennium bug'.
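To see why dropping the century caused trouble, here is a minimal Python sketch (an illustration, not part of the original article): once only two digits of the year are stored, date comparisons across the century boundary give the wrong answer.

```python
from datetime import date

def stored_year(d: date) -> int:
    """Compact storage as in many early systems: keep only the last two digits of the year."""
    return d.year % 100

issued  = stored_year(date(1985, 6, 1))   # stored as 85
renewal = stored_year(date(2005, 6, 1))   # stored as 5

# Comparing the compacted values reverses the true chronological order.
print(issued < renewal)   # False, even though 1985 is earlier than 2005
```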

Data input required intermediate processing via punched paper tape or card and separate input to computers, usually for overnight processing. Data had to be validated in batches. All of this was repetitive, labour-intensive, removed from user control and error-prone. Invalid or incorrect data needed correction and resubmission, with consequences for data and account reconciliation.

Data storage was strictly serial, on magnetic tape: keeping data in readily accessible memory was not cost-effective.

Results were presented to users on paper. Enquiries were delayed by whatever turnaround time was available.

  Today

As with other industrial processes, commercial IT has moved in all respects from a bespoke, craft-based industry, where the product was tailored to fit the customer, to multi-use components taken off the shelf to find the best fit in any situation. Mass production has greatly reduced costs, and IT is now available to the smallest company, one-man band or school-kid.

LEO was hardware tailored for a single client. Today, Intel Pentium and compatible chips are standard and are built into other components that are combined as needed. One notable change was the freeing of computers and removable storage from protected, air-filtered environments. At various times Microsoft and IBM have been influential enough to impose order on IT, and the resulting standardisation allowed specialist software to flourish.

Software is available off the shelf: apart from packages such as Microsoft Office or Lotus, there are also specialist packages for payroll and personnel management, account maintenance and customer management, to name a few. These are highly specialised and intricate components of larger environments, but they rely upon common conventions and interfaces.

Data storage has also been standardised. Relational databases from different suppliers follow common formats and conventions. Common file formats can be shared by large mainframes and desktop personal computers, allowing online, real-time input and validation.
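A short sketch of what online, real-time validation looks like in practice, in contrast to the overnight batch-correction cycle described above. The table name and constraints here are purely illustrative assumptions, not taken from the article; any relational database with integrity constraints behaves similarly.

```python
import sqlite3

# An in-memory relational database with simple integrity rules.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE orders (
        order_id   INTEGER PRIMARY KEY,
        quantity   INTEGER NOT NULL CHECK (quantity > 0),
        unit_price REAL    NOT NULL CHECK (unit_price >= 0)
    )
""")

# Valid input is accepted the moment it is entered...
conn.execute("INSERT INTO orders (quantity, unit_price) VALUES (?, ?)", (3, 9.99))

# ...while invalid input is rejected immediately, rather than being discovered
# in an overnight batch run and sent back for correction and resubmission.
try:
    conn.execute("INSERT INTO orders (quantity, unit_price) VALUES (?, ?)", (-1, 9.99))
except sqlite3.IntegrityError as err:
    print("rejected:", err)
```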

In parallel, software development has fragmented. There are still specialist technicians, but they increasingly use standardised methodologies whose outcomes are predictable and accessible. At the other end of the scale, any office manager can dabble in spreadsheets or databases and obtain acceptable results (though there are risks).