Year Part Of A Date example essay topic

1,192 words
Almost everyone has heard the term "Y2K", but most have no clue what it means. This research paper will explain what it means, what is to blame, and the possible effects the problem could have in the future. Although the issue seems simple, even pointless, on the surface, once it is examined you can discover its many separate causes and implications, and how widespread the problem actually is. The genuine meaning of the Year 2000 Bug, also known as Y2K (the "Y" representing "Year", the "K" standing for thousand) or the Millennium Time Bomb, is simple to understand. It represents a computer's inability to differentiate between the years 2000 and 1900. It can be explained like this: when the year part of a date is expressed using two digits, as has long been the standard in computing, the possible year values range from 00 to 99.

When humans add 1 to 99, they get 100. However, if you tell a computer to do the same thing in the space of two digits, it will come up with 0, or 00. If you take the date 3/19/99 and tell the computer to add 1 to the year part, the result will look like 3/19/00. To us humans, that date suggests the year 2000, but to the computer the year is interpreted as 1900. This is where the problem starts. The first question many ask when first learning about the Year 2000 problem is, "How was this allowed to happen?" The thought that so much damage could be done, by so many people, over such a long period of time, and go so undetected, is hard to believe.
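The rollover described above can be sketched in a few lines of Python. This is a modern illustration of the arithmetic, not code from any actual legacy system; the function name is invented for the example:

```python
# Sketch of two-digit year storage: only the last two digits of the
# year fit in the field, so adding a year wraps around at 100.

def add_one_year(yy: int) -> int:
    """Add one year to a two-digit year field, as old systems did."""
    return (yy + 1) % 100  # 99 rolls over to 0, not 100

print(add_one_year(98))  # 99 -- read back as 1999, as expected
print(add_one_year(99))  # 0  -- read back as 1900, not 2000
```

The `% 100` is the whole bug in miniature: the century is simply not representable in the stored field.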

And there is no single answer to that question; it's a combination of several business and technological problems. Back when hard drive space was expensive, in the 60's and 70's, it was important to cut down on the amount of data an application stored. Character storage space was cut, as well as numeric storage space, and dates were cut too. At the time it was considered uneconomical to store the full four digits of a year when only the two-digit year part was actually needed. Programs were able to continue processing dates as normal without ever understanding the concept of a century.

Before long, the storage of dates with only two digits for the year was adopted as the standard throughout the industry and was built into the lines of code necessary to run applications. At the time, some people would have raised doubts about the long-term effect of this space-saving solution, but they were probably told that their systems would not be in place for more than a few years. How were they to know that 20 years later these systems would still be in place? Software generally runs forever unless something stops it: either it is turned off, or its operating environment changes so that it can no longer operate correctly, as when a date passes its specified range. Even if its environment changes, an application might ignore the error and keep processing wrong information until it is stopped. Even as storage space became less expensive and there was no longer any need to write a two-digit year, programs were still written in this format.

This was partly out of habit, and partly out of the need for compatibility with existing applications. Also, when we think about it, when is it ever necessary to write the full four-digit year? Here is where laziness was involved. Cheques, application forms, and the like don't require writing the century, so why would the earlier programmers type in the full date when they could get away with only two digits? For the same reason, in the 70's some applications were written with only one digit to represent the year. Of course, these had to be rewritten with the turning of the decade, as the US Air Force found out in 1979.

Back then the change could be performed by a few programmers, because computing was not as prevalent as it is today. The inability to correctly tell the date may seem insignificant, but those two digits may be the start of the largest and most costly exercise ever undertaken by any industry in the world. One reasonable cost estimate is about $1 to fix each line of code; with 600 billion lines of code to be fixed, you get a rough idea of the money involved. Other estimates have run as high as one trillion dollars. An example of what could happen due to the date malfunction is how a finance company might withdraw money from a client's account over a five-year period, to pay back a loan.
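A minimal Python sketch shows how such a loan schedule could be misread. The dates and the "widen" rule are hypothetical, chosen only to illustrate the two-digit comparison going wrong:

```python
# Hypothetical sketch: a loan schedule stored with two-digit years.
# Widening a two-digit year by assuming the 1900s makes an end date
# of 9/10/04 look like 1904 -- before the loan even started.

from datetime import date

def widen(yy: int) -> int:
    """Naive century rule many old systems effectively applied."""
    return 1900 + yy

start = date(widen(99), 9, 10)  # 9/10/99 -> 1999-09-10 (correct)
end = date(widen(4), 9, 10)     # 9/10/04 -> 1904-09-10 (meant 2004)

if end < start:
    print("loan already expired -- stop collecting payments")
```

Because 1904 sorts before 1999, the comparison concludes the loan ended almost a century before it began.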

If the payments started on 9/10/99 and were to end on 9/10/04, the computer might interpret the end date as September 10, 1904 and immediately stop taking money from the client's account, thinking the expiry date of the loan had already passed. This could also work the other way for a company such as a power supplier, where a computer could falsely calculate bills to be 100 years outstanding. Some false ideas about what could happen at the turn of the century have people believing that computers controlling the power supply will shut down, elevators will crash, home appliances will be useless, the world as we know it will end, and there is nothing we can do to stop it. Although it is a serious problem, it is not nearly as dangerous as some would believe. As for the matter of the lights going out: electricity supply and delivery systems are not heavily reliant on computers and electronic controls.

They rely on computer systems that can be operated manually if the need arises, as is often the case in power outages. There is no reason for the millennium bug to have any effect on elevators; they aren't date-dependent in any way. And as for home appliances going haywire: they couldn't care less what year it is. Another myth involving Y2K is that it's like nothing the computing world has ever seen before, which is false.

The US Air Force's decade rollover problem is a good example of this happening in the past. This is actually the kind of problem software engineers solve regularly; the only difference this time is that it's on a worldwide scale. The fact of the matter is that the Year 2000 issue has always been there. Programmers have been aware of it for years. Unfortunately, these programmers had the attitude of, "I won't be here in 15 years, so it doesn't concern me". So the problem went unaddressed in the information technology industry.

It's only because the change of the century is almost here and that companies stand to lose large amounts of money, that the issue is finally receiving the attention it deserves.