Software application development has only been around since the late 1970s; compared to other industries and professions, the software industry is still very young. Ever since organizations began using computers to support their business tasks, the people who create and maintain those “systems” have become increasingly sophisticated and specialized. This specialization is necessary because, as computer systems grow more complex, no one person can know how to do everything.
One of the “specialties” to arise is the Business Analyst. Although some organizations have used this title in non-IT areas of the business, it is an appropriate description for the role that functions as the bridge between people in business and IT. A Business Analyst is a person who acts as a liaison between business people who have a business problem and technology people who know how to create automated solutions.
The use of the word “Business” is a constant reminder that any application software an organization develops should improve its business operations, whether by increasing revenue, reducing costs, or improving service levels to customers.
History of the Business Analyst Role
In the 1980s, when the software development life cycle was widely accepted as a necessary discipline, the people doing this work typically came from a technical background and worked within the IT organization. They understood the software development process and often had programming experience. They used textual requirements along with ANSI flowcharts, dataflow diagrams, database diagrams, and prototypes. The biggest complaint about software development was the length of time required to develop a system, which then didn’t always meet the business needs. Business people had become accustomed to sophisticated software and wanted it better and faster.
In response to the demand for speed, a class of development tools referred to as CASE (Computer Aided Software Engineering) was invented. These tools were designed to capture requirements and use them to manage a software development project from beginning to end. They required a strict adherence to a methodology, involved a long learning curve, and often alienated the business community from the development process due to the unfamiliar symbols used in the diagrams.
As IT teams struggled to learn to use CASE tools, PCs (personal computers) began to appear in large numbers on desktops around the organization. Suddenly anyone could be a computer programmer, designer, and user. IT teams, still perfecting their management of a central mainframe computer, suddenly had hundreds of independent computers to manage as well. Client-server technologies emerged as an advanced alternative to the traditional “green screen,” keyboard-based software.
The impact on the software development process was devastating. Methodologies and classic approaches to development had to be revised to support the new distributed-systems technology, and the increased sophistication of computer users caused the number of software requests to skyrocket.