1         Design Organizations

2         Leadership in Design Organizations

3         Human Resources

4         Employer: Employees and Employment

5         Data Systems

6         Information Systems

7         Information Documents

8         Information Resources of Organizations

9         Languages

10       Project Management

11       Decision making, Problem solving

12       Design processes

13       Quality Consciousness

14       Risk Management

15       Finance

16       Reports






Organizations are established to achieve certain objectives more effectively and more economically than individuals could by acting alone. An organization is defined as an amalgam or aggregation of human and material resources forming a distinct entity or system. An organization is generally an ongoing effort, based on an approach that may be adapted to many different purposes. Compared to this, an Enterprise is a prime effort, and may remain a one-time effort or an individualized activity.

Organizations are established for:

1 Producing and Executing physical things like goods, structures and commodities, e.g. manufacturing units, contracting, assembly, etc.

2 Managing and Servicing various types of systems, projects and setups, e.g. security, insurance, internet, surgical, healthcare, etc.

3 Designing and Distinguishing the means, procedures and objectives, e.g. architectural, interior design, legal, marriage counselling, project consultancy.

Externally an organization may have one of these activities as its core work, but its internal working often consists of a mix of all three. Simple organizations have very clearly delineated workings. Small organizations avoid all minor activities for the sake of economy and efficiency, or outsource such activities. Large organizations conduct all three categories of activities in various measures.

Basic Departments of Organizations: Planning, Manufacturing and Service

An organization is strongly oriented toward one of these activities. The scale and nature of that orientation differ between small and large organizations.

Small organizations: A small Planning organization, like a designer or legal advisor, delivers only the judgement or concept, and so need not participate in the actualization of action (production) or the post-action fallout (services). In small Manufacturing organizations planning is occasional, handled by the owner or plant manager, and services are outsourced, so neither has a distinct department. A small-scale Service organization depends on job-work or sourced assignments, and does not concern itself with planning or production aspects.

Large organizations: A large Planning setup needs to test-run its concepts through modelling, prototyping or public surveys. Such organizations continuously supervise their jobs not just for overseeing, but for satisfaction and tactical improvisations (servicing). Planning professionals like designers revisit past works to show ex-clients that they care for their creations. Large Manufacturing units derive a lot of advantage through customised planning, retrofitting, innovative adaptation, etc., and so have an in-house planning team. Similarly, large systems, though produced in a plant, require transportation to site, on-site work like erection, test-operating the system, warrantied operations and such services. Large Service units not only repair components (production) but also re-establish systems through innovative adaptation (planning).


Deliverables from an organization

An organization creates a deliverable product, renders a service or formulates a concept. Commercially these Deliverables take the form of products, projects, reports, plans of action, advisories, solutions, jobs, assignments, orders, commissions, etc. The deliverables are mostly for external clients, but sometimes for internal clients, such as other departments within the organization.

Organization delivers:

1         to an external client goods, services or concepts, for which the client in turn compensates the organization.

2         to an internal person or department something that was deliberately (ordered-planned) created, but for which no definite compensation may be available.

3         to an external agency an item, service or concept, as a favour, for which no definite compensation may be available.

4         entities that are promoted within the organization (including its sites and clients’ places). These are conceived to monitor and regulate the working of the organization, its image in the market and its core competence in the field. These could take the form of products, procedures, styles, judgements, confirmations, rejections, or assurance that everything operates at the desired or predefined level.

Organizations prefer activities that provide a direct gain. For this purpose organizations have a formal or informal setup to continuously audit their activities. Here the cost-benefit of all external as well as internal deliverables is assessed.

The evaluation results:

         Deliverables are recognised, neatly categorised and reclassified, to match the type of client.

Professionals charge payable extras over their basic fees. Similarly, manufacturers charge extra for delivery, site installation, test runs and warranties. Service organizations charge extra for replaced components. Such distinctions help an organization and its clients to know the value of deliverables. An organization may discount its own deliverable but charge extra for the other facilitations.

         Client definition helps an organization to differentiate between external and internal clients. It is comparatively easy to assess the cost-benefit ratio for external clients. However, internal clients have interdependencies and sometimes external bearings, which make cost-benefit judgement difficult. The confusion in the latter case may force the organization to outsource such activities.

         An audit identifies departments with high public exposure (those that offer too many freebies). These are reorganised, either by moving them to the internal zones or by isolating them as separate entities.

         An audit also helps in tagging transactions by internal clients who have frivolous demands. This ultimately helps in realistic cost-benefit assessment.

In design organizations technical talents like draftspersons, model makers, site supervisors, messengers, etc. form a common pool. These may be used frivolously.

         Clients are encouraged or forced to move to a certain category of work with the assurance of linked advantages (e.g. legal ownership, guarantee / warranty) and satisfaction (service and operational support).

TV and car manufacturers provide a costless guarantee or extended warranty for their products to achieve brand loyalty. Doctors, estate developers and others provide free advice, ideas, consultancy, etc. to get to know a client, but soon enough the client becomes a recipient of a charged product or service.
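The cost-benefit audit of deliverables described above can be sketched in a few lines of code. This is purely an illustration: the deliverable names, costs and benefits are hypothetical assumptions, and the flagging rule (ratio below 1) is only one possible audit criterion.

```python
# Illustrative cost-benefit audit of deliverables (all data hypothetical).
from dataclasses import dataclass

@dataclass
class Deliverable:
    name: str
    client: str      # "external" or "internal"
    cost: float      # effort and resources spent
    benefit: float   # compensation received, or assessed internal value

def audit(deliverables):
    """Flag deliverables whose benefit does not cover their cost."""
    flagged = []
    for d in deliverables:
        ratio = d.benefit / d.cost if d.cost else float("inf")
        if ratio < 1.0:
            flagged.append((d.name, d.client, round(ratio, 2)))
    return flagged

items = [
    Deliverable("design report", "external", cost=100, benefit=150),
    Deliverable("internal prototype", "internal", cost=80, benefit=40),
    Deliverable("free site advice", "external", cost=20, benefit=0),
]
print(audit(items))   # deliverables with a cost-benefit ratio below 1
```

Such an audit would surface both the internal deliverables with uncertain compensation and the "freebie" activities mentioned above, so they can be recategorised, repriced or outsourced.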


Assignment Handling

Organizations get assignments from both internal and external clients. Large and complex assignments that require distinctive effort are also called Projects. Projects consist of smaller units or jobs, which require routine efforts.

Every organization develops its own methods of project handling. A Project is led by a core group of experts or a team leader. In Design organizations each project requires distinctive human skills. In Manufacturing organizations there is a heavy dependence on tools, equipment and plants, so jobs are defined for their efficient use. Service organizations are governed by time as the key element, so their jobs are set in time modules.

A Job is a trade-, skill- or schedule-specific work module. It allows individualised attention and effective use of the available resources. Its efficiency of execution or operation can be examined and upgraded independently of other jobs. Jobs are handled on a continuous as well as a batch basis.

Organizations that repeatedly handle very large and complex assignments develop specific departments. Such specific job-handling capacities are universal across that class of organizations, so spare capacities are offered to others, and excess work is outsourced. Jobs of a routine nature are handled productively within the organization, but novel needs are better outsourced.

Other activities of the organizations

The prime activity of any organization is to earn a gain, but simultaneously many Conventional activities also occur within the organization.

Conventional activities are:

1 Sustenance of the organization itself as a functional entity.

1.1 Determination and Evaluation of aims, policies, goals

1.2 Planning and deployment of financial resources

1.3 Planning and Acquisition of other facilities

1.4 Procurement and Upkeep of assets

1.5 Personnel Management.

2 Peripheral activities that add to the advantages for the organization.

2.1 Public relations

2.2 Client relations

2.3 Other relations (contractors, suppliers, co-professionals, associates, consultants, freelancers, etc.)

2.4 Facilitating the execution of assignments (raw material procurement, materials handling, erection, execution, manufacturing processes, testing)

2.5 Tasks’ evaluation (quality controls, testing, certification)

2.6 Marketing (goods, services), billing, money collection

2.7 Servicing (post execution or delivery, servicing, maintenance, guarantees)

3 Efficiency and productivity of the organization.

3.1 Determination and definition of procedures

3.2 Standardization of inputs, outputs and procedures

3.3 Information collection, inquisitions, investigations and surveys

3.4 Installation and management of information (data) storage, manipulation and retrieval devices

3.5 Publications and dissemination of the organization's output (data, concepts, ideas) material.


Elements of Jobs or Projects in Design Organizations

Design organizations operate with assignments, which have six basic elements:

1 Person/ s who:                              assign the task, determine roles, perform the tasks, oversee or supervise the task performers.

2 Task body:                                    physical things like: parts, objects, raw materials, etc., and non-physical things like: concepts, ideas, themes, etc.

3 Information, data:                         external inputs: from clients, organization's own search, and internal inputs: from archived data, evaluations, judgments, from employees’ knowhow, site reports, feedback, by manipulation of various inputs

4 Tools, plants, equipment:             space, location facilities, methodology, formulations, processes, schedules, acquisition, replacement

5 Services:                                       conveyance, communication, storage, data management, welfare, resources management, public relations, goodwill creation, information dissemination

6 Time:                                              Schedules of delivery, servicing, rate of operation, rate of returns.

3.2                              LEADERSHIP IN DESIGN ORGANIZATIONS



‘Leadership is the ability of an individual to influence, motivate, and enable others to contribute toward the effectiveness and success of the organizations of which they are members.’ A leader may also simply be a person in a position or office of authority, such as a president or a chairperson.

A convener of a design organization is normally the prime leader of the unit. An organization is launched when its convener has one or several forms of authority, such as:

● Formal authority, acquired through the capacity to reimburse or compensate the people who work for the organization.

● Technical authority, derived from superior knowledge, expertise, skill, experience, etc.

● Personal authority, a function of personality attributes such as: age, sex, race, bearing, determination, will power, appearance, charisma, height, weight, etc.

Conveners of an organization who lack any of these features try to make up for them by other means. Formal authority can be procured by having a financier partner or associate, or an official appointment. Technical authority can be secured by hiring technically qualified associates or employees. Personal authority can be compensated for by adopting an indirect or remote mode of management.

The quality of leadership must vary according to the nature of work in the organization, yet nominally it is the quality of leadership that defines the work style of the organization. To achieve the first condition, organizations separate the domain of leadership for running the organization from the domain of leadership required to handle a project. The second aspect requires the leader to be as versatile as the project demands.

Organizations that handle highly variable situations or non-repeating projects need a very Radical leader. On the other hand, organizations with routine projects function well under a Methodical leader. An Autocratic leader overrides the situational differences and imposes a preconceived style. The autocratic leader expects complete obedience. Such leadership works well for projects that are critical in time, resources and extent. A Democratic leader would rather mould the situation so that it can be handled within the ambit of the personal (leadership) qualities. Employees get full support, status and due recognition, and as a result show responsible behaviour and self-discipline. Democratic leaders are ideal for projects involving a large user base. A Bohemian leader develops a style to suit the situation on hand, and is often very useful in tackling continuously variable situations. A Custodial leader has extraordinary economic resources, and makes employees dependent on the organization through security and benefits. The resulting performance is barely adequate.


Authority and Responsibility in Organizations

Leadership in organization is recognised in terms of authority and responsibility. Authority refers to the right or prerogative of requiring action over others, or simply a right to command, whereas, Responsibility means being prepared for the consequences of application of authority. A leader passes on a part of the authority to selected subordinates, and makes them responsible for their actions. By sharing the responsibility a leader strengthens the ultimate authority.

A leader establishes a rational link between the authority and responsibility. Leaders create a well-balanced structure of authority and responsibility within their organization through selective participation of subordinates. A logical and transparent relationship between authority and responsibility motivates other subordinates to belong to the process.

Members of the organization take on responsibilities with different notions: as an assignment, as a perceived duty, as something to repay favours or compensation, as a share of power or prestige, or even as a compulsion. Responsibilities, unless accompanied by adequate reward or recognition of the authority, cause unpredictable responses.

A complex organization will have many layers of leaders, not only with specific responsibility and authority but also with a unique leadership quality. To provide a unified structure to these diverse qualities, and manage them dispassionately, a coordinator or manager is required. A coordinator's job is to dispense the authorities and responsibilities in a formal and ceremonial manner. A coordinator or the manager usually has the power to hire, fire or favour any subordinate. ‘Managers are people who do things right, but leaders are people who do the right thing’- Warren Bennis, ‘On Becoming a Leader’.


Work Culture and Work Climate

Each organization has its own distinctive work culture. The work culture is a historical formation. Work culture in an organization emerges from the revered formal and informal systems of the past. It is a combination of the collective history, the continuum of leadership, the residual effects of events and crises, and the physical spread of the organization in the society. This results in traditions, routines, taboos, pride, prejudices, etc. that permeate the organization. The cultural setting of the organization impacts the behaviour of its members.

The work climate results from the recent working of the organization. A climate reflects the quality of current leadership. An organizational climate is directly related to the leadership and management style of the leader, based on the values, attributes, skills and actions, as well as the priorities, of the leader. It is seen in the empathy the organization creates in its members, clients and collaborators. An individual or a short-term leader cannot easily create or change the work culture, because it is very deep-rooted. Work culture influences the characteristics of the work climate through its effect on the actions and thought processes of the leader. A leader can hope to mould the work culture by improvising the work climate.


Specialization or core competence of the organizations

Organizations come into being with specific aims. All organizations intend to specialize in tasks that are analogous to their aims. But specialization is acquired through repetition of opportunities. Specialization leads to economy in operations. It also upgrades the organization's capacity to deal with larger or more complex tasks. Specialization is perceived as an innovative activity that creates enough synergy to make the organization behave like a self-correcting or continuously adjusting biological entity. An Autocratic leader may stimulate an organization towards acute specialization in only one or a few fields, whereas a Bohemian leader may dissipate the energy and de-focus the goals of the organization. A Democratic leader will continuously review and revise the aims of the organization, and plan the resources, to make the organization creative.


Creativity in Organizations

Design Organizations thrive on new ideas, concepts, innovations, etc. A creative environment comes about through many factors. There should be a spirit of teamwork, willingness to help each other, commitment and dedication to assigned tasks, and trust among fellow workers. Personnel should have access to appropriate resources, including facilities, equipment, information, funds, and people. If work is challenging or tasks are intriguing, then there is an attraction to handle them. Staff members should have some control over the tasks they carry out and freedom in deciding how to accomplish a task. A manager or leader who sets clear goals and communicates well with subordinates encourages creativity. The existence of defined and surprise rewards encourages creative efforts. A collaborative atmosphere sets in when the staff shares the vision and goals of the organization.

In any well knit organization, creativity comes about, through several layers of activities, carried out by individuals with many different talents and personality traits. ‘It operates like a relay race, but the participants have no idea who will take over, at which level and when’. Often the racers have no idea, whether they were running forward or backward, i.e. towards or away from the finish line or goal.

Organizations become and remain creative when the roles that personnel are required to play are very definite. Where there is a knowledgeable and visible structure, one knows who is going to take over at what time and at which level. A creative idea or concept will be accommodated, supported and carried through, if necessary, by even changing the goals of the organization. The leaders of such organizations are sensitive, and have a ready mechanism to improvise the goals of their organization on a continuing basis.

Creativity fails to spread in an organization because there is:

           1         Fear of ridicule.

           2         Fear of theft of idea (loss of authorship or patent)

           3         Lack of time

           4         Lack of competence to further the idea

           5         Lack of power and resources to further the idea

           6         Lack of buyers / takers of different ideas

           7         Lack of compensation

In organizations where obnoxious quality-control checks, evaluations, secret reportage, etc. abound, personnel come under pressure and become sterile. Promotions other than on qualitative criteria, allocation of resources other than on needs-based assessments, recognition of the wrong members, delayed or inadequate compensation, etc. are some other factors that vitiate the working of an organization.

A good leader makes the personnel realize that real measure of creativity is in the gains or advantage an organization gets. This is a difficult proposition, as it requires a very high degree of transparency in accounting and auditing processes. Everyone must clearly realize what an effort will cost, and how much benefit the organization will get out of it. Creativity is both a personal and group pursuit. A personal innovation must have confirmation of the larger group, and the group’s achievement must remain impersonal.

Creativity lies neither in specialization (the capacity to excel in limited fields) nor in generalization (the capacity to handle many different situations in any field). Specialization means being consistently proficient in sustaining technical superiority, whereas Generalization means being efficient or productive, but not at the cost of quality.

3.3                     HUMAN RESOURCES

Personnel are the most important asset of any organization. Personnel, as Human resources, are not only immensely manipulable, but upgradable to seemingly infinite levels of efficiency. An organization hires people with the required education, skill, experience, inclination and personality traits. Organizations recognize, support and even reformat these qualities through formal training and by providing opportunistic exposures. To hire and exploit human resources, organizations use job assignment as the key method, and pay incentives. Members of the organization are motivated in different ways to modify or upgrade their expertise.

Exploiting individual talents and traits

Organizations fully exploit individual talents and traits. First, only persons with the required qualities are sought. Second, better compensation is offered for hiring specific qualities. Third, incentives are offered for readiness to reformat talents and traits. Fourth, employees who are unable to adapt are punished or shifted out of the organization.

Small organizations do not have the capacity to reformat the talents and personality traits of individual staff members, either by retraining or by providing opportunistic exposure, to match the occasional requirements. Small organizations, as a result, resort to frequent hiring and firing of employees.

Large organizations reshuffle their staff consistently to adjust to fluctuating needs. Large organizations handle large volumes of work, and so can effectively reposition personnel for reformatting talent. For a large organization it is more efficient to retrain a person than to hire a stranger and thereby disturb the normal work culture of the unit.

Categories of personnel in Organizations

Personnel of the organization can be classified into three broad categories.

1         Chores that require little innovativeness, and which can as well be assigned to machines, are handled by workers.

2         Assignments that require some degree of thought are carried out by technicians.

3         Tasks that require creativity are handled by experts or professionals.

These three categories constitute a layered arrangement. There are no specific models as to which category of staff must numerically form the dominant layer. Organizations involved in Professional work have the third category as the dominant layer. Production organizations have the first category as the dominant layer. Service organizations, such as those concerned with testing, evaluation, data management, administration, etc., have the second category as the dominant layer.


Assigning a job

Organizations cut their projects, assignments, etc. into manageable lots or jobs of various skill- and resource-based specialities, and assign these to individuals as distinct roles. The leader of the organization or the chief of a project continually shifts a job from one person to another, to achieve optimum results. The organization becomes innovative and creative through such shifting of personnel. Jobs are assigned to remove the tedium of repetition or to provide new exposures. Jobs are also given out to infuse new thoughts and work methods, and to utilize different resources (plant, equipment, tools, talents). Jobs are presented as an opportunity, challenge and incentive to a person or a team.

In design organizations personnel are identified in terms of their talent, such as: accountants, administrative staff, draftsperson, junior designers, job-leaders, associates, and so on. In medium to large organizations common pools of human and other resources are formed. Project chiefs or job-based foremen draw their requirements of human and other resources from such pools.

Personality traits

An organization is formed by people of many different talents and personality traits that are reflected in their attitude and conduct. These traits are not exclusive categories, and under appropriate conditions a person also takes on other characteristics.

           Dream-weavers are prolific generators of ideas and new concepts, but lack the skill to detail them. Dream-weavers are mercurial and often have a fear of failure. A dream-weaver must be an extrovert, or the ideas never get acknowledged.

          Technocrats have a talent for visualizing structured entities. For them an entity is conceivable only if it is structured, and therefore practicable. Technocrats are fastidious, uncompromising and hardheaded. A technocrat, though, may get entangled while detailing the parts, and may lose grasp of the holistic scheme.

           Exponents enjoy advocating ideas or schemes, without bothering about either their authorship or their practicability. They feel that the public attention received through the advocacy is the measure of their skill and success.

           Patrons are not necessarily resourceful people, but are ready to support any new activity that takes them away from their routine chores, provides a novel experience, and keeps them busy. A person may become a sponsor by virtue of the position and powers to allocate resources. Such people are motivated by strategic gains through various sponsorships.

           Arrangers or fixers are expert manipulators, and keenly look for a chance to jump into any difficult situation to manage it. As risk takers they collect lots of benefits, and very fast.

           Conservatives are overcareful by personality. Conservatism may be due to a struggle-free life or to the lethargy of old age. They detest change, but take great pride if instrumental in causing even a minor innovation.




3.4                     EMPLOYER: EMPLOYEES AND EMPLOYMENT

Employment is a process of mutual choice or selection: the employer chooses the employee to engage, and the employee selects the employer to work with. The extent of choice, and the power to exercise it, is rarely equal. However, the laws of all democratic governments that relate to such matters carry a basic presumption that both employer and employee are on an equal footing.

Employment is a contractual relationship wherein compensation is offered for the type of services to be rendered. But in reality inequalities occur due to discrimination by sex, race, region of origin, age, language, social status, etc. Some discriminations, though scientifically supportable, are not tenable in normal law. Our constitution, however, overrides, provides, dictates or recognizes ‘reservations in employment’ for specified classes of people, to eliminate and correct certain historical effects.

People worthy of being employed in an organization come through many processes of elimination. Elimination processes have two basic functions: first, to reduce the lot from which to choose; second, to select the most appropriate from the remaining manageable lot. The first function is carried out so subtly that often applicants do not become aware of it. For example, an advertisement that appears in a newspaper or a medium accessed in a particular town or region will generate replies from that set of people, eliminating all others.

Selection of an employee

Selection is based on the following aspects:

1 Objective requirements (intellectual)        Skill, experience, training, work related abilities.

2 Subjective requirements                           Personality traits, initiative, speed of reaction, temperament, memory, power of reasoning.

3 Physical requirements                               Age, height, muscle power, health history, abnormalities of body limbs and sensory abilities

4 Other requirements                               Past record, references, readiness to accept the terms of employment.
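The four selection aspects above can be combined into a single comparative score. The sketch below is purely illustrative: the weights and the 0-10 ratings are hypothetical assumptions, not prescribed values, and real selection would also involve qualitative judgement.

```python
# Hypothetical weighted scoring of a candidate across the four selection aspects.
WEIGHTS = {"objective": 0.4, "subjective": 0.3, "physical": 0.15, "other": 0.15}

def candidate_score(ratings):
    """Combine per-aspect ratings (0-10) into one weighted score."""
    return sum(WEIGHTS[aspect] * ratings[aspect] for aspect in WEIGHTS)

ratings = {"objective": 8, "subjective": 6, "physical": 7, "other": 9}
score = candidate_score(ratings)   # 0.4*8 + 0.3*6 + 0.15*7 + 0.15*9
print(round(score, 2))             # prints 7.4
```

Weighting the objective (intellectual) aspect most heavily reflects the emphasis the text places on skill, experience and training; an organization with different priorities would simply change the weights.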

Once a person is employed, the management body of the organization continuously monitors the performance. Organizations relate the performance of an employee to profitability. This is more so in Design organizations, where human resources are very important assets, unlike in manufacturing units, where the productivity of machines and raw material costs have greater significance. An Employer sees performance as a tool for future efficiency to be gained at a specific cost, whereas an Employee perceives performance as immediate compensation, personal fulfilment, future promotion and skill gain.

Performance of an employee

Performance is a product of many factors such as individual ability, personality traits, input effort, sincerity, perception of the role, motivating factors, etc. Yet performance can be conditioned through an enhanced capacity to deal with more complex or new problems, a share of responsibility, greater authority, etc. An employee can be motivated by gain, comfort, increased learning, or even enhanced motivation itself. An employee is considered well motivated if he or she obtains, or has a chance to obtain, any of these.



Employment is an ongoing process. The relationship between the employer and the employee continues to evolve. The original conditions of employing a person, such as technological relevance, equipment, nature of projects, economics of resource deployment, personal efficiencies and work culture, all change with the passage of time. The employer and the employee begin to see each other very differently after a period of time. Employment, the process of being employed, is reassessed by both. This has two facets: an employee wishes to cease working with the employer, or the employer wants to terminate the employee. Both employer and employee, however, are handicapped by other factors.

Employee wishes to cease working with the Employer

           The reasons are: insufficient motivation, unsatisfactory compensation, lack of promotion, any other personal (psychological or physical) reasons or better prospects elsewhere.

           The options are: renegotiate the terms of employment, or change the employer. In the second case the following conditions operate, depending on the age of the employee.

Persons under the age of 30 have Positive operants in their favour, such as: the highest mobility (the capacity to settle at any geographical location), the ability to work under the most difficult conditions, and the highest learning abilities. These qualities are much appreciated by employers, who therefore desire to hire people either as complete freshers or under 30 years of age (i.e. with 5-6 years of experience). A person should, before the age of 35, gain varied experience and find the best employment, or plan his or her own professional practice (self-employment).

           Design professionals, by the age of 35 years, begin to mature, with sufficient work experience, personal contacts and specialized knowledge, but also begin to acquire Negative operants, like: reduced learning capability, lesser reorientation faculties, less motivation, and less willingness to migrate and re-establish. The ideal period, or the last opportunity, for designers seeking to refresh their employment is before the age of 45 years, because all the accumulated abstract gains of the past (experience, expertise, know-how) can still be converted into promotion or other material gains.

           Beyond the age of 45 years the chances of re-employment taper off drastically. The only way a designer can hope to shift position is by joining another organization as a partner, senior associate or freelancer. Such opportunities are very few, and demand persons with outstanding competence and capacity to contribute.

Employer wants to terminate the Employee

           The reasons are: inability to adjust to changed circumstances, lethargy of advancing age, technological irrelevance of the skill or experience, an uneconomic compensation structure, lack of scope for promotion, unacceptable social behaviour, or resistance to relocating to a new location.

           The options are: There are many legal hurdles, though some of these can be met through monetary offers. However, instead of wasting effort to surmount such hurdles, employers try to assign a different role, retrain, relocate, assign different tasks, impose penalties and curtail other advantages, to coerce the employee to quit.


An Employee quits an Organization

           When an employee quits, the organization loses an asset: an accumulated mass of knowledge and experience, personalized contacts, a proven mode of communication, secrets, patent procedures, formulas, etc. Organizations are normally unwilling to let a reasonably seasoned employee quit. Organizations set right the causes of dissatisfaction, and match their terms of employment to the enticing offer that prompts the shift-out.




3.5                             DATA SYSTEMS


In normal usage, Data refers to facts presented to our receptive faculties or sense organs. Data is perceived when it is within our limited perceptive (sensory) capabilities, and if it has some relevance to our needs. Data perception is also affected by the mental and physical state. Our mind (and other organs) processes the data into Information. Data processing refers to acts like gathering, manipulating and transmitting data for specific objectives.

Data adds to the knowledge of the person receiving it. But to work efficiently and within personal biological capacity, one retains only relevant sections of data. Electronic or digital systems have (at least now) better receptors and larger storage capacities, so everything can be stored. However, such systems, like their biological counterparts, invariably include barriers or filters to select only relevant things. A computer, during the receiving and recording phase, converts everything into a storable representation or surrogate form.

Data is open knowledge, but when perceived in some context or for a probable purpose, it becomes information. Information, on the contrary, is a personalized property. The same ‘data’ may provide ‘different information’ to a different perceiver. ‘One person's information becomes another person's data’.

Data can be processed manually, mechanically and electronically. With each processing the data gets structured differently, and provides a new insight. A machine (mechanical, electronic) processes data according to set parameters, and so is more objective than any manual process.



Information is a message received and understood, or knowledge acquired through study, experience or instruction. ‘Information means knowledge, instruction, communication, representation, and mental stimulus’. ‘Information is the result of data being processed, manipulated and organized’.

The information stored in the mind thins out with time, so it must be either communicated or recorded. Recording is formatting information over a medium. In-forming implies that a form is impressed onto a medium. The formatted expression on a medium is less likely to get lost with time. Information is formatted (recorded) on a medium for storage or communication. Recorded and communicated material is already processed, but as we re-communicate it, it gets further processed. During each process of expression, perception, recording or retrieval, information corruption occurs.

Methods and modes of ‘formatting’ information include: writing, printing, transmitting, receiving, storing, retrieving, etc. However, formatting ‘conditions’ the data, often ‘corrupting’ it. The forming mediums are physical, such as paper, magnetic tape, etc., and the formatting tools are languages, images, graphics, metaphors, etc.

The information expression and its formation (on a medium) are both acts of communication. The originator has little control over how the expression will be recorded (in-formed) or perceived. The information originator, accessing his own records at some other time-space level, cannot revert to the original physical and mental state and re-experience or re-establish the original. Though the communicated information manifests slightly differently, it is still a ‘knowledge transmission process’.

For communication to occur, the originator and the accessing user must both have common formatting tools. Since the time of the pre-Socratic philosophers, Semiotics, the study of signs and sign phenomena, has been noted. ‘Signs are considered the irreducible elements of communication and the carriers of meaning’. Charles S. Peirce identifies three dimensions of signs: the body or medium, the designated object, and the interpretation of the sign. Charles W. Morris designates the sign dimensions as: syntactic, semantic, and pragmatic.



Once upon a time, ‘Data’ was generated from commercial activities or the political taxation-expenditure system. Another set of ‘Data’ was collected from experimental or observational sources. These were considered ‘data systems’ because of their sheer size and, often, their complexity. It was impossible to derive any (logical) meaning out of them. There were two deficiencies: the Structure and the Organization of the data. This was tackled on two fronts, Numerical and Non-Numerical data:

1. Observed data is narrative or descriptive of the qualitative aspects. Qualitative data carries labels or names and descriptions. It bears a structure of categorization.

2. Experimental or Measured data is more likely to be quantitative. Quantitative data essentially shows how much or how many of things there are.

For example, in a survey where age, sex, marital status and income are sought, age and income generate quantity values, while sex and marital status provide quality values. Data from observations, though descriptive, can provide quantitative indicators through labelling (categorizing), coding and counting.
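The labelling-and-counting idea can be sketched in Python; the survey records below are hypothetical, invented for illustration:

```python
from collections import Counter

# Hypothetical survey records: age and income are quantitative fields;
# sex and marital status are qualitative labels.
responses = [
    {"age": 29, "income": 42000, "sex": "F", "married": True},
    {"age": 41, "income": 58000, "sex": "M", "married": True},
    {"age": 35, "income": 51000, "sex": "F", "married": False},
]

# Counting label frequencies turns the qualitative fields into
# quantitative indicators.
sex_counts = Counter(r["sex"] for r in responses)
married_counts = Counter(r["married"] for r in responses)
print(sex_counts)      # Counter({'F': 2, 'M': 1})
print(married_counts)  # Counter({True: 2, False: 1})
```

The same categorize-then-count step underlies most coding of observational data.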



Numerical data: It consists of numbers. It can be treated as a non-numeric entity, but then it cannot be processed as numeric data. It is quantitative and so well structured. It occupies less storage space but requires many levels of manipulation to infer a meaning. Certain types of filters and checks are needed to reduce logical and statistical errors.

Non-Numerical Data: Non-numerical data is any form of data that is expressed in words (i.e. not in numbers). It is non-quantitative and may not immediately reveal any structure. It requires very large storage capacity. It takes more time to record, communicate, retrieve or understand (process) the data. However, the likely number of operations or manipulations is smaller.



Numerical data has been processed through statistical methods for ages. Statistics is the science of collecting, analysing, presenting, and interpreting numerical data. Media presentations of data use descriptive statistics: graphs, tables, and numerical summaries.

Data is numerically measured through processes such as:

Proportion: Proportion is a part, share, or number considered in relation to a whole.

Percentage: Percentage is also a proportion but a rate, number, or amount in each hundred.

Mean or average: For the mean or average, all data values are added and the sum is divided by the number of data values. It indicates the central location of the data. The mean or average is affected by abnormal data values that are very large or very small.

Median: For the median, all data values are arranged in order of magnitude; for odd-numbered lists the value in the middle is the median, and for an even number of items it is the average of the two middle values.

Mode: The mode is the data value that occurs with the greatest frequency; it provides another measure of central tendency.

Percentiles: Percentiles are typically used in anthropometrics: a proposed seating height of x dimension may suit a certain percentage of people. Quartiles divide the data values into four parts; the first quartile is the 25th percentile, the second quartile is the 50th percentile (also the median), and the third quartile is the 75th percentile.

Range: A range is the difference between the largest value and the smallest value. The range is determined by only the two extreme data values.

Variance and Standard Deviation: These are measures of variability that are based on all the data and are more commonly used numerical measures. The deviation (difference) of each data value from the sample mean is computed and squared. The standard deviation is the square root of the variance.

Outliers: Outliers are values that are abnormal (too large or too small) in comparison to others in a set. These values are reviewed carefully, using the mean and standard deviation.
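The measures above can be computed with Python's standard `statistics` module; the data values are purely illustrative:

```python
import statistics

data = [4, 8, 6, 5, 3, 8, 9, 5, 7, 8]

mean = statistics.mean(data)          # 6.3 -- the sum divided by the count
median = statistics.median(data)      # 6.5 -- average of the two middle values
mode = statistics.mode(data)          # 8   -- the most frequent value
value_range = max(data) - min(data)   # 6   -- uses only the two extreme values
variance = statistics.variance(data)  # sample variance: squared deviations from the mean
std_dev = statistics.stdev(data)      # standard deviation: square root of the variance
quartiles = statistics.quantiles(data, n=4)  # the second quartile equals the median

# A simple outlier check: values more than two standard deviations from the mean.
outliers = [x for x in data if abs(x - mean) > 2 * std_dev]
```

Note how the range depends only on the extremes, while the variance and standard deviation use every data value, which is why the text calls them the more commonly used measures of variability.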

Statistical tools for data management include:

Complete randomization      Treatments are assigned to randomized experimental units. The entire set of data is rearranged to change the sequencing or arrangement.

Randomized block design    Takes into consideration the probable effects of factors not accounted for by the experimenter but which may affect the response variable. The various treatments are randomly assigned within grouped blocks.

Factorial design                    Allows conclusions about more than one factor, or variable. The term factorial indicates that all possible combinations of the factors are considered.
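The first two designs can be sketched in a few lines of Python; the plot and treatment names are hypothetical, and the fixed seed only makes the illustration repeatable:

```python
import random

random.seed(42)  # fixed seed so the illustration is repeatable

units = ["plot1", "plot2", "plot3", "plot4", "plot5", "plot6"]
treatments = ["A", "A", "B", "B", "C", "C"]

# Complete randomization: shuffle the treatments, then pair them with units.
random.shuffle(treatments)
complete = dict(zip(units, treatments))

# Randomized block design: randomize treatments separately within each block,
# so every block receives every treatment exactly once.
blocks = {"north": ["plot1", "plot2", "plot3"],
          "south": ["plot4", "plot5", "plot6"]}
blocked = {}
for block_units in blocks.values():
    trts = ["A", "B", "C"]
    random.shuffle(trts)
    blocked.update(zip(block_units, trts))
```

The blocking step controls for a factor (here, a hypothetical north/south location effect) that the experimenter does not manipulate but which may affect the response.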



Non-numerical data could be a single-word ‘label’ or a set of words, like a phrase, assigning a meaning to one lot of information. To derive better meaning, such lots are rearranged ‘relative to one another’. The label provides a context to a set of numeric values.

Word processors use non-numerical data processes to check spelling, grammar, syntactical errors, quality of expression, search, etc. These programmes also help in condensing, translating, converting text to spoken language or vice-versa.



Data from our normal activities can be managed, often without recording it. However, when data densities are very large, some form of recorded format helps to get a clearer picture. Account books, revenue assessment notings, records of formulas and processes, almanacs, rules, canons, etc. are forms of data collection. Data was recorded by writing on stone, clay tablets, plant leaves, metal sheets and paper. The data recordings were sequential in nature and non-erasable. To link data segments across pages or books, tags were used. One of the most innovative methods of tagging is used by ‘Vansh Lahias or Khatedars’, people who maintain records of ancestry at pilgrimage centres. Processing large volumes of data, such as land, crop, salary, revenue and other tax records, and business records like money transfer notes, was, in spite of statistical methods, laborious and susceptible to human error. Some form of objective and fast system was required.

Traditional data systems had a few drawbacks: data is ordered in sequential form, data cannot be physically rearranged, data is spread over a very large estate (pages, books, journals), and data was a muddled mix of numerical and non-numerical entities.

Traditional documents were visited for strategic queries. It was not possible to ‘link any level of tactical use for consequent action’. It was not possible to set up an automated error- or abnormality-finding mechanism. Attempts were made to create an algorithm, a ‘document’ that can initiate a set of actions. Such ‘command documents’ were not recipes but lists of processes. The processes were ‘timed’ and contained command-initiating and -terminating conditions.

In Kashmir, a master weaver recites a coded language of instructions from an attic, which are followed by all the weavers in the room, creating identical patterns in the carpet. Joseph-Marie Jacquard of France, during 1804-05, developed a loom (the Jacquard loom) with interchangeable punched cards that controlled the weaving of the cloth. Any desired pattern could be obtained automatically and repeatedly. Jacquard’s weaving loom was the first practical information processing device. The punched-card system was adopted by the Briton Charles Babbage as an input-output medium for his analytical engine, and was used by the American statistician Herman Hollerith to feed in census data. Later, machines were devised that had the ability to execute instructions in other than sequential order. Conditional branching allowed taking on a different strategy on the input of some data.






3.6               INFORMATION SYSTEMS


For information to be meaningful, it must be organized according to some logical relationship. A self-sufficient or unique collection of information is grouped and placed in a Document. Documents are unitised sets of information, such as books, articles, reports, movies, photograph albums and audio-video cassettes. Contents of large documents are further divided into sub-units (chapters, sections). Sections of a document have a linear, sequential or one-dimensional order, though direct access to a section is possible through preset strategies like keywords, summaries, content lists, tags, flags, indices, FAT (floppies & discs), etc. A library can have books physically arranged by access code, date of arrival (serial number), author, publisher’s name, publication date or even size. However, a card catalogue, which is a pre-sorted listing, allows one to trace the required document directly. Another method of facilitating physical access is to place sub-sections of a document as separate sheets bunched together by a thread (French=fil), wire, or metal rod as a File holder. Punched-card systems made access operations like sorting and rearranging faster. Large databases such as police records, telephone directories and library records are difficult to access quickly through cards. Mechanical sorting systems such as Cardex used punched locations to reposition (sort) the cards. However, physical access did not allow ploughing into the data or contents.

Traditional sequential data arrangements take longer to access, as the entire lot has to be searched for specific data or an anomaly. Hard drives are also sequential, but here the data is placed wherever convenient lots are available. A File Allocation Table (FAT) keeps track of the location of each file, and each data-lot contains a link to the subsequent location of data. This is why one often needs to scan the hard disc with programmes like Defragmentation, to place all the sub-parts of a file contiguously. It increases the integrity of the disk and rationalizes access time.
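The FAT linkage described above can be sketched as a toy table; the block numbers and file name are invented for illustration:

```python
# Toy File Allocation Table: each block number maps to the next block of the
# same file; None marks the final block. The file "report" is fragmented
# across non-adjacent blocks 2 -> 5 -> 3.
fat = {2: 5, 5: 3, 3: None}
directory = {"report": 2}  # file name -> first block


def read_chain(name):
    """Follow the FAT links to list a file's blocks in order."""
    blocks, block = [], directory[name]
    while block is not None:
        blocks.append(block)
        block = fat[block]
    return blocks


print(read_chain("report"))  # [2, 5, 3]
```

Defragmentation would rewrite the file into adjacent blocks (say 2, 3, 4) and update the table accordingly, which is what shortens access time.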

Sequential access is mainly used for permanent storage, and random access memory for temporary storage. Sequential storage systems are cheaper, easier to manage, and less susceptible to damage from electrical surges. Memory sticks, cards and chips use a random access storage system, for its quicker response.

Data access and arrangement systems made processing faster, but mathematical processes like statistical interventions (finding the mean, average, deviation, etc.) and algebraic equations did not work. There was no storage capacity for intermediate output and no facility for loop (repeat) processing.

The Chinese abacus and some of the Indian algebraic systems have shown some promise in handling large numbers. A similar method was also employed by the mechanical clock and watch manufacturers of Europe (the seconds wheel had 59 standard notches but one deeper notch that moved the minute wheel, and similarly the 24-notch hour wheel had configurations to strike bells every 30 and 60 minutes). Late 17th C mechanical calculators adopted the same technique. Here the computation was mostly memory-less; products or results had to be noted down and re-entered. Each mathematical process, though faster and faultless, required human intervention. For formulas, the ‘fixed’ and the ‘variables’ were not ‘distinctive’. The need was to create a machine that could be auto-commandeering, i.e. operate on its own with pre-defined logic or rules.

‘The alignment of pebbles on an abacus frame, the juxtaposition of moving scales on a slide-rule, and the configuration of cogged gears on the devices of Schickard, Pascal and Leibniz, are all examples of representational techniques that seek to simplify the complex processes underlying arithmetic tasks’.



Wherever and whenever information is stored, it is intended to be retrieved. Storage and retrieval both have some intrinsic access modality. Collections like libraries and archives are repositories of analog-form of information.

Library cards are manual access catalogues. It was not very difficult to turn them into Digitised catalogues. Digitization not only increased the speed of access manifold, but allowed the addition of a large number of search elements like chapter indices, back-page keyword indices, bibliographies, etc. Digitization also allowed ‘key word or phrase searches’. With the advent of the internet, Digitised catalogues with facsimile images (e.g. PDF files) of the original printed documents became accessible from anywhere. For images and audio-video documents, specific protocols are now available.

A digital document is created with standard protocols or programmes. It is possible to collate data from many different forms of digitalized documents. A digital document was once thought to be a collection of information by an author or agency, stored at one physical location. But now documents in different forms and at different locations can be treated as one set of linked entities. Smart search engines learn personal needs and preferences, and then offer appropriate linked entities.



Managing large quantities of structured and unstructured data is a primary function of information systems. Documents often consist of highly structured set of data, called a database. A simple database contains information of one class or category as records, and each record may have several units of specific data as fields. Fields are the basic units of a database.

Fields are of five basic types: Character, Date, Logical, Memo and Number. A database record can have up to 128 fields and 4,000 characters. ‘Each field typically contains information pertaining to one aspect or attribute of the entity described by the database’.
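A minimal sketch of such a one-category database, with each record holding several fields; the book data is invented for illustration:

```python
# One class of entities (books); each record is a set of fields,
# and each field holds one attribute of the entity.
library = [
    {"title": "Design Methods", "author": "Jones", "year": 1970, "on_loan": False},
    {"title": "Pattern Language", "author": "Alexander", "year": 1977, "on_loan": True},
]

# A simple query filters records by the value of one field.
on_loan_titles = [rec["title"] for rec in library if rec["on_loan"]]
print(on_loan_titles)  # ['Pattern Language']
```

Here "title" and "author" would be Character fields, "year" a Number field, and "on_loan" a Logical field, in the typology above.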

Early data systems were sequential arrangements (i.e., alphabetically, numerically, or chronologically). Large enterprises use many databases with overlapping data. Data updated in one section must get updated in other locations wherever it is used.

Large business and other organizations build up many independent files, but containing related and even overlapping data, and their data-processing activities often require the linking of data from several files.

A database management system (DBMS) supports the automatic linkage of files through different types of database models. In the Hierarchical model the record types are linked in a treelike structure, with single links between sets of records at different levels. The Network model creates multiple linkages between sets by placing links, or pointers, to one set of records in another. The Relational model is used where associations among files or records cannot be expressed by links; a tabulated list establishes the relation, or multiple relations, for the required information. The Object-Oriented model can store and manipulate more complex data structures, called objects, organized into hierarchical classes that may inherit their properties from classes higher in the chain.
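The relational model's ‘tabulated list’ can be illustrated by joining two small tables on a shared key; the table contents are hypothetical:

```python
# Two independent 'files' (tables) related by a common key, dept_id,
# rather than by physical links or pointers.
employees = [
    {"emp_id": 1, "name": "Asha", "dept_id": 10},
    {"emp_id": 2, "name": "Ravi", "dept_id": 20},
]
departments = [
    {"dept_id": 10, "dept_name": "Design"},
    {"dept_id": 20, "dept_name": "Accounts"},
]

# A relational join: match records across the tables on the shared key.
joined = [
    {"name": e["name"], "dept_name": d["dept_name"]}
    for e in employees
    for d in departments
    if e["dept_id"] == d["dept_id"]
]
print(joined)
# [{'name': 'Asha', 'dept_name': 'Design'}, {'name': 'Ravi', 'dept_name': 'Accounts'}]
```

Because the relation lives in the tabulated key values, either table can be updated independently; this is the property that lets the model link files that carry no embedded pointers to one another.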

In any data set, objects are related by some order or key: an intrinsic property such as: size, weight, shape, or colour, or an assigned value such as: the object’s class or date of purchase. Primarily data is structured, by arranging the keys, to create inventories and to facilitate locating specific objects in the set. The secondary objective is to conserve the storage space and economise the effort in searching the objects.

A database itself is a document. However, in real situations several databases contribute components that constitute a relevant document. The contributing databases can alter the data in the ‘constituted document’ and can also, in turn, receive (back) data input from the constituted document. Database documents have various levels of access authorities as well as alteration permissions.

The International Committee on Archives’ (ICA) committee on electronic records defined a record as, ‘a specific piece of recorded information generated, collected or received in the initiation, conduct or completion of an activity and that comprises sufficient content, context and structure to provide proof or evidence of that activity’.



Some of the techniques of information retrieval are:

Reference retrieval systems store references to documents rather than the documents themselves. Such systems provide all information about the documents, their physical location and availability. Singapore port uses such a system to manage ship-containers’ location, arrival, despatch, etc. Courier companies let one check the status of a document on-line. Companies organize their procurement strategies to minimise the cost of storage (warehousing).

Database retrieval Systems treat components of a document as a database. This system is suitable where data is structured in various categories. Text documents (prosaic or poetic) are treated as a database by word processor programmes. These allow spelling and grammatical checks, replacement of characters, words or strings of words. It also checks the quality of language, word count, etc.

Hyper-text retrieval systems: In this method, documents that are related (by concept, sequence, hierarchy, experience, motive, or other characteristics) are connected by establishing a relationship, or through embedded ‘hyperlinks’. A variety of documents, such as text, numeric data, audio-video recordings, graphics and images, can all be linked. From one document one can access other documents, as is done in a digital encyclopaedia such as Britannica, or in internet navigation.

SGML Standard Generalized Markup Language is a system for encoding electronic texts so that they can be displayed in any desired format. It takes advantage of standard text markers used by editors to pinpoint the location and other characteristics of document elements (paragraphs and tables, etc.). It draws semantic relationships (relating to meaning in language or logic) from a body of text. SGML is often supplemented by other syntactic techniques (arrangement of words and phrases to create well-formed sentences) to increase the precision.

Indexing Spatial Data: In indexing spatial data such as maps and astronomical images, the textual index specifies the search areas, each of which defines a territory or a spatial entity such as a triangle, rectangle, irregular polygon or circle, cuboid, sphere, etc. These spatial attributes are then used to collate or extract and present the image. Often other external attributes such as orientation, colour (normal, infra red, night vision), angle of view (perspective) etc. are applied to enhance or to de-augment the image.

Image analysis and retrieval: The content analysis of images is accomplished by two primary methods: image processing and pattern recognition. Image processing is a set of computational techniques for analysing, enhancing, compressing, and reconstructing images. Pattern recognition is an information-reduction process: the assignment of visual or logical patterns to classes based on the features of these patterns and their relationships. The stages in pattern recognition involve measurement of the object to identify distinguishing attributes, extraction of features for the defining attributes, and assignment of the object to a class based on these features. Image processing and pattern recognition, both have extensive applications in various areas, including astronomy, medicine, radiography, 3G & 4G communications, forensic identification, industrial robotics, and remote sensing by satellites.

‘The field of research activity, in which observations being made are classified and described, is known as pattern recognition. It is one of the applications of artificial intelligence. If statistical information obtained from patterns is used in their classification, the method is known as statistical pattern recognition. The statistical pattern recognition methodology is sub-divided into other disciplines such as feature extraction, discriminant analysis, cluster analysis and error estimation. The syntactical pattern recognition methodology carries out grammatical parsing and inference. The pattern recognition methods are often used in identifying data that is very complicated. Therefore, this identification system can fall in the group of algorithmic modelling’.

Speech analysis and retrieval: Here discrete sound elements are converted into alphanumeric equivalents. The alphanumeric data is subjected to content analysis like any text. Sound data contains many personal characteristics as well as acoustic features, some of which are not distinct from one another. The spectral sound, converted to digital spectrographs, is matched with sample data and also with pre-stored patterns. Often larger strings or ‘passages’ are checked to search and match a pattern. ‘The reverse process of digital to analog conversion is comparatively simple, but the quality of the synthetic speech is not yet satisfactory’. Sound analysis (speech or music processing) is complex, and requires high computational power and storage capacity. But someday it will offer instant translations, synthetic songs and new techniques of machine (robotics) interaction.





A document is a lot of related knowledge which, when referred to, provides the intended information. ‘A document is a storable lot of information’. Like other storable units, it is modulated according to what it is to contain, and stacked (stored) according to how it is placed, referred to and retrieved. Traditional documents are items like letters, reports, drawings, specifications, procedures, instructions, records, purchase orders, invoices, process control charts, graphs, pictures, etc. Such documents’ ‘pages’, chapters or sections are bound or tied together to avoid disturbing the order of placement. Sub-units of documents also carry a positional identifier like a page, chapter or section number. Documents are stored by their order of arrival, category, size, nature (paper, books, tapes, etc.), author, etc. Documents as sequential data storage systems are also created in the form of index cards, punched data cards, magnetic tapes, etc.

A digital document stores information in bytes and bits, which are pre-sized lots. These data lots or sets can be stored anywhere on the media, as in random access storage systems such as floppy disks, CDs, HDs, etc. A file allocation table (FAT), as a dynamic index system, maintains a record of them.

In modern information technology, information lots or documents are called files. Filed information has a title, a description of contents and the mass of content. Additionally, it occupies space, so it has a size, and a birth context (date/time/location/other circumstances of origin). Beyond these primary endowments, a file may be given different attachments (links and references). A file is the most common unit of information transfer. A file carries many identifiers such as:

time (of origin)

size (of storage, transmission time & effort)

author, contributors

content (index, key words, summary)

place of origin

place of destination, identity of recipient

authority to create, read, write, alter and delete the contents of a file

affiliations, linked documents, preceding and following documents


embedded codes

signs, symbols

mode of communication

limits and conditions of relevance

It is through such identities that a file begins to be relevant or worthy of access. A simple file is static, because its data entities are allocated specific physical space. A complex file may contain variable-size space allocations. There are often filters that decide which of the data entities are to be allocated a free or variable space. Data entities have labels. Data entities in a file remain static or are changeable. The conditions that cause data to remain static or be variable could be external or internal. The internal conditioners are inseparable parts of information files. In static files, the structure remains unaltered even while data entities are changed; the meaning deriving out of the file, however, may change. Static files are easy to process but are not capable of providing qualitative information. Static files usually contain data that is mathematical or substantially logical. In dynamic files, the structure of the file gets altered along with the nature of the data entities. Dynamic files are complex to process, but are capable of providing qualitative information.
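The identifiers listed above could be gathered into a single metadata record; the sketch below is illustrative only, and every field name and value is hypothetical rather than any standard file format:

```python
from dataclasses import dataclass, field


@dataclass
class FileRecord:
    """Illustrative file identifiers; all names are hypothetical."""
    title: str
    created: str                  # time of origin
    size_bytes: int               # storage size, hence transmission effort
    author: str
    keywords: list = field(default_factory=list)  # content identifiers
    links: list = field(default_factory=list)     # affiliated documents
    permission: str = "read"      # authority: read / write / alter / delete


doc = FileRecord("Annual Report", "2004-03-01", 20480, "R. Shah",
                 keywords=["finance", "review"])
print(doc.permission)  # read
```

A record like this is 'static' in the text's sense: its structure (the set of fields) stays fixed even as the values change.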


Hard copy vs Soft copy: A substantial quantity of information is generated as hard copy, i.e. written or printed. It is possible to copy these types of documents, in parts or whole, through processes like carbon copying, scanning, lithography, screen printing, transfer printing, photo and Xerox copying. Some of these processes require specific media. Some processes are capable of enlarging or reducing the scale (microfilms). But the contents cannot be edited, revised or manipulated. A digital data file is often called a soft copy because its contents can be manipulated with much ease. It can also be linked, as a whole or by its parts, to other files or their parts. It can be analysed, dissected, reassembled, rearranged or restructured. Through such manipulations even ordinary-looking data takes on different forms, and new meanings can be established.

Most printed documents are opaque. It is very difficult to superimpose or merge two or more such documents. Digital documents, on the other hand, can be treated as sets of layers or even three-dimensional matrices. Digital documents can be treated as transparent and miscible. AutoCAD creates files as transparent layers. Digital files can be made interactive, i.e. a change in one file can be made pervasive in all other linked files.



Information Resources of Organizations

Data arrives in organizations at periodic intervals or on a continuous basis, but it arrives in parts, which will:

- probably form a whole,

- automatically create a structure with definite boundaries (close ended)

- form an ever growing matrix (open ended).

Organizations receive and generate a lot of data, which have two sets of relevance. Information with distant use is strategic, and will be used for planning and forecasting. Strategic information is more general than any tactical information. Information with immediate use is tactical, and is used for decision making and problem solving. Operational uses of information are very occasion or situation specific.

Information has five qualities:

Brevity (specific to the context),

Accuracy (of the right context or sensible),

Timeliness or up to date,

Purposiveness (capable of causing desired actions),

Rarity (original, novel).

Prime Internal Information Resources (IIR) for organizations are: experience and knowledge that comes with owners, employees, consultants, etc., and data generated from the routine activities. External information once procured by the organization if properly stored can be a great internal asset. The External Information Resources (EIR) are: media based such as books, periodicals, internet, CDs, tapes, etc., and input and feedback from consultants, suppliers, contractors and clients.

External information is inter-organizational, fraternity-level, societal, community, national, or of a universal domain. External information is acquired for a payment of compensation in proportion to its quality, quantity and the acuteness of need. Organizations, as a result, end up paying a stiff price for sourcing external information.

Internal information is personal, departmental or organizational. Internal information resources are nearly free, requiring only processing at a negligible cost, yet they are often ignored. Organizations thrive and proliferate on the quality and quantity of data within their reach. By continuously processing their data, organizations generate synergies that in turn sharpen their data-processing capacity.

Cost of information: Information as a commodity has an ordinary cost if it is universally available and not urgently needed. However, information of a rare or proprietary nature, or information requiring immediate access, can command a high price. Information is also available in many free domains without any obligations. The cost of information is also formed by absolute factors such as the cost of acquisition, processing, storing, retrieval and transmission.


Information systems and emerging forms of business organizations: Information systems affect the structure of organizations and the design of workplaces. An information-networked organization is more dynamic because its workers communicate amongst themselves and with other firms. This provides for greater coordination and collaboration in the handling of projects. It has also ‘led many organizations to concentrate on their core competencies and to out-source other parts of work to specialized companies’. ‘The capacity to communicate information efficiently within a firm has also led to the deployment of flatter organizational structures with fewer hierarchical layers’.

Information systems built around portable computers, mobile telecommunications, and groupware have enabled employees to work virtually anywhere. ‘Work is the thing you do, not the place you go to’. Employees who work in virtual workplaces outside their company's premises are known as telecommuters.

Two forms of virtual organizations have emerged: network organizations and cluster organizations. A network of individuals or geographically dispersed small companies, working over the internet and wide area networks, can join seamlessly through specific protocols to present the multidisciplinary appearance of a large organization. With subsets operating across all time zones, such a network appears to operate 24 hours a day, seven days a week. In a cluster organization, the principal work units are permanent, complemented by a multiplicity of service providers or temporary teams of individuals. A job or project begins to percolate within the cluster and different sub-units react to it, providing their inputs; a solution emerges from apparently fuzzy and often unrelated ideas or concepts. Team members are connected by intranets and groupware.

3.9                             LANGUAGES

A language is defined in many different ways. Some have defined a language as a string or combination of vocal sounds or graphical characters by which communication occurs. Others have called it an expression of ideas. Expression and communication are imperative for language, but the association with sound is sometimes considered an essential component. If sound (speech) were an inevitable part of language, the deaf and mute would not be able to communicate and express themselves. A language as an expression has a representation that is either vocal or graphical.

Two parties speaking to each other convey much of the meaning through non-verbal communication: facial expression, tonal quality of voice, movements (gestures) of the limbs and postures of the body. These paralinguistic activities, like verbal communications, are greatly affected by terrain and culture.

Within a community a language is the basis of expression and perception of information. It is a working system of communication matured over a period of time. Across communities, however, not only are different meanings assigned to different sounds and written scripts, but words and groups of words often carry different meanings. An expression in one language is difficult to transport truthfully into another.

The meaning of a sentence comprises the meanings of the words it contains and the structural or grammatical meaning carried by the sentence itself. Sentences containing exactly the same words in a different order carry a different meaning, because the order of placement distinguishes what is conventionally called the subject from the object. In Sanskrit, however, the object-subject relationship does not depend on the sequence of occurrence. The formal resources of any language for making distinctions in the structural meanings of sentences are limited by two things: the linear (time) dimension of speaking and the limited memory span of the human brain. Writing copies the time stream of speech exactly, but is partially relieved of memory-span restrictions by the permanence of visual marks. Because written texts are almost entirely divorced from oral pronunciation, sentence length and sentence complexity can be carried to extremes, as may be observed in some legal and legislative documents that are virtually unintelligible if read aloud.

Translation (literal meaning) and conveyance of the contents of spoken material and written compositions from one language to another are full of problems. Not only the syntactic structure but also the pitch and stress placement differ. A word or phrase is highlighted by its position, by the active-passive construction (allowing the original subject to be omitted), through stressing (repetition and equivalents), by the nature of intonation (rhyme, metre, commanding, imploring) and by support with gestures. The attitudes, feelings, and social and personal relations between the communicators (speaker-listener, author-translator-receiver) are even more difficult to transpose. Some languages have signs with different phonetic and graphical values depending on the context of use.

In any language only a few words express direct experiences of objective reality; most other words are sophisticated concepts with a very high level of abstraction. Words also ‘have varying degrees of equivalence to one another’. A large number of words in most languages have no direct association with sound, and in pictorial (Chinese-Japanese) languages the relationship with the objects shown is very thin or indirect. This is so because substantial parts of our experience are not directly associated with any kind of sound or its form.

A symbol is something that stands for something else: a sign used to represent something. Words can be called symbols, but many words have evolved so far with associative meanings that a language becomes a huge collection of metaphors, ‘an endless web of interrelated symbols’. A poet manipulates the metaphors ‘instinctively and subjectively rather than with deliberate rationality’.

A child establishes an easy relationship between words and their object-related meanings, but soon learns to express the structured meaning. Such an ability, however, is difficult to acquire for a second or foreign language.

Language used in expression or communication is deliberate to varying degrees. The presentation of information is more succinct if it is purposive, and the information is expected to carry a meaning. Sequencing provides a primary spatial structure; accessing it in the same spatial format conveys the original intent, which is already intrinsically present there. To allow the reprocessing to be automatic (and so faster and massive in scale), some processes need to be interwoven. A Kashmiri master weaver recited the process aloud, but Jacquard, through the use of punched cards, made weaving fully automatic (no human intervention). Something similar was required for census machines. The process, or set of instructions, had two major drawbacks. One: repeat operations had to be redefined in just as much detail each time. Two: there was no method to correct or stop the action on any error (caused by some extraneous reason).




The first programming languages were process-oriented languages, consisting of commands to control a machine. The commands were executed by pressing levers, marked buttons or switches. Such processes were often chained to one another for sequential activation, a rudimentary form of automation. In process languages such as the Kashmiri carpet weavers’ manuals, the commands were recitable names in ordinary language; often strings of words were used to name not only the command but also its purpose. In Jacquard looms, piano-playing scrolls and cuckoo clocks, however, the commands were punched holes or slots. These were machine-readable languages, and used a very small vocabulary. The dash-dot language of early wireless communication is an example of such economical expression. Later binary, hex and many other number formats were used.

Command triggering, which was initially automated through sequencing and looping, also acquired time-delay mechanisms (first by mechanical action and later through electric and electronic devices). Internally generated or supplied information, feedback, feed-forward and other parametric definitions provided the specific conditions for the initiation, continuation and termination of the commands. It was also recognised that certain independent sub-processes could be handled in parallel. For linear processes the command structures -algorithms- are easy to implement, but for nonlinear (branched, looped) processes the commands are often so interdependent that many complicated control-check systems are required.

Algorithms: A machine works on two basic inputs: the raw materials (or data) and the instructions or activation commands (to initiate or terminate a process). An algorithm is an exact formulation of a method for accomplishing a task. However, not all statements or documents of instructions have the logical order required of an algorithm. Algorithms are a must for machine processing of information. An algorithm is an aid to process analysis that is ultimately transformed, through coding, into a machine language.
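The idea of an ‘exact formulation of a method’ can be illustrated with a classic example (not drawn from the text itself): Euclid's algorithm for the greatest common divisor, sketched here in Python.

```python
def gcd(a, b):
    """Euclid's algorithm: an exact, step-by-step formulation of a
    method for finding the greatest common divisor of two integers."""
    while b != 0:          # terminate when the remainder becomes zero
        a, b = b, a % b    # replace (a, b) with (b, a mod b)
    return a

print(gcd(48, 36))  # → 12
```

Every step is unambiguous and the termination condition is explicit, which is exactly what machine processing requires.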

The key idea behind Jacquard's loom was to control the action of the weaving process by interfacing the behaviour of the loom with an encoding of the pattern to be reproduced. To do this, Jacquard arranged for the pattern to be depicted as groups of holes ‘punched’ into a sequence of pasteboard cards. Each card contained the same number of rows and columns; the presence or absence of a hole was detected mechanically and used to determine the actions of the loom. By chaining a ‘tape’ of cards together, the Jacquard loom was able to weave (and reproduce) patterns of great complexity; a surviving example is a black-and-white silk portrait of Jacquard woven under the control of a 10,000-card ‘program’.
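The punched-card principle can be sketched in a few lines of Python (a loose illustration; the card layout and symbols here are invented): each card is a row of bits, and a hole (1) lifts the corresponding warp thread.

```python
# Each card is a row of bits: a hole (1) lifts the warp thread
# in that column, no hole (0) leaves it down.
cards = [
    [1, 0, 1, 0],   # card 1: lift threads 0 and 2
    [0, 1, 0, 1],   # card 2: lift threads 1 and 3
    [1, 1, 0, 0],   # card 3: lift threads 0 and 1
]

def weave(cards):
    """Render each card as a woven row: '#' where a thread is lifted."""
    return ["".join("#" if hole else "." for hole in card) for card in cards]

for row in weave(cards):
    print(row)
```

Chaining more cards simply extends the pattern, just as chaining pasteboard cards extended the loom's ‘program’.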



A programming language is a set of step-by-step instructions (an algorithm) to make a machine work. Such languages predate the invention of computers. The process commands are known as software, as against the machines or computers for which they are designed, the hardware. Developments in software and hardware have proceeded concurrently, each forcing advancements in the other. The evolution of computers and languages can be broadly categorised into generations, each marked by a major technological change. Just as computers have become increasingly smaller, cheaper and more powerful, the ways of programming them have become more complex, often more automatic, and more reliable.



First generation computers (1940-1956) had vacuum tubes for circuitry and magnetic drums for memory. They were very large, consumed a great deal of power and produced excessive heat. In the very early stages the systems were programmed by changing wires and setting dials and switches; later the instructions were fed through punched cards or tapes. These computers relied on machine language to perform operations, and could only solve one problem at a time.



Second generation (1956-1963) computers saw the replacement of vacuum tubes with transistors (invented in 1947). Computers now became smaller, faster, cheaper, cooler, more energy-efficient and more reliable. But reliance on punched cards for input as well as output persisted; magnetic tapes as recording media were on the horizon.

Binary machine-language programmes, though faster in execution, were unfriendly for programme development. Computers now stored some of the instructions in their memory, where they were accessed by symbols (addresses). An assembler programme translated the symbolic instructions into processor instructions.

Computer languages at this stage were of three basic classes:

Machine language: The instructions, consisting of op-code and data, are stored in sequence in contiguous locations of the memory. Writing machine language is tedious and error-prone, and each computer system was built differently and required its own unique method. Programmes written in machine language are called object programmes.

Symbolic language: Here alphanumeric symbols are used for op-codes and instructions (e.g. LDA=load, ADD=add, MOV=move, called mnemonics). These symbols are translated into the matching numeric codes. The software translating a symbolic language into a machine language is called an assembler programme. Symbolic languages are very much machine-oriented.

Procedure-oriented languages: These are closer to natural languages and are often called high-level languages. A source programme written in a procedure-oriented language must be translated into a machine language using a compiler.
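The relation between symbolic and machine language can be sketched with a toy assembler in Python (purely illustrative: the op-code numbers and the `assemble` helper are invented here, not those of any real processor).

```python
# A toy assembler: maps mnemonics such as LDA, ADD, MOV onto
# hypothetical numeric op-codes, the way an assembler programme
# maps symbolic language onto machine language.
OPCODES = {"LDA": 0x01, "ADD": 0x02, "MOV": 0x03}

def assemble(source):
    """Turn lines like 'LDA 5' into (op-code, operand) pairs."""
    program = []
    for line in source.splitlines():
        mnemonic, operand = line.split()
        program.append((OPCODES[mnemonic], int(operand)))
    return program

print(assemble("LDA 5\nADD 3"))  # → [(1, 5), (2, 3)]
```

The output pairs stand in for the object programme; a real assembler also resolves symbolic addresses and emits binary for one specific machine.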



Third generation (1964-1971) computers were chiefly characterised by integrated circuits. Transistors were miniaturized and placed on silicon chips (semiconductors), which reduced energy requirements and increased processing speed. The computers now had keyboards and monitors. The systems were capable of handling many different applications concurrently, through a central programme that monitored the use of memory and processing capacity.

Programming languages now resembled natural language more closely, though they consisted of a stripped-down vocabulary of some two or three hundred reserved words. High-level languages like BASIC, PASCAL, ALGOL, FORTRAN, PL/I, and C were developed.

Traditional programmes were written as instructions to the computer. Object-oriented programming (OOP) instead defines types of data, their structures and the types of functions -operations- on them. It is a programming system in which concepts are represented as objects having data fields (attributes that describe the object). Objects as entities interact with one another to build applications and computer programs. As hardware and software both became complex, it became very difficult to check programmes and verify their implementation. Simula, a modular programming language, was the first to resolve such issues, but OOP really took off in the 1990s. OOP uses techniques like inheritance, modularity, polymorphism, and encapsulation. Functions are modulated as a collection of cooperating objects capable of receiving messages, processing data, and sending messages to other objects, each almost like an independent little machine with a distinct role or responsibility.
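A minimal Python sketch of these ideas (the `Shape` classes are invented for illustration) shows inheritance, encapsulated data fields, and polymorphism, with each object answering the same ‘message’ in its own way.

```python
class Shape:
    """Base class: encapsulates a data field and declares a message."""
    def __init__(self, name):
        self.name = name          # data field (attribute)

    def area(self):               # message every Shape can receive
        raise NotImplementedError

class Rectangle(Shape):           # inheritance: a Rectangle is a Shape
    def __init__(self, w, h):
        super().__init__("rectangle")
        self.w, self.h = w, h

    def area(self):               # polymorphism: its own answer
        return self.w * self.h

class Circle(Shape):
    def __init__(self, r):
        super().__init__("circle")
        self.r = r

    def area(self):
        return 3.14159 * self.r ** 2

# Each object processes the same 'area' message with its own behaviour.
for s in [Rectangle(3, 4), Circle(1)]:
    print(s.name, s.area())
```

Each object carries its own data and behaviour, so the caller need not know which concrete class it is addressing.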



Fourth generation (1971 onwards) computers have microprocessors made up of thousands of integrated circuits built onto a single silicon chip. The fourth generation is also characterised by networking systems like LAN, WAN, the internet and Wi-Fi. The computers of this generation have many different GUIs (graphical user interfaces), the mouse and other hand-held devices.

Programming languages, which had begun to resemble natural language with third-generation computers, now made it possible to build applications without in-depth knowledge of a programming language.

C was developed between 1969 and 1973 as a high-level, flexible language for systems programming. Languages such as FOCUS, SQL (Structured Query Language), and dBASE were developed as query languages for database management and were very close to human language. PROLOG (Programmation en Logique) was a dramatically different logic-programming language making use of the powerful theorem-proving technique. Prolog determines whether or not a given statement follows logically from other given statements; programmes in such languages are written as a sequence of goals. Both Prolog and Lisp have also acquired object-oriented extensions.

Substantial advances in programming-language implementation occurred in the 1980s. The RISC (Reduced Instruction Set Computing) movement in computer architecture postulated that hardware should be designed for compilers rather than for human assembly programmers. In addition, technological improvements in processor speed allowed intensive compilation techniques for high-level languages.

One of the earliest languages designed to be easy to learn and use was BASIC (Beginner's All-Purpose Symbolic Instruction Code), developed in 1964. At a higher level of abstraction are visual programming languages like Visual Basic, in which programmers graphically express what they want done, by means of icons representing data objects or processes and arrows representing data flow or the sequencing of operations. During the 1980s C++ was developed as a version of C that consolidated object-oriented and systems programming.

The ASCII character set was challenged by Unicode, which permitted source code (programme text) in non-Latin-based scripts. The adoption of the high-level language HTML (HyperText Markup Language) allowed nonprogrammers to design web pages by specifying their structure and content, leaving the detailed presentation and extraction of information to the client's web browser.

New multi-language programming tools were required for the Internet in the early 1990s. This was initially done by improvising with the existing languages. The Java programming language became popular through its integration with the Netscape Navigator web browser. Many other scripting languages were also used in developing customized applications for web servers. Java, as both an object-oriented language and a concurrent language (it contains built-in constructs for running multiple threads in parallel), also came to be used for server-side programming.



The fifth generation marks highly efficient, multi-layered and multi-tasking processors, graphics processors, and high-capacity, faster-access storage devices. Right from the first generation of computers, attempts were made at a better interface between man and machine, but only now are they beginning to be realized. Other technologies to emerge include parallel computing, distributed storage systems, and networking.

Japan started (1979) a search for future directions in computing fields such as: inference computer technologies for knowledge processing, computer technologies for processing very large-scale data and knowledge bases, high-performance workstations, distributed functional computer technologies, and super-computers for scientific calculations. The project also envisaged a parallel-processing machine with performance between 100M and 1G LIPS (Logical Inferences Per Second). There were similar projects in other countries: in the USA the Microelectronics and Computer Technology Corporation (MCC), in the UK Alvey, and in Europe the European Strategic Programme of Research in Information Technology (ESPRIT). Two major developments, however, changed the scenario: Apple Computer introduced the GUI, and the Internet made locally-stored large databases less relevant.

Fourth-generation programming languages were designed to build specific applications. Fifth-generation computers were conceived to define and solve problems on their own, without any algorithm written by a programmer. These systems were to be capable of learning and self-organization. Fifth-generation languages reflect the research in Artificial Intelligence.



The next generations of programming show a shift from computation and data management to task handling through many different types of devices: mobiles, iPods, IC mechanisms, integrated analytical systems, biometrics, etc. These systems have programme and architecture seamlessly entwined. The programming sequences are non-sequential (nonlinear) and have intuitive and adaptive human interfaces. The systems integrate Artificial Intelligence, fuzzy logic and neural networking (for distributed processing and storage management) with not only multi-layered processors but also multi-node processing architectures at local and remote locations.



The term Artificial Intelligence has many different meanings, such as:

1         The science and engineering of making intelligent machines,

2         Intelligence as exhibited by an artificial -man-made, non-natural, manufactured- entity.

3         AI is the study of how to make computers do things that people do better.

AI began in the 1950s, when machines capable of human-like conversation and game-playing programmes such as Tic-Tac-Toe, Checkers and Chess were developed. The first attempts in AI were to turn computers into smart game opponents by designing all possible moves, but also to record sequences of moves as possible strategies for the future. Computers were expected to be better at exploring a number of paths and selecting the best one, at a far superior pace. In 1958 Lisp came to be considered the ideal AI programming language; in 1972 Prolog was added to this list. During the 1990s and 2000s AI was very much influenced by probability theory and statistics.

AI methods were extended from games to the learning of languages (visual and speech). AI is now used in commercial operations, stock movements, and search systems for large databases, to detect changes that occur within and outside the norms and mark these out for further human investigation. Jobs considered too dangerous or tedious for humans are assigned to AI robots. Other fields of AI use include: optical character, pattern, handwriting, speech and face recognition, computer vision, virtual reality and image processing, diagnosis, gaming and strategic planning. Emergent frontiers for AI are: artificial life, automated reasoning, behaviour-based and cognitive robotics, cybernetics, biological emulation modelling and computing, concept and data mining, e-mail spam filtering, knowledge representation, and the semantic web.

AI was initially based on symbolic language structures, or logic, to generate the intelligent programme. Some programming models emulated human cognition, but not the brain mechanism. Cybernetics, the science of communications and automatic control systems in both machines and living things, has developed from the study of human cognition systems.

An AI system possesses three essential components:

1         A method for the representation of various fields of knowledge, at different levels of abstraction (each imposing the types of inferences that can be deduced from it).

2         A framework of reasoning -a medium to manage the search through the knowledge base.

3         A mechanism to learn new data from the environment / context within which the system flourishes, and to incorporate the newly-learnt data into the existing setup without destroying it.

During 1982-86 the easy availability of desktop computers stimulated the development of parallel distributed processing and back-propagation learning methods, in small set-ups including those of home users (SOHO, small office & home office). This has led to many convergent technologies that design intelligent systems not just by copying the human cognition system but also by using the brain as a metaphor (as for multi-node or distributed processing).



Neural networking is a computing solution modelled on the cortical structures of the brain, and is therefore also known as a parallel distributed processing network. The cortical structure of the brain consists of interconnected processing elements, called nodes or neurons, that work together to produce an output function.

The information-processing cells of the brain are neurons. The human brain is a network of some 100 billion neurons, each with about 10,000 synapses. (A synapse is a junction between two nerve cells, consisting of a minute gap across which impulses pass by the diffusion of a neurotransmitter.) The biological neural network is composed of a group or groups of physically connected or functionally associated neurons. A neuron receives signals from others through a mesh of fine structures called dendrites, and delivers messages through axons, which terminate on other dendrites or on muscles.

Information processing by neural networks is done in parallel rather than in series (as in earlier computers). In the brain, the collective operation of neurons allows processing to proceed even if some nodes are unavailable, which endows the brain system with very high efficiency and reliability. Neural networks use networks of agents as the computational architecture to solve problems (software and hardware entities linked seamlessly). Neural networks are trainable systems that can learn to solve increasingly complex problems.

A biological neuron may have as many as 10,000 different inputs, and may send its output (the presence or absence of a short-duration spike) to many other neurons. As a result, the biological brain is always more complex than any artificial neural network so far conceived. Artificial neural networks are conceived to complement the operations of conventional algorithmic computers. Certain tasks, like arithmetic operations, are more suited to an algorithmic approach, whereas tasks like inferring a function from observations are more suited to neural networks.

Brain, neural networks and computers: It is accepted by most scientists that the brain is a type of computer, but with a very different architecture from normal computers. The brain is more massively parallel than any multiprocessor computer. This means that any simulation of brain behaviour on traditional computer hardware will be slow and inefficient.

An artificial neuron is a device with many inputs and one output. The neuron has two modes of operation: the training mode and the using mode. In the training mode, the neuron is taught to fire (or not) for particular input patterns. In the using mode, when a taught input pattern is detected at the input, its associated output becomes the current output. If the input pattern does not belong to the taught list of input patterns, the firing rule is used to determine whether to fire or not. A firing rule determines how one calculates whether a neuron should fire for any input pattern; it relates to all possible input patterns, not only the ones on which the node was trained.
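Such a neuron can be sketched in Python as a simple perceptron (an assumption on my part; the text does not name a specific model): the training mode nudges the weights until the taught patterns fire correctly, and the using mode then applies the learned firing rule to any pattern.

```python
def fire(weights, bias, pattern):
    """Firing rule: fire (1) if the weighted sum exceeds the threshold."""
    s = sum(w * x for w, x in zip(weights, pattern)) + bias
    return 1 if s > 0 else 0

def train(patterns, targets, epochs=20, rate=0.1):
    """Training mode: adjust weights until taught patterns fire correctly."""
    weights, bias = [0.0] * len(patterns[0]), 0.0
    for _ in range(epochs):
        for p, t in zip(patterns, targets):
            err = t - fire(weights, bias, p)       # 0 if already correct
            weights = [w + rate * err * x for w, x in zip(weights, p)]
            bias += rate * err
    return weights, bias

# Teach the neuron the logical AND patterns, then use it.
w, b = train([(0, 0), (0, 1), (1, 0), (1, 1)], [0, 0, 0, 1])
print([fire(w, b, p) for p in [(0, 0), (0, 1), (1, 0), (1, 1)]])  # → [0, 0, 0, 1]
```

In the using mode the same firing rule also answers for patterns never taught, which is the generalising behaviour the text describes.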

Artificial intelligence and cognitive modelling both try to simulate some of the properties of neural networks. Though similar in technique, the former aims at solving particular tasks, while the latter aims to build mathematical models of biological neural systems. AI with neural-network technology is expected to offer an ‘autonomous, self-changing, living piece of software’ that creates new agents based on the interaction of the end user and the interface. Many practical applications depend on artificial neural networks that pattern their organization by mimicking the brain's neurons.



Artificial neural networks have been applied to speech recognition, image analysis and adaptive control, and to constructing software agents (in computer and video games) or autonomous robots. Most of the artificial neural networks currently employed for artificial intelligence are based on statistical estimation, optimisation and control theory.

Artificial neural networks are applied to the following categories of tasks:

1         Function approximation, or regression analysis, including time series prediction and modelling;

2         Classification including pattern and sequence recognition, novelty detection and sequential decision making;

3         Data processing, including filtering, clustering, blind signal separation and compression.

Application areas for Artificial Neural Networks also include system identification and control (vehicle control, process control), game-playing and decision making (backgammon, chess, racing), pattern recognition (radar systems, face identification, object recognition and more), sequence recognition (gesture, speech, handwritten text recognition), medical diagnosis, financial applications, data mining, visualisation and e-mail spam filtering.



Fuzzy logic is an organized, mathematical method of handling inherently imprecise concepts. It is specifically designed to deal with the imprecision of facts (fuzzy logic statements). For example, the concept of coldness cannot be expressed through an equation, because it is not a quantity in the way temperature is: there is no precise cutoff between cold and not-so-cold. Similarly, whether a person is inside or outside a house is imprecise if one stands on the threshold. Is the person slightly inside or slightly outside the house? Quantifying such partial states (xx % inside and yy % outside) yields a fuzzy set membership.

Fuzzy logic is derived from fuzzy set theory, dealing with reasoning that is approximate rather than precisely deduced from classical predicate logic. Fuzzy truth represents membership in vaguely defined sets, not randomness such as the likelihood of some event or condition; probability deals with the chances of something happening. Fuzzy logic is therefore different in character from probability, and is not a replacement for it: the two refer to different kinds of uncertainty.
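Fuzzy set membership can be sketched in a few lines of Python (the ‘cold’ thresholds here are invented for illustration): instead of a hard cutoff, a temperature maps to a degree of membership between 0.0 and 1.0.

```python
def cold_membership(temp_c):
    """Degree to which a temperature (deg C) belongs to the fuzzy set
    'cold': 1.0 is fully cold, 0.0 is not cold at all, with a linear
    ramp of partial membership in between (thresholds are assumed)."""
    if temp_c <= 0:
        return 1.0                 # fully a member of 'cold'
    if temp_c >= 20:
        return 0.0                 # not a member at all
    return (20 - temp_c) / 20      # partial membership on the threshold

for t in (-5, 5, 10, 25):
    print(t, cold_membership(t))   # -5→1.0, 5→0.75, 10→0.5, 25→0.0
```

A membership of 0.5 is not a 50 % chance of coldness; it states that the temperature is half-way cold, which is the distinction from probability drawn above.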

Fuzzy logic is used in high-performance error-correction systems to improve information reception (such as over a limited-bandwidth communication link affected by data-corrupting noise). It can be used to control household appliances such as washing machines (which sense load size and detergent concentration and adjust their wash cycles accordingly), refrigerators, rice cookers, camera focussing, digital image processing (such as edge detection) and elevators. Fuzzy logic is also used for video-game artificial intelligence, for language filters on message boards and for filtering out offensive text in chat messages, for remote sensing, etc.


3.10            PROJECT MANAGEMENT

Project Management is a carefully planned activity to achieve an objective. It helps deliver that objective within four parameters:

           1         Defined scope

           2         Time

           3         Cost

           4         Quality of the delivery

Advance planning of the effort and the required resources assures the nature of the outcome. Projects have a finite amount of time, and limited money, human and other resources. Projects are also conditioned by the available technology and by legal, social and similar obligations.

A project is a unique endeavour undertaken:

           - to form a concept

           - create a product

           - render a service

A project could be ‘an idea or concept taking shape in mind or being readied for an outward expression’, a strategy to actualize an idea, way to recollect a happening, estimate the scale of an event, reproduce an experience or a search for a match or fit.

‘Project management is unique in character and distinct from other more traditional, routine and bureaucratic means’. ‘Projects emerge out of circumstances -aided by all kinds of debate and analysis, by managerial or political policy decisions, -as unique ways of addressing social, business and organizational issues, within increasingly complex environments’.

A project represents a unique mix of various conditions, so it is a first-ever effort, even if the effort at sub-levels involves some degree of repetition or known conditions. This is very different from processes or operations (e.g. industrial ones) that are ongoing and do the same thing repeatedly. Even in such situations one can improvise by treating them as unique.

Projects are predominantly either Technical or Procedural, but not exclusively one or the other. Interior Design is an example of the former, whereas marketing or the training of personnel would be an example of the latter. In project management, whatever is unfamiliar and non-routine invariably necessitates all kinds of learning, adaptation and problem solving, and technical projects need more of this.

Projects have four parameters: Scope (defined space), Time, Cost and Quality (concerns). A small variation in one of these affects all the other factors. Projects are so Scope or extent dependent that an increase or decrease in size alters time schedules, changes the cost and affects the quality concerns. A Time-dependent project, when delayed, impacts the benefits or losses arising out of it. A project hastened may require greater or uneconomic inputs, and may remain idle till other connected systems are complete; though with early or accelerated execution, extensive benefits could be derived. Cost generally determines the extent of a project in the early stage, but costs are extremely variable and can change the perception of extent. Quality parameters are worst affected when conditions are abnormal and the survival of an individual or the society is threatened, such as during wars, natural calamities, catastrophes, etc. The best and most challenging project management methods have emerged in such acute conditions.
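The coupling of the four parameters can be illustrated with a deliberately simplified (and entirely hypothetical) cost model, in which enlarging the scope raises both the normal duration and the base cost, while compressing the schedule below its normal duration attracts a ‘crashing’ premium. All figures are invented for illustration:

```python
# Illustrative model of the scope-time-cost coupling: all rates are invented.
def project_cost(scope_units, duration_weeks,
                 normal_weeks_per_unit=2, base_cost_per_unit=10_000,
                 crash_premium=5_000):
    normal_weeks = scope_units * normal_weeks_per_unit  # scope drives time
    cost = scope_units * base_cost_per_unit             # scope drives cost
    weeks_saved = max(0, normal_weeks - duration_weeks)
    return cost + weeks_saved * crash_premium           # hastening raises cost

print(project_cost(10, 20))  # normal pace: 100000
print(project_cost(10, 16))  # 4 weeks crashed: 120000
```

The point of the sketch is only the interdependence: touching any one parameter (scope or time here) moves the others, just as the paragraph above describes.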

Critical Projects: Projects become critical when one or more of the four factors are impacted. Projects can be hypothetically made critical by focussing or de-focussing on one or a few of the aspects. A project in critical mode reveals its weak points or inferior sections. A project is considered as weak as its most inferior section, yet the project achieves a strength equivalent to the average strength of all its sections. Project management systems entail recognition of the dependencies of the four aspects as risks, and provide means for dealing with them (risk management).



The first-ever human efforts of unprecedented size and complexity, such as construction of buildings, cities and facilities, wars, calamities, the writing of epics, and works of art, have been executed as projects. These projects required strategic planning, research, innovations, procuring and transporting the supplies, storage, human resources, and the deployment of tools and equipment. Such projects often lasted for several generations, or were conducted by different people taking over the controls. The three important elements of conducting a project, namely Documentation, Supervision and Feedback systems, helped ‘timeless’ continuity by managing the changed circumstances.

Historically, large projects were initiated by the powerful coterie of rulers who could command large numbers of workers as believers or slaves. The armed forces were the most organised of groups and were the preferred executors. However, the early 19th C. saw the emergence of a different variety of projects. Rich merchants and entrepreneurs began to plan large industrial production units. The industrial projects, in comparison to any other project in history, were conceived, executed and made operational in a very compact time frame. The time compression necessitated new methods of project management.

Just before and during World War II, it was necessary to ensure that production of war materials of all forms matched the anticipated demand, and was supplied to the right place at the right time. For this purpose new planning and forecasting methods were required. After the war, these mathematics-based planning methods developed into a new discipline known as OR -Operations Research. OR is a discipline concerned with the planning, assessment and control of operating systems, such as industrial production, commerce, etc., or virtually any human effort. Interest in the methods for the design and logic of these systems, rather than in their operations, led to another subject, SE -Systems Engineering.

Decision making in design is covered by SE. Decision making in planning of the construction, execution, implementation, operations and the management thereof, is covered by OR. In reality these two disciplines overlap and merge into an overall systematic approach for Project Management.

As a discipline, Project Management has developed from several different fields such as building construction, mechanical engineering, military projects, etc. Two types of mathematical project scheduling models were developed.

The PERT -Programme Evaluation and Review Technique was developed by the United States Navy (with the Lockheed Corporation) for the Polaris missile submarine programme, and the CPM -Critical Path Method was developed (jointly by the Dupont Corporation and the Remington Rand Corporation) for managing plant maintenance projects. Other such tools include the work breakdown structure (WBS) and resource allocation methods.

By the mid-20th C. project managers began to (time) schedule production for the rapidly changing markets (choices, technologies). In the 1950s and 1960s, project planning methods for time management and cost control through inventory, warehousing and transportation management were developed. Much of this development was based on the concept of determining a precedence relationship (that is, identifying which work activities must be completed before other work activities). Methods like CPM and PERT are now offered as software tools to plan and control project costs and schedules.
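The precedence-relationship idea behind CPM can be sketched as a small computation: given each activity's duration and predecessors, a forward pass yields the earliest finish times and hence the project duration. The activity network below is invented for illustration; real CPM tools also perform a backward pass to find the slack (float) on each activity:

```python
# Minimal critical-path (CPM) forward pass over a hypothetical activity network.
# Each activity maps to (duration in weeks, list of predecessor activities).
activities = {
    "A": (3, []),          # e.g. site survey
    "B": (5, ["A"]),       # design
    "C": (2, ["A"]),       # permits
    "D": (4, ["B", "C"]),  # construction documents
    "E": (6, ["D"]),       # execution
}

earliest = {}  # earliest finish time of each activity

def earliest_finish(name):
    # An activity finishes its duration after the latest of its predecessors.
    if name not in earliest:
        duration, preds = activities[name]
        earliest[name] = duration + max(
            (earliest_finish(p) for p in preds), default=0)
    return earliest[name]

project_duration = max(earliest_finish(a) for a in activities)
print(project_duration)  # 18
```

Here the critical path is A → B → D → E (3 + 5 + 4 + 6 = 18 weeks): delaying any activity on it delays the whole project, whereas C carries slack.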

Business was facing challenges of more complex products and services, demands for better quality products, cost-conscious customers, faster development cycles, stiffer international competition, and the need for joint ventures to share risk and for collaboration to leverage expertise. Project management evolved to help business leaders meet these challenges.

Enormous projects are often called programmes, divisible into multiple projects. The projects, in turn, can be broken down into smaller sets of activities. These are further dissected into tasks, or work packages. Tasks are assignments for a person, equipment or a facility (department). Project management techniques are applied to planning and managing activities at all such levels.
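The programme → project → activity → task breakdown described above lends itself to a simple roll-up, where figures attached to tasks (hours, costs) are summed upward through the hierarchy. The names and hour figures below are purely illustrative:

```python
# A sketch of the programme -> project -> activity -> task hierarchy as a
# nested structure; leaf values are task durations in hours (all invented).
programme = {
    "Township": {                                         # programme
        "Housing block": {                                # project
            "Foundations": {"excavate": 40, "pour": 24},  # activity: tasks
            "Superstructure": {"frame": 80, "roof": 32},
        },
        "Roads": {
            "Earthwork": {"grade": 16},
        },
    }
}

def rollup(node):
    """Sum task hours upward through any level of the breakdown."""
    if isinstance(node, dict):
        return sum(rollup(child) for child in node.values())
    return node  # a leaf task: its hours

print(rollup(programme))  # 192
```

The same function applies project management at every level: calling it on a project or an activity rolls up only that branch of the breakdown.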



Project management broadly involves:

         Defining the project objectives (concepts, policy, analysis, design, quality parameters, concerns)

         Defining the means and methods of achieving the objectives (assessing, choosing)

         Planning the resources (estimating, procuring, allocating)

         Time and Space scheduling: Forecasting, decision making and problem solving

         Delineating the project into various components and tasks

         Organizing the project actualization

         Setting evaluation and review methodologies

         Assessing, controlling and providing for risks

         Implementing a feed-forward and feedback system for project execution and operations phases.



A project takes place in Four Stages:

           1         Planning

           2         Design

           3         Execution or Actualization

           4         Operations



The first is the stage of project concept formation, definition and initiation. A project owner, initiator, convenor or user often seeks, or is approached by, a project consultant. A project consultant at a very primary stage could be a multi-disciplinary techno consultant or a financial expert. This expert defines the nature of the project and the expertise required for it. Very often the same expert also defines the time and resources that will be required. The most important decision taken at the planning stage is whether or not to pursue the idea for the project. In order to make this decision, many such questions must be answered:

Is the project needed ?

What will the project cost ?

What will the benefit of the project be ?

How big will it be (compared to similar ones if any) ?

What impact will it have on the environment ?

Who will pay / finance for the project ?

What alternatives are available ?

What are their quantitative and qualitative advantages and disadvantages ?

These decisions are often made by administrators, politicians, and professionals of other fields. Design professionals are generally not involved. However, where designers participate in the planning-stage decisions, the project has a smoother design implementation. A Project Planning Exercise results in a Project Charter or a Project Profile Report. This becomes a benchmark document against which the success of a project is judged.



The design phase follows the planning phase. But in some cases the design phase either begins with the planning phase or has an inseparable identity. Here the design phase begins as a case study exercise during the planning phase. It is continued and developed further, and later perhaps may get separated as a distinctive design phase. Where designers are conceivers or visualizers of the project, the concepts or ideas come through the design (intuitively and logically). Here the planning stage is hardly perceptible.

The design processes can be divided into macro and micro design stages. The decisions made at a macro stage relate to identification and feasibility of large and separable sections of the scheme (or the scheme as a whole). In a way this process is very similar to planning phase. At a micro stage feasible sub sections are detailed and specified. Influence of the planning phase is very thin at micro level.


This follows the design phase. However, some preliminary work may start alongside the design work, like: preparation of prototypes, master batching, a pilot project, samples production, etc. The project execution organization, procedures, and reporting mechanisms are established.

An execution phase of project management includes decisions like:

The extent of activity, Time and Space spread

Material requirement quantities and schedules

Material handling and storage modalities

Plants-equipment-tools requirements and deployment schedules

Physical occupation of the site, site problems

Import/export of goods-equipment-products at site

Financial resources

Coordination of the 4 M's: Men, Machines, Materials, Money

For projects which have passed through very efficient Planning and Design phases, the framework for such decisions is well set.

There is an inherent time gap between the Planning & Design phases and the Execution phase. During this period circumstances change and a reevaluation becomes necessary. External factors such as climate, natural happenings, etc. also affect projects during the execution. At the implementation stage all projects are modified in varying proportions.

Change is a normal and expected part of the construction process. Changes can result from necessary design modifications, differing site conditions, material availability, new technologies, contractor-requested changes, value engineering and impacts from third parties. These changes also need to be documented properly.



The Operations or last phase of project management has many different facets, depending on: who owned or will now own the project; the complexity of the technologies involved in operating the project; the effects of residual liabilities related to the project; and the creation of feed-forward and management of feedback systems that help in rationalising the project operations and improving the future handling of such projects.

Projects have an operations phase either as an obvious supplement, or as a very indistinct or presumptive one. In cases where the project convener is the ultimate owner of the project, the operations phase may not be very distinct. Building or estate development companies plan, design and execute their projects and then deliver the product along with its operational mechanisms. Here the operations phase is intentionally made a separate entity or phase. Yet in most complex projects, subsystems are tested by the vendors in the presence of not only their specifiers and buyers but also their assigned professional operators. This allows the planners, designers and executors of the project to transfer their responsibilities (and so liabilities) to the professional system operators. In the case of turnkey projects, the vendor may operate systems for trials, and then hand them over to the appointed operations agency.

Where projects are small, but of a common or repetitive nature, feedback from the operations phase can help in optimization. But for projects that are rare or unique, the feedback is not immediately available or useful. Owners or conveners who are also the project operators are able to utilize the feedback to conceive a better project next time. Wherever projects are handed over to plant operators, the feedback may not be shared or efficiently used.

Projects that are conceived in terms of results to be obtained, consistently over a long period of time, have a definite conceptual framework. Here the operations phase is an integrated approach of the project management. Projects that involve risks arising out of their sheer existence or operations are planned, designed and executed with integral operational strategy.



All successful projects have documented objectives and deliverables. These documents are mechanisms to align the expectations of sponsors, clients, and the project team. However, where owners, conveners, planners, designers, vendors, executors, supervisors and operators converge, the documentation may get obliterated due to indistinct roles.


Some of the Project documents created by the involved parties are:

         Project Charter, Project Profile Report, Business case model, Feasibility Study, Scope Statement, Terms of reference, Project Management Plan, Project Initiation Document

         Work Breakdown Structure, Assignments, Task lists, Schedules,

         Accommodation of Alterations, Change Control Plan,

         Communications Plan, Reportage system, notifications,

         Risk Register, Risk probabilities, Risk extent, Risk Management Plan (avoidance, mitigation, factors of safety, margins), Risk compensations

         Governance Model, Administrative strategies

         Export - Import Logs

         Actions history Lists

         Resource Management Plan

         Project Schedules, Targets

         Status Report

         Responsibility - authority structure

Such documents were once available only to the relevant project members or owners, but gradually have begun to be available to all project participants. These are now also available, as per ISO recommendations (the ISO 900x and ISO 1400x standards relating to Quality Management Systems QMS and Environmental Management Systems), to all stakeholders. Such documents are hosted on shared resources like web pages.

Public projects, historically long-lasting (monumental) projects and projects with large-scale risks and other difficult-to-predict liabilities are not only well documented through all stages of the Project Life Cycle, but such documents are desirably made open for public scrutiny.



3.11                 DECISION MAKING



Project management involves Decision Making. Decisions are taken on factors that are essentially part of the project itself, and also on various presumptions, which may or may not become part of the project. In the first case the decisions are made on factors that are internal, through a process of selection, confirmation, elimination, etc. In the latter case, the decisions are made from external factors, where not only the relevance, but the entire range of their effects needs to be forecast.

Decisions are primarily taken when an action is required or when further decisions are due. Decisions are taken at: conscious level (intellectual) and subconscious level (intuitive). Actual time and exact place of a decision cannot be identified. However, the context within which certain decisions are made can be known.

Decisions are taken through:

Analysis: Dissecting a whole into parts so to understand it better.

Synthesis: Combining several things to form a whole to see if it is pertinent.

Holism: Conceptualizing the whole thing.

The quality of decision is governed by the decision makers’ state such as: physiological fitness, mental alertness, personality traits (daring, fear), information, training, experiences, opportunities, time, resources (human, equipment, finance, circumstances), etc.

There is no perfect decision. A decision is the best course for a given situation. Decisions do not have a mathematical sharpness or uniqueness. There could be many different ways of achieving the same goal. But in reality each action is likely to take place in a different time and space context, so decisions pertaining to them are likely to be different.

The efficiency of a decision is judged on how much it accomplishes, and in what time. A reasonable decision always takes one closer to the goal, however slightly. Effective decision makers are fully aware of the working of their decisions. They invariably have the capability to improvise or correct the situation as decisions actualize.


Decision makers ask questions like:

Is the objective defined ?

Is sufficient information available ?

How many options are available ?

Have these options been evaluated ?

Are all risks identified and provided for ?

Does this decision feel right, now that actions are being taken on it ?

Decision making and consequences thereof (actions or further decisions) are often so interlaced that it is not possible to view them separately.

Decision making comprises:

1 Forecasting the most opportune moment and the most obvious context for the consequences to occur, or even not to occur.

2 Determining the probabilities of occurrence or of follow-up actions.



Decision making involves some degree of problem solving. Alternatively, it can be said that problem solving itself is a decision making process. In decision making some intuitive and alogical processes are operative, but problem solving occurs in a more realistic situation. Problem solving can be defined as an exercise of observing situations vis-a-vis change-causing elements.

To solve a problem, it is necessary to separate it as a unique entity or event, severing its connections and dependencies with other entities and happenings. To make such a dissection, one has to define the level and intensity of the various connections, with the following objectives:

◦ Are other objects and happenings causing changes in the problem on hand ?

◦ Can the problem on hand cause a change in other entities and happenings ?

◦ Are the interactions occurring both ways ?

There are six categories of problems that require solving.

Mysteries: A mystery is an unexplained deviation from what is expected. Our efforts go into finding out what has caused the deviation. But for a mystery to be recognized, it is necessary to understand what constitutes a deviation (size, scale, measure, range, etc.) and what forms the standard. A deviation is not necessarily a bad or foul thing; it may be an advantage or even a gain.

#         Mysterious problems get tackled as soon as deviation causing elements are identified.

Assignments: These are enforced exercises. There is a party which assigns the work and another which undertakes the task. It is like a contract, wherein goals or tasks must be properly defined, resources allocated, and delivery standards specified.

#         Assignments deal with known things but involve application of skill and management techniques.

Difficulties: A difficulty occurs for two reasons: either we do not know how to manage a situation, or we feel that we lack the resources. Difficulties are subjective or objective. In the first case, the problem lies within the person: the person has the capacity, but is unable to accomplish the task. In the second case, the problem lies outside the person: the person may not have the talent, know-how, motivation, resources, etc.

#         Difficulties, if subjective in nature, require human resources, whereas objective ones need other physical inputs.

Opportunities: Efforts are made to recognize the problem and its context as a specific situation. The specific situation is looked upon as consisting of opportunities. Evaluation of opportunities in terms of the potential benefit or loss leads to solution of a problem.

#         Opportunities are time related, and so need to be perceived well before, or as soon as they occur.

Puzzles: A Puzzle is a situation where we know a correct solution exists, but sufficient efforts have not been made to discover it. Puzzles are of three types: soluble, currently insoluble and ever insoluble. Soluble puzzles can be tackled with current knowledge. Currently insoluble puzzles will hopefully be solved when resources and information are adequately available. Puzzles remain ever insoluble when certain important sections are irretrievably lost.

#         Puzzles have an inbuilt solution, so in a real sense there is no need to solve anything, but to locate the solution and identify the way to reach it. Puzzles are solved as soon as the end itself, or the means to the end, are in sight.

Dilemmas: Dilemmas offer two or more choices, each of which seems equally fitting. Dilemmas remain in-force only for a particular time span, situation or value judgment. So dilemmas if handled by a different person, attended at another time, or dealt in another situation, may not be a problem at all.

#         Dilemmas pose as a twin offering, of which only one need be appropriate. So if the problem is probed further and deeper, and separated from its dependencies, one of the solutions is likely to be just slightly superior or less inferior.



There is often no right decision, but just competing alternatives with their inherent risks and consequences. Decision making also involves forecasting the end results. It involves determining the chances, frequency and intensity of occurrence or non occurrence of an action.

Forecasting is required, because at the decision making location and moment:

1. sufficient data is not available,

2. some problem solving exercises are incomplete,

3. resources are not available,

4. time is insufficient,

5. there is a lack of experience or competence.

Decision making in many instances involves the selection of one course of action over many others. Most often a decision is valued on what it ultimately achieves, and how efficiently. Since the outcome of a decision is always in the future, it needs to be forecast. However, when the quality of the outcome, or the approximation to a goal, is nearly the same for all the possible actions, one needs to bring in a value judgment. Forecasting the effects of decisions helps in making better decisions.


Forecastable situations are inherently probable. A human being cannot operate on a situation that is never probable. However, probabilities are either deterministic or indeterminable.

Deterministic probability: A hill station is likely to be a cool place, because all our experiences have taught us that height and coolness of a place are correlated. Determinable probabilities have fewer operative factors, so chances of probability are much focussed.

Indeterminable probability: An oil well may give oil, which however, may or may not occur. And the oil, if it occurs may not have a commercially viable quantum. Such situations pose many uncertain factors.
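The oil-well situation can be sketched as an expected (probability-weighted) value calculation, a standard way of handling such indeterminable probabilities. All the probabilities and money figures below are invented for illustration:

```python
# Expected-value sketch of the oil-well decision. Probabilities are in
# percent (integers, to keep the arithmetic exact); payoffs are invented.
outcomes = [
    (15, 50_000_000),  # 15% chance: commercially viable strike
    (25, 5_000_000),   # 25% chance: oil found, but marginal quantum
    (60, 0),           # 60% chance: dry well
]
drilling_cost = 8_000_000

expected_payoff = sum(p * value for p, value in outcomes) // 100
emv = expected_payoff - drilling_cost  # expected monetary value of drilling
print(emv)  # 750000
```

A positive expected monetary value argues for drilling on average, but the calculation also shows why the decision remains uncertain: the single most likely individual outcome is still a total loss.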

3.12              DESIGN PROCESSES

There are many ways designing is handled. There are obvious conditions like: the nature of the output, presentation tools and methods, scale of detail, nascent effort or routine application, human and other resources available, technology involved, etc. But the most important one affecting the output quality is the Technique of Design, or the Design Process. Some of the important design processes are discussed here. These processes are nominally not comparable.

           1         Holistic approach

           2         Component approach

           3         Redesign or Re-engineering

           4         Concurrent engineering or Simultaneous design



A design effort that initially conceives a complete and self-contained system is called a Holistic approach (whole to the part). The holistic approach entails the germination of an intuition into a complete system. Such creations are very personal, akin to a work of art, often not functional, and not necessarily reproducible. The holistic approach is useful in areas where sufficient information is unavailable, or where there is a distinct disinclination to search for the detail. It is inadvertently followed when inspiration rather than logic causes a design. If a holistic concept and its execution are distanced in time, some recall is required, forcing documentation, and the holistic approach may not remain as wholesome.

The term holism was introduced by the South African statesman Jan Smuts in his 1926 book, Holism and Evolution. Smuts defined holism as the tendency in nature to form wholes that are greater than the sum of the parts through creative evolution.

The whole is more than the sum of its parts -Aristotle.

Holism (from holos, a Greek word meaning all, entire, total) is the idea that all the properties of a given system (biological, chemical, social, economic, mental, linguistic, etc.) cannot be determined or explained by the sum of its component parts alone. Instead, the system as a whole determines in an important way how the parts behave. Reductionism is sometimes seen as the opposite of holism. In science, reductionism holds that a complex system can be explained by reduction to its fundamental parts: chemistry is reducible to physics, biology is reducible to chemistry and physics, and similarly psychology and sociology are reducible to biology, etc. Others consider holism and reductionism to be complementary viewpoints that together offer a proper account of a given system.



A complex entity is perceived as if composed of several subsystems, each of which is already substantially functional. Here one is required to solve the inter-subsystem relationships, and while doing so, upgrade the original subsystem, or possibly select a new subsystem. The component approach (parts to the whole) provides systems that are reliable, but usually traditional. Where situations demand a radically different or novel solution, the parts-to-the-whole design approach is often inadequate. The component approach requires one to have a complete overview of the system, and to be able to recognise the value of the component in the whole. This is rather simplified by recognising the time and space extents of the subsystems. The components dwelling or manifesting within such defined domains are not much affected by conditions beyond their boundaries, so they can be dealt with easily. The component approach creates systems with some regimentation, where subsystems have predictable dependency and yet are replaceable. Component approach systems are fairly mature. Modern-day automobiles and computers are examples of this, but robots are not.



Most products, however claimed to be original, are only improvised versions of some existing thing or a Redesign. This is a well accepted design process for products’ development. It has perhaps, a little less relevance in design processes of unique or first ever systems, such as Civil structures and Architectural entities.

Japan perfected the process and achieved distinctive product design solutions in the early 1960s. The Sony Walkman music system evolved from such efforts. At that point in time, taped music systems were very bulky and heavy. To enjoy Hi-Fi sound quality outdoors, one had to have large twin speakers, a large battery for power supply and spool-type tapes. The Walkman, as a redesigned entity, became a very innovative product.

Manufacturers need to design new products and launch them before their competitors do. Redesign or Re-engineering is used for product development of automobiles, `white goods', office equipment, etc. For this, markets are continuously surveyed to find out the features that make certain products leaders in the market. An attempt is made to absorb and improve upon such features. As one is working with a successful subsystem, the chances of its failure are less. Redesign generates a product in its new avatar.

Redesign addresses the deficiencies of aging technologies, fast-changing tastes and varying operative conditions of products. It gives very specific clues as to which new features are accepted and which are the emergent technologies. It also allows faster incorporation of new technologies, as new subsystems being offered by inventors and innovators are sought. New products are launched with minimum changes to existing tools and plant. Workers only need to upgrade their skills, and new employees or new training schedules are not required. The improvised product has a slight familiarity with the existing range, and as a result the comfort of acceptance is high.

Redesign practitioners operate with notions that:

         A whole system is divisible into subsystems, each of which can be improvised.

         These subsystems can be improved in-house, but technologically better solutions are being developed by others, so identify them and collaborate to resource such emergent solutions.

         It is more efficient to redesign or re-engineer a known system, than to go into basic research to discover a new entity.

         A product of redesign process has fewer chances of failure, because one is improvising upon a working system.

         Transfer or absorption of new Technologies is very fast.

Redesign processes require a lot of field surveys for the identification of a market-leading product. The field data is often so enormous, and with minor or rare variants, that it may require statistical processing. Very often feedback from consumers is subjective in nature. There is a distinct danger of the design leader or team getting entangled in the data collection and interpretation work at the cost of essential design clarity and creativity. Redesigners have to be very careful about infringing the intellectual property rights of others. It is also extremely difficult to secure patents, copyrights, etc. for such products.

Organizations, that deal in very competitive markets, prefer the redesign processes as it allows them to continuously update their product with minimum of risks.



Concurrent engineering or Simultaneous designing has some bearing on component approach for design. However, the implications here relate to entire project and not just the product. A product or an aspect of a project is recognised as an entity designed, produced and operating in a larger context. Till recently, these were considered separate task modules and handled sequentially. Whenever a major change was proposed everything had to be reset, forcing rethink and rework. It increased the ‘development time’ of a project.

Integrated Product Development (IPD), as Concurrent Engineering is sometimes referred to, allows several teams to work simultaneously. It brings together multi-disciplinary teams working in diverse locations, taking advantage of local talent or resources, the daytime zones and climatic conditions. The teams could be departmental, outsourced facilities or freelancing entities.

The simultaneous approach needs live or virtual linkage channels for very fast communication. Concepts, ideas, designs, specifications and alternatives are exchanged instantly, and shared with the project leader, the teams handling specific tasks, and often all stakeholders. Sharing may also be through a public domain like the Internet, allowing anyone to pass an opinion or make a business offer. Concurrent engineering offers gains such as reduced product development time and cost, reduced design rework, and improved communications.

For example, a significant design change in structural design of a bridge span will affect design of many other sub systems. It could mean change of loads on the columns, foundation structures, scaffolding requirements etc. Each of these would have new design parameters, but with electronic drafting tools and instant communication means, all design changes can be apparent to all the concerned agencies, immediately.

Concurrent Engineering or Simultaneous Designing works with following notions:

         A system can be perceived as consisting of several independent, and inter-dependent subsystems. The nature of the dependency is defined so that the subsystems can be dealt by the same team (sequentially) or by different teams (simultaneously -in parallel mode).

         Association of different teams allows superior technological input. Different teams working in parallel mode offer a faster throughput. Teams located in different time zones do not fully operate in parallel mode, but offer the advantage of local technologies and round-the-clock daylight working hours.

         Virtual parallel processing of projects occurs in many different ways. Databases, spreadsheets, CAD drawings and other documents can be altered by many different users, with each version or layer identified separately and a possibility of assimilating (merging) it selectively.

         Present-day high-speed virtual communication (broadband internet, video conferencing) allows changes to be proposed, confirmed and accommodated in real time.

         The evolution of design becomes participatory. It does not remain restricted to hired or appointed experts, but becomes a public domain affair, with inventors, innovators and other freelancers offering novel ideas. Such offers are usually on a try it - like it - buy it basis, i.e. without any consultancy charges or purchase-payment obligations.

Concurrent Engineering or Simultaneous Designing works best when resource constraints are very acute. It also works well where technologies that are uncertain or less defined now can be included later. It helps complete projects in the shortest possible time and maximises the profit or advantage. It matches tasks to available human resources and machine capacities. An organization dabbling in off-the-track jobs, which cannot suddenly recruit new employees, upgrade the competence of its staff or resort to overtime payments for the extra work, cannot efficiently use concurrent engineering. Concurrent engineering or simultaneous designing is one of the best methods to infuse new technologies, adjust to erratic finance flows and cope with external factors like climate, political conditions, etc. These methods allow the use of human and other physical resources, however remote they may be.






A designer, as a professional, strives to assure that projects, when completed, provide the intended benefits with the planned level of inputs. Such assurances are needed at many different levels. A designer needs to assure the project initiators, project users (owners or product buyers), project operators and society. Such assurances, regarding the project, translate into a pursuit of quality.

Quality represents the fundamental economics of the input-output equation. The emphasis is upon maximizing achievement and value addition, while minimizing process effort and resource wastage.

`The concept of quality is the totality of features and characteristics of a project, product or service that bear on its ability to satisfy stated or implied needs' (ISO 8402). An enhancement of satisfaction is the key element of quality conscience. Quality is both a perception and a value judgment, concerning human satisfaction; the basis for both is ever changing.

Quality results from a three-way interaction between:

1         The nature of the project, product or service, as perceived by the originator, i.e. the thing in its own entirety.

2         The user's original needs and altered expectations, as a result of interaction with a completed project or product.

3         The operations or functioning of a project, product or service, as reflected in training, servicing, parts availability, ease of replacement, warranties etc.

The characteristics of the project, product or service by themselves cannot determine the measure of quality. Quality is an issue of how the projects, products or services are carried out or employed, and also of how the external conditions support the usage. A product that is satisfactory in every respect may fail if the external use conditions are drastically altered.


Quality in interior design jobs results from an interaction between `what the interior is' and `what the users do with it'. There are several contextual issues against which quality judgments are made, like comfort level, variety, novelty, prestige, economy, etc., with their social, cultural, psychological, political and other dimensions. These secondary issues are considered fairly predictable and stable, but projects that coincide with major change phases of the secondary issues fail to serve in terms of the changed quality perceptions.

An interior designer prepares a project brief by determining all requirements, such as clients’ needs and demands, technical requirements, statutory obligations, prevailing standards, current styles, available technologies, etc. The user-client usually may not understand these aspects, and so in good faith allows the designer to proceed.

As the design gets underway, the design presentations, in colour, 3D format and now in virtual animations, make the user-client ‘truly’ react to the design. The client, in the meanwhile, due to the subjective involvement, becomes very perceptive to all issues of interior design, and begins to absorb new ideas from friends, media, etc. Such an awareness on the part of a client completely changes the perceptions. A designer should see this as inevitable and be prepared to modify the design at a later stage.

As the project materialises on the site, the user-client has the first life-size or realistic experience of the designed entity. Once again the designer faces a barrage of new demands, requiring a substantial or even complete rethink of the design.

As a project is delivered to an actual occupying user (who could be a new person, different from the assigning-executing client), the designed entity is revalued. The new occupant, who may not bother to involve the original designer, begins to re-validate the entity on ‘what the (his/her) personal space should be’. This could be based on the sum effects of many factors like cultural roots, aspirations, economic status, etc.

Interior designers, as professionals, have an interest in seeing clients derive satisfaction during the project execution phase, by adequately answering their queries, offering convincing explanations, and by providing economic and technical comparisons amongst various options. Interior designers continue to satisfy their clients even after completion. This helps clients come back to the original designer for the next interior design job. In interior design, the next job usually arrives within five years, unlike in architecture, where it may not happen in the current generation, i.e. not before 20 to 25 years.


To achieve quality meticulousness, an organization must offer products or services that:

           a         meet a well defined need, use or purpose,

           b         satisfy customers' expectations,

           c         comply with applicable standards and specifications,

           d         comply with statutory requirements and other social obligations,

           e         are made available at competitive prices,

           f          are provided at a cost which will yield a benefit or profit to the user.

For developing quality meticulousness, it is very necessary that all matters relating to quality control are well documented. A well-documented brief serves as a benchmark for assessing the level of quality being achieved. Wherever quality control documents are formal, transparent and accessible to all stakeholders (clients, users, public and competitors), the projects, products and services have greater quality assurance. Such entities are more acceptable.

In order to meet these objectives, an organization should keep the technical, administrative and human factors affecting the quality under control. Such controls are oriented towards the reduction, elimination and prevention of quality deficiencies.





3.14               RISK MANAGEMENT

A great deal is expected from every human endeavour. Entities, events or organizations are set up with an expense of resources, effort and time. Planning and operative care are imperative. Yet these human endeavours may fail to take off, perform adequately, or satisfy their stakeholders. Risk is any such factor that adversely affects an entity, event or organization. Risks are both natural and man-made. All risks that do not have an explainable base, in other words risks that cannot be ascribed to human causes, are considered natural.

Risk management is a process of

1         Identifying the risks

2         Assessing the risks (scale of impact)

3         Prioritizing (sequencing of risks in terms of their severity of consequences and chances of occurrence)

4         Mitigating the risks (by way of monitoring and controlling the probability and by way of absorption and diversion of consequences).
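The first three steps above can be sketched as a small risk register. This is an illustrative sketch only; the risk names, probabilities and impact figures are invented, and the common expected-loss score (probability times impact) is used as the prioritization rule.

```python
# A minimal risk-register sketch: identify, assess, prioritize.
# All risk entries below are hypothetical examples.

def prioritize_risks(risks):
    """Rank risks by expected loss (probability x impact), highest first."""
    return sorted(risks, key=lambda r: r["probability"] * r["impact"], reverse=True)

register = [
    {"name": "fire",           "probability": 0.02, "impact": 500_000},
    {"name": "supplier delay", "probability": 0.30, "impact": 40_000},
    {"name": "design rework",  "probability": 0.50, "impact": 15_000},
]

for risk in prioritize_risks(register):
    expected = risk["probability"] * risk["impact"]
    print(f'{risk["name"]:15s} expected loss: {expected:>10,.0f}')
```

Note that the rare, high-impact risk (fire) ranks below the frequent, moderate one (supplier delay) on this measure; a real prioritization would also weigh severity of consequences separately, as the text describes.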

Risk management has been recognised as a generic standard in the ISO 31000 series. Risk management processes are applied to project management, security, engineering, industrial processes, financial portfolios, actuarial assessments, and public health and safety.

Risks are broadly categorized as Natural or Circumstantial and Man-made or Intentional.

Natural or Circumstantial failures originate from outside the system, due to the context or changes in the environment. This could be perceived as an advantage, in that the system can be isolated with a barrier. But some systems have to participate with the environment to flourish, and cannot be isolated. Circumstantial failures are accidental, i.e. unpredictable in scale (size) and time of occurrence.

Man-made failures are termed intentional because of the human involvement with entities, events or organizations. These occur because the conception, observance or operations of the system are faulty. They can be set right by foresight, flexibility of approach (such as adopting an ‘open system or open-ended architecture’), provision of additional capacities, and by including escape or safety procedures.

           Some of the man-made failures occur, because:

                    System is not designed or adequately equipped (technically) to serve the nominally expected functions.

                    System is required to serve functions for which it is not designed, and there are no processes to regulate the over-use, mis-use, under-use or non-use.

                    System has a rigid design, structure or setup regimen which prevents corrections or improvisations.

                    System is so liberal that a coordinated emergency action plan cannot be enforced.



‘Risk is any factor affecting an activity or object that denotes a likely negative impact from some present process or future event’. Contrary to this, some believe risks often carry an advantage; a lottery, for example, may provide an unusually large gain for a very small loss. A negative risk is valued against the scale of loss and the frequency of occurrence.

Purchasing a lottery ticket is a risky investment with a high chance of no return and a small chance of a very high return. But since the amount lost is small and the possible gain very large, many people go for it. In contrast, investing money in a company involves a large investment, so we take care to verify the credentials of the company. A government bond, though it provides only a small interest, is considered less risky. In finance, the greater the risk, the higher the potential return.
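The lottery-versus-bond comparison can be made concrete with an expected-value calculation. The probabilities, prize and interest rate below are invented for illustration only.

```python
# Expected value of an outcome set: sum of probability x payoff.
# The lottery and bond figures here are hypothetical.

def expected_value(outcomes):
    """outcomes: iterable of (probability, payoff) pairs."""
    return sum(p * payoff for p, payoff in outcomes)

ticket_price = 10
lottery = [(1e-6, 1_000_000), (1 - 1e-6, 0)]   # tiny chance of a big win
bond = [(1.0, ticket_price * 1.05)]            # near-certain 5% interest

print("lottery net EV:", expected_value(lottery) - ticket_price)  # negative
print("bond net EV:", expected_value(bond) - ticket_price)        # small, sure
```

The lottery's expected return is negative despite the huge possible payoff, while the bond's small return is nearly certain, which is the trade-off the paragraph describes.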


Risks to personal health are reduced by preventive actions, like avoiding illness-causing situations. Secondary prevention can come by early diagnosis and perhaps a preventive regimen and treatment. A third level of action is directed at terminating the negative effects of an already established disease, by restoring function and reducing disease-related complications.



         Determinable Risks are predictable. Certain factors trigger such risks, so observation and reporting mechanisms for such conditions can help avoid them.

         Probable Risks are predictable, but only within limits of probability, and the trigger factors are not easily definable. Historical experience shows what the scale of impact and pattern of occurrence will be. The impact can be spatially isolated and temporally limited, by design of the joints and connections, and by spacing and distancing. The occurrence schedules may be matched with a timed action or even planned dormancy. Additional capacities (factors of safety, safe margins) are provided for such contingencies.

         Indeterminable Risks have a very low probability, or the twin aspects of scale of impact and pattern of occurrence are indeterminable. The damage and suffering cannot be predicted. Their mitigation is left to the society of the concerned age.



One can avoid, manage or accommodate the risks to a limited extent. Beyond these, the effects of risks have to be compensated, replaced or transformed in such a way, that there is a sense of equilibrium. One may not be able ‘to reestablish the lost entity, reenact the missed event, or resurrect the dead system’, but one may indemnify against such losses.

Indemnity: (origin = Latin indemnitas, from indemnis 'unhurt, free from loss') 1 security or protection against a loss, hurt, damage or other financial burden. 2 security against or exemption from legal penalties, liabilities or responsibility for one's actions. 3 a sum of money paid as compensation, especially by a country defeated in war.

Dorfman (1997) prescribes four strategies for managing risks: Tolerate (retain), Treat (mitigate), Terminate (eliminate), Transfer (buy insurance, hedge). Ideal use of these strategies may not be possible, as some of them may involve trade-offs that are not acceptable to the organization or person making the risk management decisions. Another source (US Department of Defense) calls this ACAT, for Accept, Control, Avoid, and Transfer.

Risks are managed in three ways:

         Perceive the likely scale of affectation,

         Determine the chances of occurrence,

                    Prioritize risks as per their scale and nature of impact

         Develop strategies

                    to control the effects,

                    to recover the earlier condition, as close as possible

                    to compensate the losses to people and organizations.

                    to replace a high risk situation with low level risk (less severity of affectation or predictable occurrence).

A process of prioritization has many different facets. Saving lives is given a higher priority than salvaging goods and equipment. Evacuation of human beings gets greater priority than saving a structure. But many countries feel that sacrificing a human life may be unavoidable, rather than surrendering to a terrorist hostage situation. Risks with greater probability and higher monetary loss (of replacement) are handled first.

Risk management includes equating the cost of controlling a risk against the cost of compensating the losses. It also includes evaluating the cost of recovery against the expense of compensation, and justifying the cost of being prepared over a long duration for an event that has a low probability.
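The control-versus-compensation comparison above is, in essence, a check of whether the cost of a control measure is lower than the expected loss it removes. The sprinkler figures below are hypothetical, used only to show the arithmetic.

```python
# Compare the cost of controlling a risk with the expected loss avoided.
# All probabilities and sums are illustrative assumptions.

def worth_mitigating(prob, loss, mitigation_cost, reduced_prob):
    """True if the mitigation cost is below the expected loss it removes."""
    expected_saving = (prob - reduced_prob) * loss
    return mitigation_cost < expected_saving

# Sprinklers cut fire probability from 2% to 0.5% on a 500,000 loss:
# expected saving = 0.015 * 500,000 = 7,500, versus a 5,000 annual cost.
print(worth_mitigating(0.02, 500_000, 5_000, 0.005))   # True
print(worth_mitigating(0.02, 500_000, 10_000, 0.005))  # False
```

The same comparison, run over a long duration, is what justifies (or rules out) being prepared for a low-probability event.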



Risk avoidance is just one important aspect of risk management. It means ‘controlling all detrimental activities’. But not all risks can be avoided and thereby managed. Some risks are delayed, hastened, diverted, or even embraced. Avoiding risks also means losing out on situations with very high gain potential. Many take a ‘calculated plunge’ for a small or rare risk.

For example, it is perceived that taking on a new client (new project) translates into larger profit for the business. That is not always so, because the bother of dealing with an unusual or odd client could actually mean less or no profit for the organization. Another example would be procuring a non-standard product (without full guarantee and warranty) for an acute need.

Risk reduction involves methods to reduce the severity of the loss. In buildings this includes fire escapes, controlled use of combustible materials, installation of sprinklers with fire detectors, etc. The cost of such risk reduction systems is checked against what they can save or prevent.

Risk retention means the person or party bears the loss resulting from an event. This is a viable strategy for small risks, where the cost of insuring and getting compensation would be greater, as with minor illnesses or injuries. All risks that are not avoided or transferred are presumed to be borne or retained by the person or party.

Risk transfer Risks are transferred to another party by contract or by hedging (as in betting). Insurance is one type of risk transfer that uses contracts. Risks are transferred to another party, rescheduled to another time, or shifted to a different location. The pace of transfer is often hastened or slowed, and the effects are concentrated or spread. Risks of injury due to localized (and so intensive) impact are spread over a wider area by means such as a helmet, a car airbag, a knee pad, a seat belt, etc. Impact buffers and similar stopper mechanisms absorb the impact or divert it.

Some biological systems, pliant compositions and pseudo-intelligent entities (e.g. some equipment with fuzzy logic and neural networks) have the capacity to self-regulate or self-organize to accommodate conditions of change. Such systems are inherently restricted or finite in capacity. Their risk-sensing and accommodative functions are available only so long as the required energy and other inputs are available. Designers strive to emulate such systems by integrating risk-handling features (such as gas and fire detectors, auto sprinklers, auto open-shut opening systems, burglar alarms, earthquake and heavy wind load absorbers, etc.) into their creations.



Risks result in losses, delays, setbacks and death (system termination). Ideally, on one hand the expenditure on risk management must be minimized, and on the other the risk-safe zones and periods (mean time between failures) must be maximized. Yet sometimes risks are indulged in, or ignored, in view of the benefits.

In commerce, risks are of two types: Inherent risks are part of any business operation, and affect the profits or opportunities negatively. Incidental risks are natural, and not always part of, or due to, the business activity.

In case of risk insurance, only risks that are stated distinctly in the contract are covered against the premium; all other risks (including unknown and indeterminable ones) are presumed to be borne by the insured party.



Projects are designed to achieve certain goals effectively. Any event that affects these objectives, partly or fully, is considered a risk. Risks in projects are identified by enacting various scenarios (combinations of various possibilities occurring together). A scenario, if risky, is further probed to assess its potential severity and extent of loss. These two quantities are simple to measure when set as the value of the damaged component. However, the probability of occurrence is difficult or impossible to assess, due to lack of history of such events.

The fundamental difficulty in risk assessment is determining the rate of occurrence since statistical information is not available on all kinds of past incidents. Furthermore, evaluating the severity of the consequences (impact) is often quite difficult for immaterial assets. Asset valuation is another question that needs to be addressed. Thus, best educated opinions and available statistics are the primary sources of information.

Provisions for various risks tend to have a cumulative effect. For example, a building foundation is designed to carry the load of the building, with additional provisions for earthquakes, hurricanes, temporary loadings, etc., but not all of these are likely to occur simultaneously. Similarly, we provide extra margins for individual considerations: load calculations, strength of cement concrete and steel bars. These, if not properly attuned, can add up to substantial overspending. All provisions for risks need careful working-out for the individual as well as the cumulative effect.
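The cumulative effect described above arises because individual safety margins compound multiplicatively. The factors below are invented for illustration, not real code values from any standard.

```python
# Individual safety margins multiply together, so modest per-item
# provisions can add up to a large overall over-provision.
# Factor values are illustrative assumptions.

from math import prod

factors = {
    "load estimate":     1.25,
    "concrete strength": 1.50,
    "steel strength":    1.15,
}

combined = prod(factors.values())
print(f"combined factor: {combined:.2f}")  # larger than any single margin
```

Three seemingly small margins of 15-50% compound to more than double the nominal requirement, which is the overspending the paragraph warns about.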

Interior design projects fail to satisfy a client, or are commercial losers, for a variety of reasons, like shifts in taste, changes in market demands, arrival of new technologies, prices, etc. Many of these factors become operative when a project’s execution is long drawn out or delayed. Finishing projects on schedule eases many such problems. Projects become risky due to poor definition of the project requirements, and lack of complete understanding and acceptance of the project profile report by the client. Interior design projects often fail due to iron-clad specifications, which may not allow correction or improvisation during execution. The risks on this count can be taken care of, for example, by keeping certain ‘windows open’ for later formulation or decision: even while specifying the quality and the price range, the colour and texture selection remains open.

A designer must be extremely careful of individual warranties and guarantees that, when read in combination, often cancel each other out. Complex interior design projects formed of several systems (offered by equally varied vendors) can have conflicting provisions.



Insurance is a risk management investment. By paying a small sum, the premium, risks are conditionally insured. The compensation is invariably for providing an equivalent product or commercial value (at the time of loss) in monetary terms. Emotional and other associated considerations (the nose of an actress, for instance) are often insured, but by determining a fixed value for them before a contract is made. Value for the loss of life is an example of a similar nature. Loss of opportunity, such as earnings or business lost due to sickness, injury, strike, riots, war, etc., can also be insured. Losses due to certain happenings like flood, riot, calamity, malicious damage by any person, devaluation of currency, sudden drop or rise in prices, defaulted business services, blame, lawsuit expenses, fines, compensation payments, etc. can be provisioned through insurance.

Insurance is an indemnity against loss. It is a way of contracting out of a risk. A person, a company, an organization, or a government pays a small amount, the premium, to protect itself from a potential large loss.



A typical insurance company working in life insurance has a large clientele consisting of people of various ages, vocations, etc. Of these only a few will die in a year, for which compensation is paid. The premium rates are based on historical data, such as life expectancy and the rates of natural and accidental deaths. An actuary is an expert who compiles and analyses statistics in order to calculate insurance risks and premiums.
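A toy version of this actuarial calculation: the pure premium is the expected payout per policy, and a loading is added for expenses and profit. The mortality rate, sum assured and loading below are invented for illustration; real actuarial work uses full mortality tables and discounting.

```python
# Simplified one-year life-insurance premium: expected payout per policy
# plus a loading for expenses and profit. All figures are assumptions.

def annual_premium(mortality_rate, sum_assured, loading=0.20):
    pure_premium = mortality_rate * sum_assured  # expected payout
    return pure_premium * (1 + loading)

# 2 deaths per 1,000 insured, sum assured 100,000, 20% loading:
print(round(annual_premium(0.002, 100_000), 2))  # 240.0
```

Across a large clientele the premiums collected (240 each) comfortably exceed the expected payouts (200 each), which is the margin that meets administrative expenses and profit.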

An insurance company can be in a problem zone if many people in one locality were to die simultaneously. In such an eventuality, the sudden demand for compensation can be very difficult to meet. To provide for this, the insurance company reinsures itself with another company that perhaps has no such liability in the same geographic region. This reinsurance strategy spreads the risks over time and space.

The insurance company operates on the premise that not all risks happen simultaneously, nor to all the insured. Insurance companies plan their business so that, in comparison to their premium income, the amount to be paid out as compensation is less, thereby meeting the administrative expenses of the business and leaving a reasonable profit.

Often, for a very large risk like insurance for a nuclear power plant or a spacecraft, an insurance company or some other commercial entity acts as an underwriter. For example, in Britain, Lloyd’s is a recognized underwriter for the insurance business. It does not on its own insure any risk, but as a professional body with very strict rules of conduct, it manages everything about the insurance and takes the first liability. It then divides and transfers the risk to several insurance companies by sharing the earned premium. Lloyd’s does not deal directly with the public, but works through insurance brokers.


In India, till recently, life insurance was handled exclusively by the Life Insurance Corporation of India, and general insurance business was carried out by the four units of the General Insurance Corporation. However, now many other private companies are allowed to do both types of business. An insurance company or its appointed agencies carry out the pre-insurance assessment, premium determination, risk cover and distribution management, loss surveys, payment of claims, etc. Large organizations often hire consultant actuaries to assess the insurance cover.

         Basic classes of Insurance: Fire, Accident, Life, Marine.

         Other ways of classifying insurance covers:

Imperative: These are risks, which would imperil the organization's existence, such as the destruction of assets (fire, flood, etc.), or circumstances that would seriously impair its ability to operate effectively (such as a machinery breakdown, loss of vehicles and the like).

Statutory: These consist of insurance covers that are required by law, such as employers’ liability for its employees, and third-party accident in connection with motor vehicles, etc.

Contractual: Construction and similar contracts require the contractors to take out insurance to cover such risks as public liability and fire.

Advisable: These include risks that could be costly or embarrassing: examples are burglary, export credit and accident to key personnel.

Others: There are many minor risks that organizations consider it worthwhile to insure against, plate glass insurance for retail stores being an example.

3.15                                 FINANCE

A design professional deals with money mainly to conduct his or her own commercial organization (professional practice), and sometimes to help a client implement the project. Situations of the second kind are rare (though they occasionally happen with small clients); here the designer gets a free hand to spend someone else’s money. The management of such sums, though it may provide great comfort to the client, can cause problems with the tax authorities.


Such problems can arise in several ways, knowingly or inadvertently. For managing project expenditures, some precautions are necessary:

1         The ideal condition is one where the designer approves bills of expenditure and certifies the payment, and the client then arranges the payment.

2         The next option is to operate a joint-signatory bank account. This must be in the name of the client as main operator, with the designer as the authorized signatory. Alternatively, a single-operator bank account may be kept in the name of the client but operated by the designer as the authorized (power of attorney) signatory.

3         In case 2, a designer must avoid granting payments to himself or herself, such as for professional fees or other chargeable amounts.

4         Client’s money (in any form) meant for the execution of a project must never be deposited in a designer’s personal account or design company’s account, even for a short-duration transfer.

5         For case 2 (as above), all transactions must be through cheques drawn to the party receiving the payment; no third-party or bearer cheques, and no self-cheques for cash withdrawal.

6         All payments to the designer, the design company or their employees must be made with the client’s own signature on the cheque.

7         When a bank account as per 2 above is operated for project expenditure, it is meant for expenditure on the project, that is, payments for labour, services, materials, other consultants’ fees, etc., but may not include payments for site rent and taxes. For the latter, a separate clause must be added to the authorization deed.

8         The power of attorney or authorization must be for a specific period (if necessary, with provision for periodic renewal), and not with a non-specific mention such as ‘till the project is completed’.

9         Avoid payments from such accounts that are like investments (including shares, bank deposits or bonds), or speculative spending.

10       All payments (by the client, or by the designer as an authorized signatory) must be against invoices or vouchers made in the name of the client. Avoid accepting any invoices made in the name of the designer or the designer’s company. Write cheques only in the name of the party (supplier, vendor, contractor, etc.) who generates the invoice (to avoid third-party payments).

There are some basic differences in how small and large clients (and corporates) manage their project expenditure.



Small clients have the budgeted amount almost ready for investment, as if the entire sum is to be spent immediately and in one lot. Interior projects, however small, consist of items that occur in phases, and so do the payments for them. If a designer takes care to prepare a schedule of expenditure, in addition to the nominal schedule of estimates, a client can be advised on when and what sums will be required. By properly scheduling the purchase of independent systems to the later part of the project, one can delay the investments. Such delayed purchases also help in taking full advantage of guarantee and warranty provisions, and delay expenses on risk management costs like insurance. The date of purchase also affects the amount of depreciation (a purchase made during the last few months before the year ends qualifies for a full year’s depreciation).
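A schedule of expenditure of the kind suggested above can be as simple as phase-wise sums with a running total, so the client sees when money will actually be needed. The phases, months and amounts below are invented for illustration.

```python
# A sketch of a phase-wise schedule of expenditure with cumulative sums.
# Phase names, months and amounts are hypothetical examples.

from itertools import accumulate

phases = [
    ("civil work",       1, 300_000),
    ("electrical",       2, 120_000),
    ("furniture",        3, 250_000),
    ("air conditioning", 4, 180_000),  # stand-alone system, bought late
]

amounts = [amt for _, _, amt in phases]
for (name, month, amt), running in zip(phases, accumulate(amounts)):
    print(f"month {month}: {name:16s} {amt:>9,}  cumulative {running:>9,}")
```

The schedule shows the client needs only 300,000 at the start, not the full 850,000; the late purchase of the stand-alone system also pushes its guarantee period and insurance costs later, as the paragraph notes.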



Such clients usually borrow money for a project from different internal account heads and also from outside sources like financial institutions. Outside borrowing has to be planned and sanctioned (committed) even before the project is launched. A service charge of 1 to 3 % is levied on the sums sanctioned (but not actually borrowed) as a loan, in addition to the interest on the amounts as and when actually borrowed. For this reason, loan sanctions, and consequently heavier borrowing, are deferred as much as possible. Stand-alone or complete systems like ACs, elevators, etc. are procured as late as feasible, but sometimes a little earlier, to take advantage of depreciation accounting during a financial year. Such clients usually need not only an estimate but also a very detailed schedule of payments.
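The reason sanctions are deferred can be seen in a small cost calculation: the service charge applies to the whole sanctioned sum, while interest applies only to the amount actually drawn. All rates and sums below are illustrative assumptions.

```python
# Cost of a committed loan: service charge on the sanctioned sum plus
# interest on the amount actually drawn. Figures are hypothetical.

def borrowing_cost(sanctioned, drawn, service_rate, interest_rate, years):
    service_charge = sanctioned * service_rate  # on the full sanction
    interest = drawn * interest_rate * years    # only on sums drawn
    return service_charge + interest

# 10,000,000 sanctioned, but only 6,000,000 drawn for one year,
# with a 2% service charge and 10% interest:
print(borrowing_cost(10_000_000, 6_000_000, 0.02, 0.10, 1))  # about 800,000
```

The 200,000 service charge is paid even on the 4,000,000 never drawn, which is why a client benefits from sanctioning later and smaller.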



A designer often has to attend meetings where finance experts opine on the project, and is expected to have some understanding of basic finance terms. Some of the documents generated by the designer, like project reports, estimates, schedules, invoices, etc., must meet the requirements of the client’s accounts department. Some terms of finance are explained here.


Capital is any amount that is spent for the creation of wealth. It includes all possible material, non-material and human inputs. There are two forms of capital. Money is a fluid and intangible form of capital that is used as investment. The other form of capital is physical things such as buildings, machinery and equipment employed for the production of other goods and services, i.e. wealth.

Capital is created through personal savings, borrowed from some source with attached obligations, or availed of by selling, renting or transferring, in any other manner, the whole or part of any tangible or non-tangible property. Capital can be in cash, rights (ownership, tenancy, membership, citizenship, patents, copyrights), abstract things (prestige, goodwill, expertise, knowledge, skill, information), etc.

Capital, or the advantage out of it, is primarily used in the creation of assets (fixed assets). Other uses include investment for the purchase of inputs, rents, etc., till an output is ready (working capital). The income earned by capital is profit (just as the income of labour is the wage, and of land, the rent).

In an accounting sense, the capital of a business firm is that part of the net worth that has not been produced by the operation of the enterprise, or, in other words, the original stock of net assets of the firm before any income is earned.

In its broadest possible sense, capital includes the human population; nonmaterial elements such as skills, abilities, and education; land, buildings, machines, equipment of all kinds; and all stocks of goods—finished or unfinished—in the hands of both firms and households.

‘Capital arose out of the excess of production over consumption.’ (Adam Smith)

Fixed capital is usually defined as that which does not change its form in the course of the process of production, such as land, buildings, and machines.

Circulating capital consists of goods in process and operating expenses, raw materials, and stocks of finished goods waiting to be sold; these goods must be transformed, as when wheat is ground into flour, or they must change ownership, as when a stock of goods is sold.



An asset is a resource with economic value that an individual, corporation or country owns or controls with the expectation that it will provide future benefit. It is a physical or non-tangible entity that has some value in sale, purchase or even possession. Normally we own entities that are of some value now or in the future. To ‘own’ here includes rights of exclusive possession (traditional ownership), rights of utilization (lease or rent), and other rights (visitation, guardianship).

Assets are capital: any entity formed out of capital, and any entity that can be converted back into capital, is an asset. In account books, such assets are accounted as capital.

Projects on completion become physical assets for the clients.



Investment is any sum that a person does not use to buy assets, but allows others to use for that purpose. The other party provides some return for the sum so used. Generally investments are arranged at a fixed rate of interest, but sometimes the rate is linked to inflation, risk perception, the period of borrowing, etc., and is then called a floating rate of interest. When, instead, the lender agrees to share the profit and / or loss (but may not participate in the other party’s affairs or business), the return is called a dividend. The dividend depends on the share of profit generated from the investment, so it is uncertain and risky, but can provide greater advantage.
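The contrast between a fixed-interest return and a profit-linked dividend can be illustrated with assumed figures (the principal, rates and profits below are hypothetical):

```python
# Hypothetical comparison of a fixed-interest return with a
# profit-linked dividend on the same sum. All figures are assumed.

principal = 100_000

# Fixed interest: certain, and independent of the borrower's fortunes.
fixed_return = principal * 8 // 100          # assumed 8% per annum

# Dividend: an agreed share (assumed 25%) of whatever profit is made,
# so the return rises and falls with the venture itself.
def dividend(profit):
    return profit * 25 // 100

good_year = dividend(60_000)    # a profitable year
bad_year = dividend(10_000)     # a poor year: the risk the lender bears

print(fixed_return, good_year, bad_year)
```

With these assumptions the dividend beats the fixed return in a good year (15,000 against 8,000) but falls well below it in a bad one (2,500), which is the uncertainty the text describes.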

3.16                                REPORTS

A report is a presentation about an event or object, in a structured form designed to inform a specific audience. Reports are created in formats suitable for access by oneself or by other people, at some other time and place. We create as well as access reports of many different types, like: Medical, Weather, Radio, TV, Newspaper, School, Site, Departmental, Confidential, Legal, Business, etc.

Reports present results, prescriptions, directions, listings, proofs, deductions, explanations, confirmations, contradictions, facts, readings, observations, experiences, specifications, formulations, procedures, predictions, etc.



Reports take many different forms: letters, memos, notes, essays, descriptions, theses, submissions, dissertations, papers, minutes, memoranda, etc.

Regular or Routine Reports are well structured to accommodate data of a consistent nature, arriving at a regular pace from frequently occurring tasks or events. Special Reports are rare and unique, and may follow some basic structural format to facilitate the presentation of varied types of data.

A Personal Report is a record or an author’s memoir, to which others may not have access. Very personal or private reports may use abbreviated or coded language. A Specific Report provokes only a specific class of users, whereas a General Report may stimulate different people differently.

Information Reports are meant to record entities and events, and may or may not expect sympathy or feedback from the receiver. Information reports become records for posterity (history). Most reports have an intention. Some reports are designed to provoke or instigate, and so directly or subtly prescribe a course of action or a mode of operation. Such Provocative Reports may or may not indicate the benefits and hazards of the prescribed actions.

Response Reports are created to answer queries, as in examinations, investigations, or departmental actions. Conclusion or Deductive Reports are created to prove or disprove a hypothesis or to conclude concepts, as in the case of surveys and dissertations.



Reports have many styles or formats. The contents of a report and its style of presentation vary according to the purpose it is required to serve.

● Reports begin by stating a truth or an assumption, and follow it up with data and analysis to justify or reject it.

● Reports discuss or analyse a context or situation, and, like a detective, uncover the purpose, cause of happening, or truth behind it.

● Reports also take a middle course, i.e. take-on various presumptions and situations and build up a hypothesis, formula, or a theory.



All reports have some effect. Reports affect the values, beliefs, feelings, prestige and honour of individuals, cultures, societies, nations, races, etc. A report affects its author, composer or originator and its publisher when results or feedback are not of the intended nature, do not come in at all, or are inadequate or late. Reports provide both gains and losses to individuals and organizations.

Reports stimulate a person or a class of people very mildly, or provoke them to take corollary action. Such consequences are often perceived or prescribed by the author or publisher of the report. The context of time and situation are used for exploiting or controlling such consequences. Authors are also affected by premature release or unauthorized use of reports.

Reports are assets with a cost of rarity (unique, patent, invention, realization, miracle); cost of acquisition (investigation, surveys, experimentation, prototyping); cost of dissemination (printing, publication, presentation, copyrights, patents, distribution). Report costs are recovered as salaries, royalties, commissions, fees, charges, prestige, goodwill, increased commercial and strategic benefits, etc.

Authors or publishers often indicate the nature of results, and also guarantee the quantum of gain or loss. In the case of a thesis the author has to guarantee the authenticity of facts and the truthfulness of comments and observations. Historical reports expect the author to be sincere and unbiased.



Public Reports are placed where they can be accessed. Reports or documents are stored along with many other documents, and all storage arrangements have some degree of classification system.

        FIRST classification range is the order of arrival. This by itself provides little meaning, except that it shows what is new (and so the latest), and what is old (possibly redundant). Documents are either date-stamped or given a sequential identifier (a chronological number: numeric, alphanumeric or alphabetical).

         SECOND classification range is the size of the document. However, this may not faithfully reflect the nature (print media, digital, etc.), quality or quantity of the contents.

         THIRD classification range is the identity of the author of the document. If the author is well known, a certain level of content and quality can be presumed.

         FOURTH classification range is the name of the document. A document may have several titles: the one provided by the author or the publisher, and in addition a name for the public or for the storage system (computer file system, or internet file protocols). Titles are abridged or expanded to include search characters, numbers, words or keys.

Documents often have identical titles and may be distinguished by various appendages such as the author’s name, the publisher’s name, or the date of publication or arrival in the storage system. Computer file systems and internet site address protocols use extension codes for this purpose.

         FIFTH classification range is the title of the report as given by its author. Titles usually have two or three tiers. The main title broadly describes the contents and sometimes the purpose of the report. It is usually more than one word long, and often runs to two or three lines. The main title distinguishes the report from reports that deal with similar or parallel subjects, so it is specific and never a general one. For example, ‘Study of lighting in Interiors’ is a non-specific title, because lighting in an interior could be natural, artificial, mixed, direct, reflected, borrowed, even, spot, day, night, evening, purpose-related or general illumination, and the interiors could be residential, public, commercial, or industrial. Unless the report covers all of these, a more specific title would be ‘Study of day-time artificial lighting needs in industrial interiors’, or ‘Study of lighting in terms of its effect on the perception of heights in interior spaces’.

         SIXTH classification range relates to the document’s relevance to other fields of knowledge. The contents of a report may refer to two or more distinct branches of knowledge. The main title and its other tiers, though they describe the contents, often fail to convey the intentions of the report. Such reports carry an abstract: a brief description, excerpt or summary. (Paperback novel publishers often include critics’ views on the back and inside covers, and some internet sites allow the public to review books, music, etc. and post their comments.) Such short descriptions are also used for the primary dissemination of information, and function as a mini report on the full report.

         SEVENTH classification range derives from the parts of the document. An index and a table of contents show the sequence, size and placement of the sub-parts of the document. The section, chapter and paragraph headings, other media presentations (photographs, illustrations, audio-video clips), links to other chapters, references to other documents, and internet links to other resources provide some idea about the contents.

Yet topics that are dealt with at lower levels, i.e. at sentence or paragraph level, may not be adequately covered. A Glossary of key words or terms provides an ideal reference for the sub-topics. Internet search engines and research institutions draw out such keywords and add them to their master database of terms. The database provides a reference not only to the location of the terms but also to their context.
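The keyword indexing described above can be sketched as a small routine that maps each significant term to the paragraphs in which it occurs, recording both location and context; the sample paragraphs and stop-word list here are illustrative assumptions:

```python
# Minimal sketch of a keyword index: each significant term is mapped to
# the numbers of the paragraphs that contain it. The sample text and the
# stop-word list are assumed for illustration only.
import re
from collections import defaultdict

STOP_WORDS = {"the", "a", "an", "of", "in", "and", "is", "to", "for"}

def build_index(paragraphs):
    index = defaultdict(set)
    for num, text in enumerate(paragraphs, start=1):
        for word in re.findall(r"[a-z]+", text.lower()):
            if word not in STOP_WORDS:
                index[word].add(num)   # remember where the term occurs
    return index

paragraphs = [
    "Lighting in interior spaces affects the perception of height.",
    "Artificial lighting is planned for day and night use.",
]
index = build_index(paragraphs)
print(sorted(index["lighting"]))   # the term occurs in both paragraphs
```

Looking up a term then yields its locations, from which the surrounding paragraph supplies the context, much as a glossary entry points back into the body of the report.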

With modern-day electronic multitasking and multimedia-capable systems, the format of a report has changed completely. Terms like index, glossary, list and appendices were indicative of the physical placement of various categories of information, and some of these physical locations were difficult to access. Similarly, constrained by the medium of presentation, most documents were hard copies with a little graphics here and there. Electronic media now allow interactive presentation formats in audio, video, virtual reality, etc. All types of documents, irrespective of media type, language or format, are easily accessible. Hypertext has become a tool for interactive access, and documents on storage devices located at different geographical locations are readily available.