SAM in the ITIL Lifecycle Vs. SAM in the ISO Model

By Ed Tittel

Many organizations have deployed an IT framework such as the ISO network management model or the ITIL lifecycle model to provide structure for the development, deployment and ongoing operations of their IT infrastructures. There are interesting insights to be gleaned from each of these frameworks; here we examine where Software Asset Management (SAM) fits within the ISO network management model versus its role in the ITIL lifecycle model.

Savvy companies should consider principles and processes from both of these frameworks when implementing any SAM tool.

We’ll begin by sketching out each of these models, starting with the ISO network management model and continuing with the ITIL lifecycle model, with an emphasis on where software asset management (SAM) fits into each one. Then we’ll explain their respective strengths and weaknesses, and prescribe options that companies and organizations may wish to consider as they implement or optimize SAM for themselves.

The ISO Network Management Model: FCAPS

The FCAPS acronym is strongly associated with the ISO Telecommunications Management Network model and framework for network management. It is a standard originally developed at the International Organization for Standardization (ISO) in the early 1980s as part of the Open Systems Interconnection effort. It was also adopted by the International Telecommunication Union’s Telecommunication Standardization Sector, aka ITU-T, as the foundation for its M.3000 series of recommendations.

FCAPS stands for the five primary management categories into which the ISO model divides network management tasks – namely:

  • Fault: a fault is a problem, issue, or event that has a potentially negative consequence. Thus, the goal that drives fault management is to recognize, isolate, correct, and log any faults that might occur on a network. Fault history also provides a basis for trend analysis, so that future faults may be predicted and mitigated in advance as much as possible. In practice, this means monitoring network behavior and traffic to look for anomalies or statistical outliers, as indicators of potential faults in the offing.
  • Configuration: a configuration represents data that describes the settings, parameters, and characteristics of programs and systems that operate on a network. The primary goal that drives configuration management is to gather and store configuration data (often in a configuration database) for all network devices and software. Secondary goals include the use of tools and automation to simplify device configuration, tracking of changes to configuration data, and planning for future growth or scaling of network devices and software over time.
  • Accounting: sometimes also called billing, accounting seeks to track usage statistics for users of a network as they consume bandwidth, processing time, storage space, consumables, and other tangible resources. By tracking such consumption, and placing a value on resources consumed, IT operations can “charge back” for system and network use. In practice, accounting also subsumes many administrative functions, including authentication, access controls, and systems and network administration tasks (such as backup and restore, usage tracking and reporting, system maintenance and upkeep, and more).
  • Performance: performance measures the efficiency and efficacy of networks and systems, including such measures as uptime, throughput, response time, latency, and error rates. Performance management hinges on performance measurement and monitoring, and establishment of baselines that represent normal or acceptable measures of performance. Performance management also embraces measures of capacity and reliability, and provides a foundation for planning of future capacity and capability.
  • Security: security concentrates on control over network assets and information, and protecting such assets or information from unauthorized access or outright theft, and from loss or harm. Data security is usually approached through authentication of users, appropriate access controls, and encryption, where access controls come from either operating system or database management system (DBMS) control settings (accounts, logins, rights and permissions, and so forth).
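To make a couple of these categories concrete, here is a minimal, hypothetical sketch in Python (all names are illustrative, not part of any standard or tool): a configuration record stored in a simple in-memory "configuration database" (the Configuration category), and a baseline check that flags statistical outliers as potential faults (the Fault and Performance categories).

```python
import statistics

def record_config(db, device, settings):
    """Store a device's configuration data in a config 'database' (here, a dict)."""
    db[device] = dict(settings)

def is_anomalous(baseline_samples, new_value, threshold=3.0):
    """Flag a measurement as a potential fault if it falls more than
    `threshold` standard deviations away from the historical baseline."""
    mean = statistics.mean(baseline_samples)
    stdev = statistics.pstdev(baseline_samples)
    if stdev == 0:
        return new_value != mean
    return abs(new_value - mean) / stdev > threshold

# Configuration: capture settings for a (hypothetical) device.
config_db = {}
record_config(config_db, "router-1", {"snmp": True, "mtu": 1500})

# Fault/Performance: compare new latency readings against a baseline (ms).
latency_baseline = [10, 12, 11, 10, 13, 11, 12]
print(is_anomalous(latency_baseline, 45))  # far outside the baseline -> True
print(is_anomalous(latency_baseline, 12))  # within normal range -> False
```

Real SAM and network management tools implement these ideas with far more sophistication (persistent configuration databases, trend analysis, alerting), but the division of labor mirrors the FCAPS categories above.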

Ed Tittel is a 30-year-plus veteran of the computing industry, who’s worked as a programmer, a technical manager, a classroom instructor, a network consultant and a technical evangelist for companies that include Burroughs, Schlumberger, Novell, IBM/Tivoli and NetQoS. He has written and blogged for numerous publications, including Tom's Hardware, and is the author of over 140 computing books with a special emphasis on information security, Web markup languages and development tools, and Windows operating systems.
