1 - AS/NZS 14598.4:2000 INFORMATION TECHNOLOGY - SOFTWARE PRODUCT EVALUATION - PROCESS FOR ACQUIRERS
4 - PREFACE
5 - CONTENTS
7 - 1 Scope
7 - 2 Conformance
8 - 3 Normative references
8 - 4 Terms and definitions
8 - 4.1 commercial-off-the-shelf software (COTS)
8 - 4.2 custom software
8 - 4.3 existing software
8 - 5 Software product evaluation - General considerations
8 - 5.1 Correlation between evaluation and acquisition processes
9 - 5.2 Inputs to the evaluation process
9 - 5.2.1 System requirements
10 - 5.2.2 Integrity level requirements
10 - 5.2.3 Software requirements specification
11 - 5.2.4 Evaluations performed by others
11 - 5.3 Tailoring
12 - 6 Evaluation during acquisition of off-the-shelf software products
13 - 6.1 Step 1 - Establish evaluation requirements
13 - 6.1.1 Establish the purpose and scope of the evaluation
14 - 6.1.2 Specify evaluation requirements
15 - 6.2 Step 2 - Specify the evaluation
15 - 6.2.1 Select metrics
16 - 6.2.2 Select the evaluation methods
17 - 6.2.3 Evaluations performed by others
17 - 6.3 Step 3 - Design the evaluation
19 - 6.4 Step 4 - Execute the evaluation
19 - 6.4.1 Execute the evaluation methods
19 - 6.4.2 Analyze the evaluation results
20 - 6.4.3 Draw conclusions
20 - 7 Evaluation during acquisition of custom software and modifications to existing software
21 - 7.1 Step 1 - Establish evaluation requirements
21 - 7.2 Step 2 - Specify the evaluation
21 - 7.3 Step 3 - Design the evaluation
21 - 7.4 Step 4 - Execute the evaluation
22 - Annex A - Definitions from other standards
22 - A.1 acquirer
22 - A.2 acquisition
22 - A.3 attribute
22 - A.4 audit
22 - A.5 baseline
22 - A.6 CASE tool
22 - A.7 configuration item
22 - A.8 contract
23 - A.9 developer
23 - A.10 direct measure
23 - A.11 evaluation module
23 - A.12 external measure
23 - A.13 external quality
23 - A.14 failure
23 - A.15 fault
23 - A.16 firmware
23 - A.17 implied needs
23 - A.18 indicator
24 - A.19 indirect measure
24 - A.20 integrity level
24 - A.21 intermediate software product
24 - A.22 internal measure
24 - A.23 internal quality
24 - A.24 measure (verb)
24 - A.25 measure (noun)
24 - A.26 measurement
24 - A.27 metric
24 - A.28 off-the-shelf product
24 - A.29 operator
24 - A.30 quality
25 - A.31 quality evaluation
25 - A.32 quality in use
25 - A.33 quality model
25 - A.34 rating
25 - A.35 rating level
25 - A.36 request for proposal [tender]
25 - A.37 scale
26 - A.38 software
26 - A.39 software product
26 - A.40 supplier
26 - A.41 system
26 - A.42 user
26 - A.43 validation
26 - A.44 verification
27 - Annex B - Tables
27 - Table B.1 - Example tailoring of evaluation/acquisition activities per target software integrity level
28 - Table B.2 - Example specifying software quality characteristics, subcharacteristics, external metrics
29 - Table B.3 - Example specifying Software Quality in Use Metrics
30 - Table B.4 - Example of Cost-Effectiveness Ranking of Evaluation Methods
31 - Annex C - Figures
31 - Figure C.1 - Example evaluation/acquisition process for off-the-shelf products
32 - Figure C.2 - Example evaluation/acquisition process for custom software or modifications to existing software
33 - Annex D - Evaluation methods
33 - D.1 Review of user and technical product documentation (including on-line documentation)
33 - D.2 Evaluation based on supplier courses and training
33 - D.3 Assessment of software engineering process
35 - D.4 Review of operating history with the supplier
36 - D.5 Review of operating history with customers
36 - D.6 Review of supplier capability, support, and quality system
37 - D.7 Prototyping and other evaluation methods
38 - Annex E - Example of staged evaluation process
38 - E.1 Stage 1: Planning - Requirements Stage
38 - E.2 Stage 2: Design - Acquisition Stage
39 - E.3 Stage 3: Full Evaluation Stage
40 - Bibliography
40 - Standards
40 - Other References