GLOSSARY

Key terms and concepts in the ACE Method

GENERAL TERMS

Core ACE Method concepts that apply across all phases.

ACE Method

A structured framework for software development consisting of five phases: Start (project setup), Analyze (understand the problem), Create (build the solution), Evaluate (validate effectiveness), and Commit (deploy and maintain).

MVP (Minimum Viable Product)

The simplest version of a product that can be released to users while still providing value. Focus on core functionality first.

Sprint

A fixed time period (usually 1-4 weeks) during which specific work must be completed and made ready for review.

Stakeholder

Anyone who has an interest in or is affected by the project. Includes users, clients, team members, and management.

Deliverable

Any tangible or intangible output produced as a result of the project. Can include code, documentation, or reports.

Iteration

A complete development cycle including analysis, creation, and evaluation. Each iteration builds upon previous ones.

ANALYZE TERMS

Essential concepts for understanding and breaking down problems effectively.

Problem Decomposition

The process of breaking complex challenges into smaller, manageable components. Each component should be independently solvable while contributing to the overall solution.

Requirements Matrix

A structured document that maps features to business needs, technical constraints, and success criteria. Helps ensure all stakeholder needs are captured and addressed.

Risk Register

A comprehensive list of potential project risks, their likelihood, impact, and mitigation strategies. Updated throughout the project lifecycle.
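
A risk register can be as simple as a scored list. The sketch below is illustrative: the field names and the likelihood × impact scoring heuristic are common conventions, not part of a standard schema.

```python
from dataclasses import dataclass

@dataclass
class Risk:
    """One register entry (fields are illustrative, not a standard schema)."""
    description: str
    likelihood: int  # 1 (rare) .. 5 (almost certain)
    impact: int      # 1 (negligible) .. 5 (critical)
    mitigation: str

    @property
    def score(self) -> int:
        # A common heuristic: rank risks by likelihood x impact.
        return self.likelihood * self.impact

# Hypothetical entries; a real register is updated throughout the project.
register = [
    Risk("Key dependency is unmaintained", 3, 4, "Pin version; evaluate alternatives"),
    Risk("Load estimate is too low", 2, 5, "Load-test before launch"),
]

# Review the highest-scoring risks first.
for risk in sorted(register, key=lambda r: r.score, reverse=True):
    print(f"{risk.score:2d}  {risk.description} -> {risk.mitigation}")
```

Sorting by score gives a quick triage order; teams often also track an owner and a review date per entry.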

MoSCoW Prioritization

A method for prioritizing requirements into four categories: Must have, Should have, Could have, and Won't have (this time). Ensures focus on delivering core value first.
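
As a minimal sketch, MoSCoW can be applied by bucketing a backlog into the four categories; the features below are hypothetical.

```python
# Hypothetical backlog, each feature tagged with its MoSCoW category.
backlog = {
    "User login": "Must",
    "Password reset": "Should",
    "Dark mode": "Could",
    "Social sharing": "Won't",
}

def by_priority(items: dict[str, str]) -> dict[str, list[str]]:
    """Group features into the four MoSCoW buckets, in priority order."""
    buckets = {"Must": [], "Should": [], "Could": [], "Won't": []}
    for feature, category in items.items():
        buckets[category].append(feature)
    return buckets

grouped = by_priority(backlog)
print(grouped["Must"])  # the core-value features to build first
```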

Dependency Mapping

Visual representation of how different components, tasks, or systems relate to and depend on each other. Critical for planning implementation order.

CREATE TERMS

Implementation concepts and methodologies for building effective solutions.

Iterative Development

Building software in small, repeated cycles. Each iteration produces working software that can be tested and refined based on feedback.

AI Pair Programming

Collaborative coding where AI assists with implementation while humans maintain architectural control and make key decisions.

Technical Debt

The implied cost of additional rework caused by choosing an easy solution now instead of using a better approach that would take longer.

Code Review

Systematic examination of source code intended to find mistakes overlooked in development and improve overall quality.

Continuous Integration

Practice of merging all developer working copies to a shared mainline several times a day, with automated testing on each merge.

EVALUATE TERMS

Validation and quality assurance terminology for ensuring solution effectiveness.

Unit Testing

Testing individual components or functions in isolation to ensure they work correctly. Foundation of a reliable test suite.
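
As a minimal illustration, the snippet below unit-tests a hypothetical `slugify` helper in isolation using plain assertions; a runner such as pytest would discover tests like these automatically.

```python
def slugify(title: str) -> str:
    """The unit under test (an illustrative helper, not part of the ACE Method)."""
    return "-".join(title.lower().split())

# Unit tests: each function checks one behavior of the unit in isolation.
def test_slugify_lowercases():
    assert slugify("Hello") == "hello"

def test_slugify_joins_words_with_hyphens():
    assert slugify("The ACE Method") == "the-ace-method"

# Called directly here so the example is self-contained.
test_slugify_lowercases()
test_slugify_joins_words_with_hyphens()
print("all unit tests passed")
```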

Integration Testing

Testing how different components work together as a system. Catches issues that unit tests miss.

Performance Metrics

Quantifiable measures of system performance including response time, throughput, resource usage, and error rates.
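
A sketch of collecting one such metric, response time, for a toy workload; the p50/p95 percentile choices are common reporting conventions, not requirements.

```python
import statistics
import time

def timed(fn, *args):
    """Return (result, elapsed_seconds) for one call -- a minimal latency probe."""
    start = time.perf_counter()
    result = fn(*args)
    return result, time.perf_counter() - start

# Collect a response-time sample for a toy workload (stand-in for a real request).
samples = [timed(sum, range(10_000))[1] for _ in range(100)]

# Two common latency metrics: the median and the 95th percentile.
p50 = statistics.median(samples)
p95 = statistics.quantiles(samples, n=100)[94]
print(f"p50={p50:.6f}s  p95={p95:.6f}s")
```

Percentiles matter more than averages here: a healthy mean can hide a slow tail that real users feel.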

User Acceptance Testing (UAT)

Final testing performed by end users to verify the solution meets their needs and works in real-world scenarios.

Quality Gates

Checkpoints in the development process where specific criteria must be met before proceeding. Ensures consistent quality standards.
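
One way to sketch a quality gate is as a set of named checks a candidate build must pass before proceeding; the gate names and thresholds below are hypothetical, not prescribed by the ACE Method.

```python
# Hypothetical gate criteria; real thresholds come from the team's standards.
GATES = {
    "test_coverage": lambda m: m["coverage_pct"] >= 80,
    "no_critical_bugs": lambda m: m["critical_bugs"] == 0,
    "build_passed": lambda m: m["build_ok"],
}

def failed_gates(metrics: dict) -> list[str]:
    """Return the names of any gates the build does not satisfy."""
    return [name for name, check in GATES.items() if not check(metrics)]

build = {"coverage_pct": 85, "critical_bugs": 1, "build_ok": True}
blocking = failed_gates(build)
print("proceed" if not blocking else f"blocked by: {blocking}")
```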

Regression Testing

Re-running functional and non-functional tests to ensure previously developed features still work after changes.

TECHNICAL TERMS

Advanced concepts for experienced developers and technical architects.

Chaos Engineering

The practice of intentionally introducing failures to test system resilience. Helps identify weaknesses before they cause real outages.
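
The core idea can be sketched as a fault injector wrapped around a function, used to confirm that a resilience mechanism (here, retries) actually recovers; `flaky` and `with_retry` are toy helpers, not a real chaos tool.

```python
import random

def flaky(fn, failure_rate: float, rng: random.Random):
    """Wrap fn so it randomly raises -- a toy fault injector."""
    def wrapper(*args):
        if rng.random() < failure_rate:
            raise ConnectionError("injected failure")
        return fn(*args)
    return wrapper

def with_retry(fn, attempts: int):
    """The resilience mechanism under test: retry on failure."""
    def wrapper(*args):
        for attempt in range(attempts):
            try:
                return fn(*args)
            except ConnectionError:
                if attempt == attempts - 1:
                    raise
    return wrapper

rng = random.Random(0)  # seeded so the experiment is repeatable
unreliable = flaky(lambda x: x * 2, failure_rate=0.3, rng=rng)
resilient = with_retry(unreliable, attempts=5)
print(resilient(21))  # succeeds despite the injected failures
```

Real chaos experiments inject failures into running systems (killed processes, network partitions) under controlled conditions, but the goal is the same: prove the recovery path works before an outage does.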

Property-Based Testing

Testing approach that verifies properties hold for a large set of automatically generated inputs, finding edge cases humans might miss.
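
A hand-rolled sketch of the idea: generate many random inputs and assert that a round-trip property holds for every one. Libraries such as Hypothesis automate the input generation (and shrink failing cases); the encoder below is an illustrative unit under test.

```python
import random

def run_length_encode(s: str) -> list[tuple[str, int]]:
    """Function under test (illustrative)."""
    out = []
    for ch in s:
        if out and out[-1][0] == ch:
            out[-1] = (ch, out[-1][1] + 1)
        else:
            out.append((ch, 1))
    return out

def decode(pairs: list[tuple[str, int]]) -> str:
    return "".join(ch * n for ch, n in pairs)

# Property: decoding an encoding returns the original string, for ANY input.
rng = random.Random(42)
for _ in range(1000):
    s = "".join(rng.choice("ab") for _ in range(rng.randint(0, 20)))
    assert decode(run_length_encode(s)) == s
print("round-trip property held for 1000 random inputs")
```

Instead of hand-picking a few cases, the test states an invariant and lets randomness probe the edge cases (empty strings, long runs, alternations).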

Distributed Tracing

Tracking requests as they flow through multiple services in a distributed system. Essential for debugging microservices architectures.

CAP Theorem

States that a distributed system can guarantee at most two of three properties: Consistency, Availability, and Partition tolerance. In practice, during a network partition a system must trade consistency against availability. Fundamental to system design decisions.

Blue-Green Deployment

A deployment strategy using two identical production environments to enable zero-downtime updates and instant rollback capability.

STRIDE Threat Modeling

Security framework categorizing threats as: Spoofing, Tampering, Repudiation, Information disclosure, Denial of service, Elevation of privilege.

Service Level Objectives (SLOs)

Specific, measurable targets for service characteristics such as availability, throughput, or response time. SLOs form the basis of reliability engineering.
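
As a sketch, an availability SLO can be checked against request counts; the log data and the 99.9% target below are made up for illustration.

```python
# Hypothetical request log: (total requests, failed requests) per day.
daily_counts = [(10_000, 3), (12_000, 0), (9_500, 12)]

SLO_AVAILABILITY = 0.999  # target: 99.9% of requests succeed

total = sum(t for t, _ in daily_counts)
failed = sum(f for _, f in daily_counts)
availability = (total - failed) / total

# Error budget: the number of failures the SLO permits over this window.
budget = (1 - SLO_AVAILABILITY) * total
print(f"availability={availability:.5f}  budget_used={failed / budget:.0%}")
```

Framing the gap as an "error budget" turns the SLO into an operational signal: when the budget is nearly spent, reliability work takes priority over new features.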

Mutation Testing

Testing method that modifies code to verify test suite effectiveness. If tests still pass after mutations, they may not be comprehensive enough.
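
A toy illustration of the idea: introduce one operator mutation by hand and see which test suite detects ("kills") it. Real tools, such as mutmut for Python, generate mutations from source code automatically.

```python
def original_max(a, b):
    return a if a > b else b

def mutant_max(a, b):      # the mutation: ">" replaced with "<"
    return a if a < b else b

def weak_suite(impl) -> bool:
    # Only checks equal inputs -- both versions pass, so the mutant "survives".
    return impl(3, 3) == 3

def strong_suite(impl) -> bool:
    return impl(3, 3) == 3 and impl(1, 2) == 2 and impl(5, 4) == 5

print("weak suite kills mutant:", not weak_suite(mutant_max))      # it does not
print("strong suite kills mutant:", not strong_suite(mutant_max))  # it does
```

A surviving mutant is the signal: the weak suite passes even with the comparison inverted, so it is not actually constraining the behavior it claims to test.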
