Getting Started with Application Portfolio Advisor
This document guides you through getting started with the Application Portfolio Advisor (APA) module in Txture. The APA module is the central management area for creating and maintaining visibility into your application portfolio and for identifying application optimization and modernization opportunities.
APA helps organizations prioritize optimization by identifying risks, technical obsolescence, and application rationalization opportunities, delivering actionable recommendations to cut costs, reduce carbon footprint, and enhance technology performance.
This documentation is divided into two sections, Data collection (1.) and Reports and outcomes (2.), to help you get started with the APA module in your organization.
1. Data collection for APA
Similar to other Txture modules, the starting point is the population of the Txture Repository with your data. The repository is the central place in Txture where all data (assets, links and properties) are gathered.
1.1 What data is required for APA?
The data requirements vary depending on the specific use case. In general, the more data you have about the applications and their infrastructure landscape, the better the recommendations Txture can provide. Integrating an existing cloud environment into your Txture instance is especially beneficial when using APA. The following tables illustrate how different types of data contribute to the results provided by the APA module.
Required links and properties per application
The two tables below outline the links and properties needed to generate the corresponding outcomes in the APA module. The tables should be read as follows: the application's stack is built from its connections to other assets (such as databases or virtual servers), as shown in Figure 1. Together, these connections define the application as a whole.
Table 1 lists all required links between the various assets. Table 2 lists the required properties for each asset type within the application stack.
Table 1: Links
| Link Type | Source Asset Type | Target Asset Type |
|---|---|---|
| consists of | Application | Technical Component |
| runs on | Technical Component | Virtual Server |
| runs on | Application | Virtual Server |
| owner | Application | Stakeholder |
| uses | Application | Database |
Table 2: Properties
| Asset Type | Property | Example Value | Impacted Outcome |
|---|---|---|---|
| Application | Name | DRT - Production | |
| Application | Application Development | Maintenance | Technical Fit Score |
| Application | Business Continuity Relevance | Medium | Functional Fit Score |
| Application | Business Criticality | Medium | Functional Fit Score |
| Application | Business Expansion Potential | Medium | Functional Fit Score |
| Application | Innovation Lever | Medium | Functional Fit Score |
| Application | Lifecycle Status | Production | Lifecycle Status |
| Application | Programming Languages | Kotlin 1.9 | Technical Fit Score |
| Application | Technical Support Available | Partial | Technical Fit Score |
| Database | Name | drt_db01 | |
| Database | Technology | Amazon RDS for SQL Server | Cloud Provider, End of Life (EOL), Technical Fit Score, Technical Improvements |
| Database | Technology Variant | db.r5.large | Carbon Footprint, Energy Consumption, Potential Savings, Total Costs |
| Database | Capacity Total | 500 GiB | Cloud Migration Proposals |
| Database | Location | AWS: Europe (London) | Carbon Footprint by Location, Cost by Location |
| Interface | Name | DRT API | |
| Interface | Protocol Type | HTTPS | Technical Fit Score |
| Stakeholder | Name | Jane Doe | |
| Stakeholder | Email | jane.doe@example.com | Allows notifying the application owner |
| Technical Component | Name | drt_as01 | |
| Technical Component | Technology | Tomcat 11.0 | Cloud Provider, End of Life (EOL), Technical Fit Score, Technical Improvements |
| Virtual Server | Name | drt_vs01 | |
| Virtual Server | Technology | Amazon EC2 | Cloud Provider, End of Life (EOL), Technical Fit Score, Technical Improvements |
| Virtual Server | Technology Variant | a1.large | Carbon Footprint, Energy Consumption, Potential Savings, Total Costs |
| Virtual Server | Operating System | Alpine Linux 3.15 | End of Life (EOL), Technical Improvements |
| Virtual Server | CPU Cores | 2 | Cloud Migration Proposals |
| Virtual Server | RAM | 4 GiB | Cloud Migration Proposals |
| Virtual Server | Location | AWS: Europe (Frankfurt) | Carbon Footprint by Location, Cost by Location |
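To make the structure of Tables 1 and 2 more concrete, the sketch below expresses the example "DRT - Production" stack as plain data. The structure is purely illustrative and is not Txture's import or API format; it only shows how the links from Table 1 and the properties from Table 2 come together to describe one application stack.

```python
# Hypothetical sketch of the "DRT - Production" stack from Tables 1 and 2,
# expressed as plain Python data. Asset types, link types and property names
# mirror the tables above; the structure itself is illustrative only.

assets = {
    "DRT - Production": {"type": "Application", "properties": {
        "Lifecycle Status": "Production",
        "Programming Languages": "Kotlin 1.9",
        "Business Criticality": "Medium",
    }},
    "drt_as01": {"type": "Technical Component", "properties": {
        "Technology": "Tomcat 11.0",
    }},
    "drt_db01": {"type": "Database", "properties": {
        "Technology": "Amazon RDS for SQL Server",
        "Technology Variant": "db.r5.large",
        "Capacity Total": "500 GiB",
        "Location": "AWS: Europe (London)",
    }},
    "drt_vs01": {"type": "Virtual Server", "properties": {
        "Technology": "Amazon EC2",
        "Technology Variant": "a1.large",
        "Operating System": "Alpine Linux 3.15",
        "CPU Cores": 2,
        "RAM": "4 GiB",
        "Location": "AWS: Europe (Frankfurt)",
    }},
    "Jane Doe": {"type": "Stakeholder", "properties": {
        "Email": "jane.doe@example.com",
    }},
}

# Links from Table 1: (source asset, link type, target asset)
links = [
    ("DRT - Production", "consists of", "drt_as01"),
    ("DRT - Production", "uses", "drt_db01"),
    ("DRT - Production", "owner", "Jane Doe"),
    ("drt_as01", "runs on", "drt_vs01"),
]
```

Taken together, the assets, properties, and links describe the complete stack shown in Figure 1 and provide the inputs the APA outcomes in Table 2 rely on.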
1.2 Best practice for data collection
There are three main ways to collect data and populate the central Repository in Txture:
- Importing data from existing sources such as CMDBs, EAM tools, virtualization environments, and cloud environments.
- Crowdsourcing data from stakeholders within your organization using Txture Surveys.
- Adding data manually through direct entry or data modeling.
The recommended process for populating Txture with the necessary data is as follows:
- Start by using existing data sources to create a basic dataset in the Repository. Additional data sources and collection methods can be added at any time.
- Use crowdsourcing to gather missing data from application owners and other relevant stakeholders via the surveys functionality. You can either use Txture’s default surveys or create custom ones to collect specific data.
- If data is still missing or unavailable in a structured format, you can organize modeling workshops with the relevant stakeholders to fill in the gaps and improve the existing data.
1. Importing from existing data sources
In most IT organizations, some data is readily available. Txture can connect to and synchronize various data sources and map the data to the data model (structure). For APA, connecting to an existing cloud environment is especially beneficial. You can schedule imports to ensure Txture stays continuously updated. The information about the application landscape and the infrastructure of the applications serves as the foundation for the results in the APA module.
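As an illustration of what such a mapping step involves, the sketch below converts rows from a hypothetical CMDB CSV export (the column names are invented for this example) into assets and links matching the model from section 1.1. Txture's importers are configured within the product, so this is a conceptual sketch rather than the actual importer interface.

```python
import csv

# Conceptual sketch only: this is NOT Txture's importer API. It illustrates
# the mapping step of turning rows from an existing source (a hypothetical
# CMDB CSV export with columns "vm_name", "os", "cpu_cores", "ram_gib" and
# "hosted_app") into assets and links matching the tables in section 1.1.

def map_cmdb_export(path):
    assets, links = [], []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            assets.append({
                "type": "Virtual Server",
                "name": row["vm_name"],
                "properties": {
                    "Operating System": row["os"],
                    "CPU Cores": int(row["cpu_cores"]),
                    "RAM": f'{row["ram_gib"]} GiB',
                },
            })
            # "runs on" link between the hosted application and the server
            links.append((row["hosted_app"], "runs on", row["vm_name"]))
    return assets, links
```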
For more details on importing data, refer to our documentation on Importers.
2. Crowdsourcing data with surveys
If the imported data is incomplete or you lack high-quality data sources, you can fill in the gaps by collecting information from various stakeholders within your organization.
To streamline this process and gather data from numerous stakeholders more efficiently, Txture offers the Surveys feature.
Surveys are also an excellent way to quickly assess the data that has already been collected, as they are pre-filled with all the information imported into the repository.
For more details on crowdsourcing data with surveys, refer to our documentation on Surveys.
3. Data fine tuning with manual modeling
Sometimes, it's necessary to manually add missing information in collaboration with application owners to enhance the application's data foundation. This approach is the slowest and most time-consuming way to gather information, so it's typically used only to refine data that's already largely known or when no automated data source is available.
Modeling workshops are usually conducted using the Dependency or Tree report types. These report types provide the overview needed to manually work through the application dependencies and follow a similar usage pattern to other modeling or graph-drawing tools. The key difference from traditional modeling tools is that the data created is stored in the repository. This ensures that the modeling is based on up-to-date information. Additionally, the resulting visualizations can be shared as reports or used for quality assurance. You can also search for specific assets in the repository and edit them manually when needed.
For more details on manual data entry, refer to our documentation on Manual Editing.
2. APA reports and outcomes
The APA module generates a wide range of insights and actionable results based on the collected and processed data. These results help organizations make informed decisions about their application landscape, enabling optimization, modernization, cost reduction, and sustainability improvements.
This chapter gives an overview of the key outcomes from the APA module.
2.1 Navigation concept
The horizontal navigation bar (1) at the top provides access to a summary and insights page, as well as four areas where APA offers recommendations for optimizing the application landscape. When you select an application from the vertical sidebar (2) on the left, it displays application-specific results for the chosen area — this will be referred to as "application mode" moving forward.
- Summary: The Summary page helps maintain high data quality by actively highlighting issues such as duplicate assets and tracking recent modifications. In application mode, you can view all data gathered for a specific application, ensuring you have a comprehensive understanding of its status.
- Insights: The Insights section offers a broad overview of the optimization potential within your entire application portfolio. It also features a quadrant chart based on the principles of Gartner's TIME analysis, which allows you to analyze the technical and functional fit of your applications and identify where to focus your optimization efforts (a minimal classification sketch follows this list).
- Technology/Cloud: In the Technology/Cloud section, you can examine your applications from a technological perspective. This view enables you to discover opportunities for significant performance improvements and assess risk factors such as end-of-life technologies.
- GreenOps: The GreenOps section allows you to evaluate the environmental impact of your applications. Here, you can analyze the current carbon footprints of your applications, identify those with the highest emissions, and determine which applications offer the greatest potential for carbon savings.
- FinOps: By applying a financial lens in the FinOps section, you can identify which applications have the highest associated costs and understand the reasons behind them. This analysis helps you pinpoint applications with the most substantial cost-saving potential.
- Business/Organization: The Business/Organization section focuses on the links between applications and the business layer, providing an overview of, for instance, which applications belong to which organizational unit or which business capabilities they support.
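For readers who want to see the quadrant logic behind the Insights chart spelled out, here is a minimal sketch of a TIME-style classification based on a technical and a functional fit score. The 0-10 scale, the threshold of 5.0, and the function name are assumptions made for illustration; APA computes and visualizes these scores for you.

```python
# Illustrative sketch of a TIME-style quadrant classification. The 0-10
# scale and the 5.0 threshold are assumptions; the APA Insights page
# derives and visualizes these scores from the repository data.

def time_quadrant(technical_fit: float, functional_fit: float) -> str:
    high_tech = technical_fit >= 5.0
    high_func = functional_fit >= 5.0
    if high_tech and high_func:
        return "Invest"      # keep and actively develop
    if high_func:
        return "Migrate"     # valuable function, weak technology base
    if high_tech:
        return "Tolerate"    # sound technology, limited business value
    return "Eliminate"       # candidate for retirement

print(time_quadrant(technical_fit=3.2, functional_fit=8.1))  # -> "Migrate"
```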
The preferences (3) allow users to define criteria for cloud service selection, including allowed locations, geographic boundaries, security standards, and preferred CPU vendors. Additionally, users can manage End-of-Life timelines to ensure technologies align with organizational policies and compliance requirements.
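To illustrate the scope of these preferences, the sketch below collects the criteria mentioned above into a single structure. Preferences are maintained in the APA user interface; the field names and values here are purely illustrative assumptions.

```python
# Illustrative only: preferences are configured in the APA user interface.
# This sketch simply groups the kinds of criteria described above to show
# what a complete set of preferences could cover.

preferences = {
    "allowed_locations": ["AWS: Europe (Frankfurt)", "AWS: Europe (London)"],
    "max_relocation_distance_km": 1000,
    "required_security_standards": ["ISO 27001"],
    "preferred_cpu_vendors": ["AMD"],
    # Tolerance window around a technology's end-of-life date (see 2.2)
    "eol_tolerance_months": 6,
}
```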
To refresh the APA module user interface with the latest data, click the recompute button (4).
2.2 APA recommendations
APA continuously analyzes your application infrastructure as well as the business layer, and provides recommendations for potential savings and other functional or technical improvements. The recommendations are based on a set of rules that operates in the background and is constantly being expanded.
- End-of-Life: This rule evaluates application stacks for components that have reached the vendor-specific technology end of life and recommends particular version upgrades. As the technology end-of-life date typically defines the point at which the vendor officially stops developing, selling, or fully supporting a product, this output also influences the Technical Fit scoring. The time period tolerated before or after an actual end-of-life date is adjustable (see Preferences); a minimal sketch of such a check follows this list.
- Technology Versions: This rule evaluates application stacks for components with outdated technology versions in use and recommends newer available versions. As newer technology versions often offer better price-to-performance ratios as well as better security, this output also influences the Technical Fit scoring. Technical fit is negatively impacted mainly by the number of major versions the technology in use is lagging behind; minor and patch versions are also considered where applicable.
- Cloud Data Center: Two rules evaluate if and how much existing cloud components would benefit from a cloud data center relocation. One rule focuses on potential cost savings, while the other detects relocation opportunities that would result in lower carbon emissions. Preferred data center locations or a maximum distance (in km) can be defined via the Preferences.
- Rightsizing: Some rules evaluate if and how much existing application cloud components would benefit in cost savings from choosing rightsized cloud service instances. Rightsizing cloud components (i.e. adjusting resources according to utilization or compute processor performance) can lead to cost savings as well as carbon reductions. There are two separate rules to determine the saving potential in each category.
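As referenced in the End-of-Life item above, here is a minimal sketch of how an end-of-life check with an adjustable tolerance window could look. The technology-to-date mapping, the dates, and the default tolerance of 180 days are assumptions for illustration; APA's built-in rules evaluate the repository data directly.

```python
from datetime import date, timedelta

# Minimal sketch of an end-of-life check with an adjustable tolerance
# window. The mapping and dates below are illustrative only.
EOL_DATES = {
    "Alpine Linux 3.15": date(2023, 11, 1),
    "Tomcat 11.0": None,  # no announced end-of-life date assumed
}

def eol_findings(stack_technologies, tolerance_days=180, today=None):
    today = today or date.today()
    findings = []
    for tech in stack_technologies:
        eol = EOL_DATES.get(tech)
        # Flag the technology once "today" falls inside the tolerance
        # window before the end-of-life date, or any time after it.
        if eol is not None and today >= eol - timedelta(days=tolerance_days):
            findings.append(f"{tech}: end of life on {eol.isoformat()}, upgrade recommended")
    return findings

print(eol_findings(["Alpine Linux 3.15", "Tomcat 11.0"]))
```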
2.3 Transitioning to the Transformation Cockpit
Once you have identified one or several candidates for migration, you can seamlessly transition to the Transformation Cockpit in Txture. This feature allows you to view detailed Cloud Proposals based on the application stack data that has been previously collected.
To learn more about the Txture Application Portfolio Advisor, have a look at the general module documentation.