Memory Sizing
- Memory for column store
- Memory for row store
- Memory for caches and additional components
Disk Sizing
- Disk sizing for data files
- Disk sizing for log files
CPU Sizing
Memory Sizing
Memory sizing is the process of estimating, in advance, the amount of memory that will be required to run a certain workload on SAP HANA. To determine how much memory is required, we first need to answer the questions below.
1. What is the size of the data tables that will be stored in SAP HANA? You may be able to estimate this based on the size of your existing data, but unless you precisely know the compression ratio of the existing data and the anticipated growth factor, this estimate may only be partially meaningful.
2. What is the expected compression ratio that SAP HANA will apply to these tables? The SAP HANA column store automatically uses a combination of advanced compression algorithms (dictionary, RLE, sparse, and more) to compress each table column as effectively as possible. The achieved compression ratio depends on many factors, such as the nature of the data, its organization and data types, the presence of repeated values, the number of indexes (SAP HANA requires fewer indexes), and more.
3. How much extra working memory will be required for DB operations and temporary computations? The amount of extra memory depends somewhat on the size of the tables (larger tables produce larger intermediate result tables in operations such as joins), but even more on the expected workload in terms of the number of users and the concurrency and complexity of the analytical queries (each query needs its own workspace). A rough estimation sketch that combines these three factors is shown below.
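To make the relationship between these three factors concrete, here is a minimal back-of-the-envelope sketch in Python. The function name, the default compression ratio of 5:1, and the working-memory factor of 2 are illustrative assumptions, not official SAP sizing rules; for production sizing, refer to SAP's official sizing tools and notes.

```python
# Back-of-the-envelope memory estimate for SAP HANA.
# Assumptions (illustrative only): a 5:1 column-store compression ratio and a
# 2x working-memory factor for intermediate results, caches and concurrent
# query workspaces. Replace these with values measured for your own data.

def estimate_hana_memory_gb(source_data_gb: float,
                            compression_ratio: float = 5.0,
                            working_memory_factor: float = 2.0) -> float:
    """Return a rough SAP HANA memory estimate in GB.

    source_data_gb        -- uncompressed size of the tables to be loaded (question 1)
    compression_ratio     -- expected column-store compression, e.g. 5 means 5:1 (question 2)
    working_memory_factor -- multiplier for temporary computations and query workspaces (question 3)
    """
    compressed_gb = source_data_gb / compression_ratio   # size of data once loaded and compressed
    return compressed_gb * working_memory_factor          # add headroom for working memory


if __name__ == "__main__":
    # Example: 2048 GB of source data, 5:1 compression, 2x working memory
    print(f"Estimated memory: {estimate_hana_memory_gb(2048):.0f} GB")
```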