
# SQL Server Sizing & Configuration

## General SQL Server Configuration

Some additional recommendations for initial SQL Server configuration are provided below. These can assist in setting up a new environment with improved options over the defaults SQL Server installs with.

Info

These are general best-practice recommendations to assist in setting up a healthy SQL Server environment. Since each environment may have different requirements, consulting your database administrator is always recommended. Treat these as a starting point and adjust as your technical staff advises for your environment.

| Setting | Recommendation |
| --- | --- |
| Instant File Initialization (IFI) | Enabling IFI is recommended if policy allows, as it makes data file growths nearly instant, minimizing the impact of growth events. Log file growth still has to "zero out" the allocated disk space for each growth, so pre-growing the log file is still recommended. |
| TempDB configuration | Check for proper tempdb file count and sizing, as tempdb is heavily utilized by AE. File count = number of cores, up to 8. |
| SQL Server level fill factor | Validate this has not been overridden with a custom value unless specifically required by the database administrator. The default setting should be 0. |
| Max Degree of Parallelism | Change the default of 0 to a calculated value based on Microsoft recommendations. |
| Cost Threshold for Parallelism | Adjust to 50 as a better starting point, then tune based on DBA monitoring. |
| Auto-shrink | Off for all databases. |
| Maintenance plan | Have a current maintenance plan in place. This is environment specific; best-practice guidelines for backup and index maintenance can be found on Ola Hallengren's site. |
| Drive configuration | If on SAN: varies based on SAN configuration; work with your SAN admin to optimize the setup. If not on SAN: consider separating log and data files to ensure the best IO performance. |
| SQL Server max memory | Configure max memory to leave 10% of memory free for the OS, or adjust as appropriate based on other services running on the machine. |
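As an illustration, the two parallelism settings can be applied through `sp_configure`. The sketch below is a minimal example, assuming the SqlServer PowerShell module is installed and the account can change server configuration; the instance name and MAXDOP value are placeholders for your environment.

```powershell
# Hypothetical instance name; requires the SqlServer module (Install-Module SqlServer)
$instance = 'SQLPROD01'

$query = @"
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'cost threshold for parallelism', 50;  -- starting point; tune with DBA monitoring
EXEC sp_configure 'max degree of parallelism', 8;        -- replace 8 with your calculated value
RECONFIGURE;
"@

Invoke-Sqlcmd -ServerInstance $instance -Query $query
```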

## Sizing Storage

SQL Server Configuration Tip

  1. It is recommended to presize the database and log files, as well as to ensure sufficient tempdb files are available.
  2. Presizing can be done by creating a blank database with the desired data and log file sizes, then pointing the configuration utility to the newly created database to install AE (a sketch follows this list).
  3. For growth events, consider enabling Instant File Initialization if allowed, to expedite data file growths and reduce waits.
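As an illustration of step 2, a presized blank database can be created with T-SQL before running the configuration utility. This is a minimal sketch: the database name, file paths, and sizes are placeholders, and it assumes the SqlServer PowerShell module is available.

```powershell
# All names, paths, and sizes below are placeholders; adjust to your calculated requirements
$query = @"
CREATE DATABASE [AE]
ON PRIMARY
    ( NAME = AE_Data, FILENAME = 'D:\SQLData\AE_Data.mdf',
      SIZE = 20GB, FILEGROWTH = 1GB )
LOG ON
    ( NAME = AE_Log, FILENAME = 'E:\SQLLogs\AE_Log.ldf',
      SIZE = 4GB, FILEGROWTH = 4GB );
"@
Invoke-Sqlcmd -ServerInstance 'SQLPROD01' -Query $query
```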

### Sizing Storage Based on Assets

To help guide storage planning, we have calculated a starting figure for estimating storage requirements per asset. Actual usage will of course vary based on the features used and user input.

| Property Type | Average Storage (KB) |
| --- | --- |
| Office | 4243 |
| Hotel | 2474 |
| Industrial | 2280 |
| Multifamily | 1496 |
| Mixed Use (Retail/Multifamily) | 6897 |
| Retail | 2648 |
| Mixed Use (Office/Retail) | 1157 |

Planning storage for AE can be done from these general estimates by using the average storage noted above for each asset type.

Calculating Storage Requirements

```
AveragePropertySize = 4243
AnalysisYearsOver10 = 2   # i.e., a 12-year analysis = 2 years over 10
NumberOfScenarios   = 2
AssetCount          = 100

AdjustedPropertySizeKB  = AveragePropertySize + (AveragePropertySize * (0.06 * AnalysisYearsOver10))
StorageRequirementsInMB = ((AdjustedPropertySizeKB * AssetCount) * NumberOfScenarios) / 1024
```
```powershell
$AveragePropertySize = 4243
$AnalysisYearsOver10 = 2   # i.e., a 12-year analysis = 2 years over 10
$NumberOfScenarios   = 2
$AssetCount          = 100

$AdjustedPropertySizeKB  = $AveragePropertySize + (($AveragePropertySize * (0.06 * $AnalysisYearsOver10)))
$StorageRequirementsInMB = (($AdjustedPropertySizeKB * $AssetCount) * $NumberOfScenarios) / 1024

write-host ("`$AdjustedPropertySizeKB       {0:N0}    = $AveragePropertySize + (($AveragePropertySize * (0.06 * $AnalysisYearsOver10)))" -f $AdjustedPropertySizeKB)
write-host ("`$StorageRequirementsInMB      {0:N0}    = (($AdjustedPropertySizeKB * $AssetCount) * $NumberOfScenarios) / 1024" -f $StorageRequirementsInMB)
```
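With the inputs shown, AdjustedPropertySizeKB works out to 4243 + (4243 * 0.12) ≈ 4,752 KB, and StorageRequirementsInMB to ((4,752 * 100) * 2) / 1024 ≈ 928 MB.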

This provides an initial sizing estimate. It should be increased further based on specific business needs, such as compliance requirements that mandate data retention, or the need to duplicate scenarios quarterly or annually.

Example

As an example, if sizing 1,000 Office assets for a 15-year analysis period, with the expectation of creating new scenarios every quarter, the storage could be calculated as shown below.

```
--- Sizing Requirement ---
AnalysisYears:              15
AnalysisYearsOver10:        5
AveragePropertySize:        4243
AdjustedPropertySize:       5515.9      calc: 4243 + (4243 * (0.06 * 5))
----------------------------
StorageRequirements:        21.04 GB    calc: ((5515.9 * 1000) * 4) / 1024 / 1024
```

Once this figure is evaluated, you can further plan for system demands over the first and second year by multiplying by 2 years, giving an estimated storage requirement of approximately 42.1 GB.


### Detailed Storage Sizing by Using Expected Portfolio Size

Model sizes vary based on type, features used, and complexity of the data input. For the purposes of sizing, a sampling of all the property types was taken and the mean value calculated.

To improve accuracy of sizing, providing additional details can help prepare for future sizing needs.

Tip

Based on sample data, a property asset model with 10 years of result data requires approximately 9.5 MB of storage. This is a good starting point; adjust the size based on your own expectations. If you don't have a number to start with, use this figure for the examples.

  1. NumberOfModels: Estimate the unique number of property asset models.
  2. AdjustedModelSize: The analysis period has an impact on storage requirements. As a baseline, the calculation assumes 10 years. For each additional year, we recommend taking 4% of the property asset model size and multiplying by every year over 10.[^1]
  3. ScenarioStorageRequirement: The total storage requirement for a scenario can then be calculated as: (AdjustedModelSize * NumberOfModels)
  4. TotalPortfolioStorage: Once you have obtained the requirement per scenario, multiply it by the number of scenarios expected in the portfolio: (NumberOfScenarios * ScenarioStorageRequirement)
  5. TotalStorage: If planning for multiple portfolios, multiply TotalPortfolioStorage * NumberOfPortfolios to obtain the adjusted TotalStorage requirement.
  6. In addition to this calculated TotalStorage figure, you can plan for growth by multiplying TotalStorage by the number of years you want to plan for.
Database File Sizing - Detailed

This is included as a basic text walkthrough, as well as a PowerShell version which can be pasted into a PowerShell editor (VS Code, for example) and used as a scratchpad to experiment with sizing calculations.

```
--- input values ---
NumberOfPortfolios = 1
NumberOfScenarios  = 4
NumberOfModels     = 100
AnalysisYears      = 12
Years              = 2

--- calculated values ---
AnalysisYearsAdditionalCost  = AnalysisYears - 10 (0 if AnalysisYears <= 10)
NumberOfUniqueModels         = NumberOfModels * NumberOfScenarios * NumberOfPortfolios
StartingModelSizeInMb        = 9.5
AdjustedModelSizeMb          = StartingModelSizeInMb + ((0.04 * StartingModelSizeInMb) * AnalysisYearsAdditionalCost)
ScenarioStorageRequirementMb = AdjustedModelSizeMb * NumberOfModels
TotalPortfolioStorageMb      = NumberOfScenarios * ScenarioStorageRequirementMb
TotalStorageMb               = TotalPortfolioStorageMb * NumberOfPortfolios
TotalProjectedStorageGB      = (TotalStorageMb * Years) / 1024
```
```powershell
<#
.Description
    Script for estimating size. Copy into a PowerShell editor and adjust figures to see the impact.

.Notes
    This is just a guideline to get you started. Actual storage requirements will vary based on the model and other factors.
#>

# input values
[double]$NumberOfPortfolios = 1
[double]$NumberOfScenarios  = 4
[double]$NumberOfModels     = 100
[double]$AnalysisYears      = 12
[double]$Years              = 2

write-host "--- Input Values ---"
write-host ( Get-Variable NumberOfModels, NumberOfScenarios, AnalysisYears | Out-String )

# increased cost for storing more analysis years (baseline is 10 years)
if ($AnalysisYears -gt 10)
{ [int]$AnalysisYearsAdditionalCost = $AnalysisYears - 10 }
else { [int]$AnalysisYearsAdditionalCost = 0 }

# calculated values
[double]$NumberOfUniqueModels         = $NumberOfModels * $NumberOfScenarios * $NumberOfPortfolios
[double]$StartingModelSizeInMb        = 9.5
[double]$AdjustedModelSizeMb          = $StartingModelSizeInMb + ((0.04 * $StartingModelSizeInMb) * $AnalysisYearsAdditionalCost)
[double]$ScenarioStorageRequirementMb = $AdjustedModelSizeMb * $NumberOfModels
[double]$TotalPortfolioStorageMb      = $NumberOfScenarios * $ScenarioStorageRequirementMb
[double]$TotalStorageMb               = $TotalPortfolioStorageMb * $NumberOfPortfolios
[double]$TotalProjectedStorageGB      = ($TotalStorageMb * $Years) / 1024

write-host "--- Calculated Values ---"
write-host ("NumberOfUniqueModels       = {0,-20} | calculated from: ($NumberOfModels * $NumberOfScenarios * $NumberOfPortfolios)" -f $NumberOfUniqueModels.ToString('N0'))
write-host ("AdjustedModelSize          = {0,-20} | calculated from: $StartingModelSizeInMb + ((0.04 * $StartingModelSizeInMb) * $AnalysisYearsAdditionalCost)" -f $AdjustedModelSizeMb.ToString('N0'))
write-host ("ScenarioStorageRequirement = {0,-20} | calculated from: ($AdjustedModelSizeMb * $NumberOfModels)" -f $ScenarioStorageRequirementMb.ToString('N0'))
write-host ("TotalPortfolioStorage      = {0,-20} | calculated from: ($NumberOfScenarios * $ScenarioStorageRequirementMb)" -f $TotalPortfolioStorageMb.ToString('N0'))
write-host ("TotalStorage               = {0,-20} | calculated from: ($TotalPortfolioStorageMb * $NumberOfPortfolios)" -f $TotalStorageMb.ToString('N0'))
write-host ("TotalProjectedStorage      = {0,-20} | calculated from: ($TotalStorageMb * $Years) / 1024" -f ($TotalProjectedStorageGB.ToString('N0') + " GB"))
```
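With the default inputs shown (12 analysis years, 100 models, 4 scenarios, 1 portfolio, 2 years of growth), the calculation yields AdjustedModelSizeMb = 9.5 + (0.38 * 2) = 10.26, ScenarioStorageRequirementMb = 1,026, TotalPortfolioStorageMb = TotalStorageMb = 4,104, and TotalProjectedStorageGB = (4,104 * 2) / 1024 ≈ 8 GB.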

### Detailed Storage Sizing for Log File

When presizing, it is important to account for the database log file as well. Once the calculated database size is determined, consider sizing the log file to 20% of it. In one test case, a 138GB database was tasked with a large recalculation job; after all testing was complete, the log file had grown to 21GB, roughly 17% of the size of the database data file.

Example

  1. Database size projected to be 80GB
  2. 20% of 80GB would be 16GB.
  3. Increase the size of the database log file to 16GB.

Storage Sizing for Log File

This basic calculation should be reassessed if very large database sizes are projected. If the size of the database is projected at 500GB, for example, 20% would be 100GB, which is likely too large. If you are dealing with a very large database, consider reducing the ratio to 10%, and ensure the log autogrowth is set to a size of >= 4GB to avoid excessive small growth events impacting performance.

```
DatabaseSizeInGB = 80
SuggestedLogSize = DatabaseSizeInGB * 0.20
```

```powershell
# Storage sizing for the log file
$DatabaseSizeInGB = 80
"Suggested Log Size: {0:N0} GB" -f ($DatabaseSizeInGB * 0.20)
```

Tip

SQL Server does not benefit from splitting up log files for better concurrency.
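To apply the calculated size, the log file can be pre-grown with ALTER DATABASE. This is a minimal sketch: the database and logical file names are placeholders (query sys.database_files to confirm yours), and the 4GB autogrowth follows the guidance above.

```powershell
# Placeholder database, logical file name, and instance; confirm the logical log
# file name with: SELECT name FROM [AE].sys.database_files WHERE type_desc = 'LOG'
$query = @"
ALTER DATABASE [AE]
MODIFY FILE ( NAME = AE_Log, SIZE = 16GB, FILEGROWTH = 4GB );
"@
Invoke-Sqlcmd -ServerInstance 'SQLPROD01' -Query $query
```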

### Detailed Storage Sizing for TempDB

TempDB is heavily utilized by SQL Server, and allocating sufficient resources to it helps reduce contention; presizing further reduces the impact of growth events. The following values were obtained during internal tests. You can increase the size of TempDB beyond these amounts if you have the resources to do so.

Determine Your TempDb File Count

```
if (AvailableCores >= 8)
    TempDbFileCount = 8
else
    TempDbFileCount = AvailableCores
```
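As a quick check, this rule can be evaluated locally in PowerShell; the sketch below uses the logical processor count of the machine it runs on.

```powershell
# Cap the tempdb data file count at 8, per the rule above
$availableCores  = [Environment]::ProcessorCount
$tempDbFileCount = [Math]::Min($availableCores, 8)
"TempDb file count: $tempDbFileCount"
```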

Microsoft's general recommendation is to create TempDB data files to match the core count, up to 8.[^2] Once there are more than 8 cores, additional TempDB files should only need to be added if a database administrator notices specific contention issues.

Technical Notes: Our Testing Results

To assess the approximate impact on tempdb sizing, a sample database was created with 9 portfolios, each containing 7 scenarios with 500 property assets. The size of the ARGUS Enterprise database was approximately 138GB.

TempDB was reset for the SQL instance and set to a very small growth amount of 1MB, growing evenly across the allocated 8 tempdb files (due to 8 cores on the server).

After a batch recalculation job was performed, the tempdb data files had experienced over 500 growth events, and the TempDB log over 1,600 growth events. For this isolated test, each tempdb data file grew from 1MB to approximately 529MB, with the tempdb log growing to 1.8GB.

Based on our tests, a basic guideline for setting a starting storage size for tempdb can be obtained as follows (a sketch follows the list):

  1. Expected size of the ARGUS database (obtained in previous steps)
  2. TotalTempDbSize = 5% of SizeOfARGUSDatabase, rounded up to the nearest even GB
  3. TotalSizeOfTempDbLog = 50% of TotalTempDbSize
  4. Split TotalTempDbSize across the count of TempDb files
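This guideline can be sketched in PowerShell as follows; the 138GB input matches the test case above, and [Math]::Ceiling is used to round up to the nearest even GB.

```powershell
$argusDatabaseSizeGB = 138   # expected ARGUS database size from previous steps
$tempDbFileCount     = 8     # from the file count rule above

# 5% of the database size, rounded up to the nearest even GB
$totalTempDbSizeGB = [Math]::Ceiling(($argusDatabaseSizeGB * 0.05) / 2) * 2
$tempDbLogSizeGB   = $totalTempDbSizeGB * 0.5
$perFileSizeGB     = $totalTempDbSizeGB / $tempDbFileCount

"TotalTempDbSize:      $totalTempDbSizeGB GB"
"TotalSizeOfTempDbLog: $tempDbLogSizeGB GB"
"Size per data file:   $perFileSizeGB GB"
```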

Example

  1. Expected Size of ARGUS Database = 138GB
  2. TotalTempDbSize = 8GB
  3. TotalSizeOfTempDbLog = 4GB
  4. Each of the 8 TempDb files = Approx 1GB
  5. Single TempDB log = 4GB
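If you want to apply these sizes, tempdb files can be resized with ALTER DATABASE. The sketch below is only illustrative: logical file names vary by installation (tempdev and templog are the defaults for the first data file and the log), so list them first and repeat the MODIFY FILE statement for each data file.

```powershell
# List tempdb's logical file names first; they vary by installation
Invoke-Sqlcmd -ServerInstance 'SQLPROD01' -Query 'SELECT name, type_desc FROM tempdb.sys.database_files'

# Default names shown; repeat MODIFY FILE for each additional data file
$query = @"
ALTER DATABASE tempdb MODIFY FILE ( NAME = tempdev, SIZE = 1GB );
ALTER DATABASE tempdb MODIFY FILE ( NAME = templog, SIZE = 4GB );
"@
Invoke-Sqlcmd -ServerInstance 'SQLPROD01' -Query $query
```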

[^1]: For example, if analyzing for 15 years, this would be 5 * (4% * 9.5 MB) = 1.9 MB of additional storage, resulting in an approximate AdjustedModelSize of 11.4 MB.

[^2]: See Concurrency enhancements for the tempdb database and Microsoft TechNet - Optimizing tempdb Performance.