
What Are The Most Commonly Misconfigured Settings In SQL Server?

There are several commonly misconfigured settings in SQL Server that can cause performance issues, security vulnerabilities, or other problems. Some of the most common misconfigurations include:


Max Server Memory:

This setting controls the maximum amount of memory that the SQL Server instance can use. If it is set too low, SQL Server may not have enough memory to cache data and query plans effectively; if it is set too high, it can starve the operating system and other processes on the server of memory.

The maximum amount of memory that should be configured for SQL Server depends on the specific requirements of your system and usage. However, there are some general guidelines that can be followed:

  • Reserve 1 GB of memory for the operating system and other applications running on the server.

  • Reserve additional memory for non-SQL Server related services that are running on the server.

  • For a dedicated SQL Server instance, configure the max server memory setting to be slightly less than the total physical memory on the server. This will allow for some memory to be used for disk caching by the operating system, which can help improve disk performance.

  • Monitor the server's memory usage over time to ensure that SQL Server is not consistently consuming all of the available memory (a sample monitoring query follows this list). If SQL Server consistently uses all of the memory it has been granted and the server has physical memory to spare, consider increasing the max server memory setting.

  • Consider the specific requirements of your workload. For example, if you are running a large data warehouse, you may need to set a higher max server memory setting than for a smaller OLTP workload.
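To support the monitoring advice above, a quick way to see how much memory the SQL Server process is actually using, and whether the operating system is under pressure, is to query the memory DMVs. This is a minimal sketch; the DMVs and columns are standard, but interpret the numbers against your own baseline.

    -- Memory in use by the SQL Server process and OS-level memory state
    SELECT
        pm.physical_memory_in_use_kb / 1024 AS sql_memory_in_use_mb,
        pm.memory_utilization_percentage,
        sm.available_physical_memory_kb / 1024 AS os_available_memory_mb,
        sm.system_memory_state_desc
    FROM sys.dm_os_process_memory AS pm
    CROSS JOIN sys.dm_os_sys_memory AS sm;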

It's also important to note that max server memory is not an absolute hard limit: it caps most of SQL Server's memory allocations, but some components can still allocate outside it, and setting a value does not mean SQL Server will immediately consume all of it. Monitor the server's performance and adjust the setting over time: if SQL Server is under memory pressure and the operating system has memory to spare, increase the value; if the operating system itself is running out of memory, decrease it.
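The setting itself is changed with sp_configure. The sketch below caps SQL Server at 58 GB on a hypothetical server with 64 GB of RAM; the option name is standard, but the value is purely illustrative and should be sized for your own hardware.

    -- Cap SQL Server at 58 GB (59392 MB); the value is illustrative
    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;
    EXEC sp_configure 'max server memory (MB)', 59392;
    RECONFIGURE;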


Cost Threshold for Parallelism:

This setting controls the estimated query cost at which SQL Server will consider using multiple processors to execute a single query. If it is set too low, many cheap queries may run in parallel and waste CPU and worker threads on coordination overhead; if it is set too high, expensive queries that would benefit from parallelism may run serially.

Here are some general guidelines for configuring the cost threshold for parallelism:

  • For a dedicated SQL Server instance, start by setting the cost threshold for parallelism to a relatively low value, such as 5 or 10, to allow more queries to be executed in parallel.

  • Monitor the performance of the server and the number of parallel worker threads being used. If the server is over-utilized and a large number of parallel worker threads are active, consider increasing the cost threshold for parallelism to reduce the number of parallel operations and ease the pressure on the server.

  • Consider the specific requirements of your workload. For example, if you have a data warehouse with large and complex queries, you may want to set a lower cost threshold for parallelism to allow more queries to be executed in parallel.

  • If you have an OLTP workload, you may want to set a higher cost threshold for parallelism, to reduce the number of parallel operations and improve the performance of small and simple queries.

It's important to note that the cost threshold for parallelism is not a hard rule: whether a query actually runs in parallel depends on the resources available on the server as well as the estimated cost of the query. The default value of 5 is suitable for many general scenarios, but monitoring and testing are needed to find the optimal value for your specific environment.
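Like max server memory, this setting is changed through sp_configure. A minimal sketch, using the value 10 from the guideline above; the value is illustrative and should be validated against your own workload.

    -- Check the current value, then change the threshold (10 is illustrative)
    SELECT name, value_in_use
    FROM sys.configurations
    WHERE name = 'cost threshold for parallelism';

    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;
    EXEC sp_configure 'cost threshold for parallelism', 10;
    RECONFIGURE;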

Recovery Model:

This setting controls how the SQL Server instance maintains the transaction log and which backup and restore options are available. If it is set to the wrong value, it can lead to data loss or an unmanageable transaction log.

The simple recovery model in SQL Server has the following pros and cons:

Pros:

  • Automatic log truncation: In the simple recovery model, the transaction log is truncated automatically at checkpoints, so the log file stays small and no transaction log backups need to be scheduled.

  • Smaller backup footprint: Because no transaction log backups are taken, the total volume of backups to store and manage is smaller than under the full recovery model.

Cons:

  • No point-in-time recovery: In the simple recovery model, you can only restore the database to the end of the most recent backup; you cannot restore it to a specific point in time.

  • No log shipping: Log shipping is not possible in the simple recovery model. Log shipping is a feature that automatically sends transaction log backups from one server to another, so that the second server can serve as a disaster recovery solution.

The full recovery model has the following pros and cons:

Pros:

  • Point-in-time recovery: In the full recovery model, you can restore the database to a specific point in time, provided you have an unbroken chain of transaction log backups.

  • Log shipping: Log shipping is possible in the full recovery model.

Cons:

  • Log backup overhead: In the full recovery model, the transaction log is only truncated when it is backed up, so regular log backups must be scheduled in addition to full backups; without them, the log file grows indefinitely.

  • Larger backup footprint: Full backups plus the required chain of transaction log backups take more storage and more administration than backups under the simple recovery model.

It's important to note that the recovery model you choose will depend on the specific requirements of your application and the level of data protection you need.
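Checking and changing the recovery model is a one-line ALTER DATABASE per database. The sketch below uses a hypothetical database named SalesDb and an illustrative backup path; remember that after switching to the full recovery model, point-in-time recovery only becomes possible once a full backup has started the log chain.

    -- See the recovery model of every database on the instance
    SELECT name, recovery_model_desc FROM sys.databases;

    -- Switch a hypothetical database to the full recovery model,
    -- then take a full backup to start the log backup chain
    ALTER DATABASE SalesDb SET RECOVERY FULL;
    BACKUP DATABASE SalesDb TO DISK = N'D:\Backups\SalesDb.bak';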

Auto Create Statistics:

This setting controls whether the SQL Server instance automatically creates single-column statistics on columns used in query predicates. If it is disabled, the query optimizer may lack the statistics it needs to estimate row counts accurately; if it is enabled, there is a small one-time cost when statistics are first created for a column.

It is generally recommended to enable the "Auto Create Statistics" setting in SQL Server. This is because, without statistics, the query optimizer cannot accurately estimate the number of rows that will be returned by a query, which can lead to poor query performance. Enabling this setting can improve query performance by allowing the query optimizer to make more accurate estimates, which can result in better query plans.

It's worth noting that Auto Create Statistics can also have some negative impact on performance and disk space if it is creating statistics on too many columns or tables. It's important to monitor the statistics and manage them accordingly.

Additionally, if the data in a table is frequently updated or inserted, its statistics may become outdated and lead to poor query performance. In such cases, consider scheduling regular statistics updates or setting the AUTO_UPDATE_STATISTICS_ASYNC option to ON to avoid blocking queries while statistics are updated.
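At the database level, the setting is controlled with ALTER DATABASE, and the statistics that have been auto-created can be inspected through sys.stats. A minimal sketch, using a hypothetical SalesDb database and dbo.Orders table:

    -- Enable automatic statistics creation for a hypothetical database
    ALTER DATABASE SalesDb SET AUTO_CREATE_STATISTICS ON;

    -- List the statistics on a table and when each was last updated
    SELECT s.name,
           s.auto_created,
           STATS_DATE(s.object_id, s.stats_id) AS last_updated
    FROM sys.stats AS s
    WHERE s.object_id = OBJECT_ID(N'dbo.Orders');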

Auto Update Statistics:

This setting controls whether the SQL Server instance automatically updates statistics when enough of the underlying data has changed. If it is disabled, statistics become stale as data changes and query plans degrade; if it is enabled, the update normally runs synchronously and can occasionally delay the query that triggers it, which is what the asynchronous option described above mitigates.
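As with auto create, the update behaviour is set per database, and statistics on a heavily changed table can also be refreshed manually. A sketch reusing the same hypothetical SalesDb and dbo.Orders names:

    -- Keep statistics current, updating them asynchronously so the
    -- triggering query is not delayed
    ALTER DATABASE SalesDb SET AUTO_UPDATE_STATISTICS ON;
    ALTER DATABASE SalesDb SET AUTO_UPDATE_STATISTICS_ASYNC ON;

    -- Manually refresh statistics on a frequently modified table
    UPDATE STATISTICS dbo.Orders WITH FULLSCAN;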

Auto Shrink:

This setting controls whether the SQL Server instance automatically shrinks the database files. If it is enabled, repeated shrink-and-grow cycles can cause performance problems and heavy index fragmentation; if it is disabled, file sizes simply have to be managed manually, which is the preferred approach.

Instead of enabling auto shrink, it's recommended to use a more controlled approach to managing the size of a database. This can include:

  • Using the Database Engine Tuning Advisor to identify unnecessary indexes that can be removed.

  • Scheduling regular index maintenance, such as rebuilds and reorganizes, to reduce fragmentation.

  • Truncating or dropping large tables or partitions that are no longer needed.

  • Monitoring disk space usage and adjusting the file growth settings of the database files so they do not grow unchecked.

In summary, the "Auto Shrink" setting in SQL Server is not recommended: the shrink operation causes heavy index fragmentation, and the repeated shrink-and-grow cycle wastes I/O. Instead, it is better to use a more controlled approach to managing the size of a database.
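Auto shrink is a per-database flag, so it is worth checking the whole instance for databases that still have it on. A minimal sketch, again using the hypothetical SalesDb:

    -- Find databases with auto shrink enabled, then turn it off
    SELECT name FROM sys.databases WHERE is_auto_shrink_on = 1;

    ALTER DATABASE SalesDb SET AUTO_SHRINK OFF;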



