

SQL Server Mistakes to Avoid

As a database administrator (DBA), there are certain mistakes you should be aware of and avoid when managing your SQL Server. Over the past 20 years of consulting on SQL Server, I’ve seen a few common errors that can lead to significant issues down the line if not addressed. In this blog post I will cover some of these key mistakes and how you can make sure your system is running as efficiently as possible.


No Backups or Full Backups with no Logs

One of the most important steps for a DBA is to ensure that backups are taken regularly and correctly. If backups are not taken at all, or if full backups are taken without logs, this means that there is no way to restore data in the event of an emergency. It is essential to have regular backup schedules in place and use both full and log backups to make sure your data is safe and secure.
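
As a simple illustration (the database name and backup paths below are placeholders), a regular full backup plus frequent log backups might look like this:

-- Nightly full backup of a user database
BACKUP DATABASE [YourDatabase]
TO DISK = N'D:\Backups\YourDatabase_FULL.bak'
WITH CHECKSUM, COMPRESSION, STATS = 10;
GO

-- Frequent transaction log backups (for example every 15 minutes) allow
-- point-in-time recovery and keep the log file from growing unchecked
BACKUP LOG [YourDatabase]
TO DISK = N'D:\Backups\YourDatabase_LOG.trn'
WITH CHECKSUM, COMPRESSION, STATS = 10;
GO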


Here's a T-SQL script to show all databases and their recovery model:

SELECT
    [name] AS [Database Name],
    CASE
        WHEN [recovery_model] = 1 THEN 'Full'
        WHEN [recovery_model] = 2 THEN 'Bulk-Logged'
        WHEN [recovery_model] = 3 THEN 'Simple'
        ELSE 'Unknown'
    END AS [Recovery Model]
FROM
    [master].[sys].[databases]
WHERE
    [database_id] > 4 -- exclude system databases
ORDER BY
    [name] ASC;

This script retrieves the name of every database except the system databases and displays its recovery model as Full, Bulk-Logged, Simple, or Unknown (if an invalid value is set). You can run it in SQL Server Management Studio or any other SQL client.


In SQL Server, transaction log backups are required in the Full and Bulk-Logged recovery models. These recovery models enable point-in-time recovery, which means that you can recover a database to a specific point in time using transaction log backups.

In the Full recovery model, all transaction log records are retained until they are backed up or truncated. This means that you can recover a database to any point in time since the last full or differential backup by applying the relevant transaction log backups. To ensure that you have all necessary transaction log backups, perform them at regular intervals, based on your recovery point objective (RPO) and recovery time objective (RTO).

In the Bulk-Logged recovery model, transaction log backups are still required, but the behavior is slightly different. In this model, large-scale bulk operations are minimally logged, which can improve performance but also increases the risk of data loss in the event of a failure. As a result, it's important to back up the transaction log more frequently in the Bulk-Logged model to minimize the amount of data that could be lost.

In the Simple recovery model, transaction log backups are not required because the transaction log is truncated after each checkpoint. This means that you can only recover a database to the time of the most recent full or differential backup, which may result in some data loss. The Simple recovery model is suitable for databases that can be easily recreated or restored from a backup.
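
For reference, the recovery model is set per database; a minimal example (the database name is a placeholder):

-- Switch a database to the FULL recovery model; take a full backup afterwards
-- to start the log backup chain
ALTER DATABASE [YourDatabase] SET RECOVERY FULL;
GO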


Mike's View:

Only use the FULL recovery model when necessary, and always take transaction log backups when a database is in FULL mode.

No Security or Too Many Individual Users in System vs Groups

Another key mistake that DBAs often make is not setting up proper security measures, or having too many individual users in the system rather than groups. A large number of individual user accounts adds administrative overhead and increases the risk of unauthorized access. To combat this issue, it's important to set up appropriate security protocols such as password expiration, account lockout policies, and multi-factor authentication where available. Additionally, consolidating individual user accounts into groups can help improve efficiency and reduce administrative overhead.


When it comes to security in SQL Server, there are two main approaches: using groups for security or using individual accounts. Both approaches have their advantages and disadvantages, and the choice depends on the specific needs and requirements of your organization.

Using groups for security can simplify management and improve security by reducing the number of permissions that need to be managed. By creating groups that correspond to different roles within your organization (such as database administrators, developers, or report writers), you can assign permissions to the group rather than to individual accounts. This can save time and reduce the risk of errors or inconsistencies in permissions management.

Using individual accounts for security can provide more granular control over permissions and can be useful in situations where different users need different levels of access. By assigning permissions to individual accounts, you can ensure that each user only has access to the specific resources they need. However, managing a large number of individual accounts can be time-consuming and increases the risk of errors or inconsistencies.

In general, using groups for security is recommended for most organizations because it simplifies management and improves security. However, there may be situations where individual accounts are necessary to provide the required level of access control. Ultimately, the choice between using groups or individual accounts depends on the specific needs and requirements of your organization.
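
As a sketch of the group-based approach (the domain, group, and database names are placeholders), you grant access to a Windows group once and membership is then managed in Active Directory:

-- Create a login for a Windows/Active Directory group
CREATE LOGIN [YOURDOMAIN\SQL_ReportWriters] FROM WINDOWS;
GO

-- Map the group to a database user and grant read access via a fixed database role
USE [YourDatabase];
GO
CREATE USER [YOURDOMAIN\SQL_ReportWriters] FOR LOGIN [YOURDOMAIN\SQL_ReportWriters];
ALTER ROLE [db_datareader] ADD MEMBER [YOURDOMAIN\SQL_ReportWriters];
GO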


Mike's View:

Working with the sysadmins can be a pain, but using groups lets you outsource adding people to the systems team.


Maximum Degree of Parallelism

Maximum Degree of Parallelism (MAXDOP) in SQL Server is a setting that determines the maximum number of processors that can be used to execute a single query. In other words, it limits the number of processors that can be used to parallelize a single query execution. By default, MAXDOP is set to 0, which means that SQL Server can use all available processors to parallelize a query. However, in some cases, using all available processors may not be optimal and can lead to performance issues.

The MAXDOP setting can be configured at the server level or the query level. At the server level, you can set the maximum degree of parallelism for all queries by configuring the "Max Degree of Parallelism" option in the Server Properties dialog box. At the query level, you can override the server-level setting by using the "OPTION (MAXDOP n)" syntax, where "n" is the maximum degree of parallelism you want to use for that query.

Setting MAXDOP to an appropriate value can help optimize query performance by balancing parallelism with resource utilization. However, setting it too low can result in slower query performance, while setting it too high can result in resource contention and performance issues. Therefore, it's important to carefully evaluate and test different values for MAXDOP to find the optimal setting for your specific workload and hardware configuration.


The optimal setting for Maximum Degree of Parallelism (MAXDOP) in SQL Server depends on several factors, including the hardware configuration, the workload characteristics, and the overall system performance goals. However, here are some general recommendations for setting the MAXDOP value:

  • For OLTP workloads: It's generally recommended to set MAXDOP to 1 for OLTP (Online Transaction Processing) workloads, which typically involve many small transactions with short execution times. Parallelizing such queries rarely provides significant performance benefits and can instead lead to resource contention.

  • For data warehouse workloads: For data warehouse workloads, which typically involve complex queries and large data sets, setting MAXDOP to a value between 4 and 8 is often recommended. This can help balance parallelism with resource utilization and provide good performance benefits without causing resource contention issues.

  • Consider the hardware configuration: The optimal value for MAXDOP also depends on the number of processors and cores available on the server. As a general rule of thumb, it's recommended to set MAXDOP to no more than the number of physical cores on the server.

  • Test and evaluate: It's important to carefully evaluate and test different MAXDOP settings to find the optimal value for your specific workload and hardware configuration. This can involve running workload simulations with different MAXDOP settings and monitoring performance metrics such as CPU usage, query execution time, and resource utilization.
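
As a reference sketch, here are both ways to set MAXDOP described above (the values and the table name are placeholders; pick values based on the guidance in this list):

-- Server-level setting (an advanced sp_configure option)
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max degree of parallelism', 8;
RECONFIGURE;
GO

-- Query-level override for a single statement
SELECT COUNT(*)
FROM dbo.YourLargeTable
OPTION (MAXDOP 4);
GO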


Cost Threshold for Parallelism Set Incorrectly

The Cost Threshold for Parallelism is a configuration option in SQL Server that specifies the threshold at which SQL Server decides to use parallel execution plans for queries.

When a query is submitted to SQL Server, the Query Optimizer generates one or more execution plans to determine the most efficient way to execute the query. Parallelism is a feature that allows SQL Server to divide a query into multiple smaller tasks and execute them simultaneously on multiple processors, which can provide significant performance benefits for certain types of queries.

The Cost Threshold for Parallelism specifies the minimum estimated cost required for a query to be considered for parallel execution. The cost of a query represents an estimated measure of the resources (such as CPU, I/O, and memory) required to execute the query.

By default, the Cost Threshold for Parallelism is set to 5, which means that any query with an estimated cost of 5 or higher will be considered for parallel execution. However, this default value may not be optimal for all workloads, and it's important to evaluate and adjust the setting based on the specific workload characteristics.

Setting the Cost Threshold for Parallelism too low can cause SQL Server to generate parallel execution plans for queries that would perform better using a serial plan, which can lead to performance degradation and resource contention issues. On the other hand, setting the Cost Threshold for Parallelism too high can prevent SQL Server from using parallelism for queries that could benefit from it, which can also result in suboptimal performance.

In general, a good starting point is to set the Cost Threshold for Parallelism to 50, then monitor the workload and adjust from there.
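
To apply that starting point, the setting can be changed with sp_configure; the value 50 here simply mirrors the recommendation above:

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
-- Raise the cost threshold from the default of 5 to 50 as a starting point
EXEC sp_configure 'cost threshold for parallelism', 50;
RECONFIGURE;
GO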



Things You Should Configure

Server Memory Configuration: This setting determines how much memory SQL Server can use. You should ensure that the server has enough memory to handle the workload and that the memory settings are configured correctly. For example, you may need to adjust the min server memory and max server memory settings to ensure optimal performance.
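
For example (the values below are placeholders; size max server memory so the operating system and other services keep enough headroom):

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
-- Example values only: leave several GB for the operating system
EXEC sp_configure 'min server memory (MB)', 4096;
EXEC sp_configure 'max server memory (MB)', 57344;  -- e.g. 56 GB on a 64 GB server
RECONFIGURE;
GO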

TempDB Configuration:

TempDB is a system database that SQL Server uses to store temporary data, such as temporary tables and indexes. You should ensure that TempDB is configured correctly based on the workload characteristics. For example, you may need to adjust the number of TempDB files and the size of each file to optimize performance.


TempDB is a system database in SQL Server that is used to store temporary user objects, internal objects, and version stores. It is important to configure TempDB properly to ensure optimal performance of SQL Server. Here are the steps to configure TempDB in SQL Server:

  • Determine the number of processor cores: The number of processor cores on the SQL Server machine should be determined. This can be done by running the following command in SQL Server Management Studio:

SELECT cpu_count FROM sys.dm_os_sys_info
  • Determine the initial size and growth settings: The initial size and growth settings for TempDB should be determined based on the size of the workload and the number of processor cores. A general rule of thumb is to set the initial size to 8 MB per processor core and the growth increment to 64 MB.

  • Configure multiple data files: Multiple data files should be created for TempDB to improve performance. The number of data files should be equal to the number of processor cores. The files should be located on separate disks to improve disk I/O performance.

  • Set the autogrowth settings: The autogrowth settings for TempDB should be configured to ensure that the files do not run out of space. A recommended setting is to set the autogrowth increment to 64 MB and the maximum size to the size of the disk.

  • Check the recovery model: TempDB uses the Simple recovery model, which reduces the overhead of transaction logging.

  • Restart SQL Server: After making the configuration changes, SQL Server should be restarted for the changes to take effect.

Here is an example T-SQL script to configure TempDB:

USE master;
GO
ALTER DATABASE tempdb
MODIFY FILE (NAME = tempdev, SIZE = 8MB, FILEGROWTH = 64MB);
GO
ALTER DATABASE tempdb
ADD FILE (NAME = tempdev2, FILENAME = 'D:\SQLData\tempdb2.ndf', SIZE = 8MB, FILEGROWTH = 64MB),
     FILE (NAME = tempdev3, FILENAME = 'E:\SQLData\tempdb3.ndf', SIZE = 8MB, FILEGROWTH = 64MB),
     FILE (NAME = tempdev4, FILENAME = 'F:\SQLData\tempdb4.ndf', SIZE = 8MB, FILEGROWTH = 64MB);
GO
ALTER DATABASE tempdb
MODIFY FILE (NAME = templog, SIZE = 8MB, FILEGROWTH = 64MB);
GO
ALTER DATABASE tempdb
SET RECOVERY SIMPLE;
GO

Note: The above example assumes that the initial size of each data file is 8MB and that the growth increment is 64MB. The file locations and autogrowth settings should be adjusted based on the hardware configuration and workload characteristics of the SQL Server machine.

Database Compatibility Level:

The database compatibility level determines the version of SQL Server that the database is compatible with. You should ensure that the compatibility level is set correctly based on the version of SQL Server that you are using.
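
A quick way to check and change it (the database name and level below are examples; use the level that matches your SQL Server version and testing):

-- Check current compatibility levels
SELECT name, compatibility_level
FROM sys.databases;
GO

-- Example: set a database to the SQL Server 2019 level (150)
ALTER DATABASE [YourDatabase] SET COMPATIBILITY_LEVEL = 150;
GO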



Security Configuration:

You should ensure that the security settings are configured correctly to protect your data and prevent unauthorized access. This includes configuring server-level security settings, database-level security settings, and auditing.


Here are some common queries you can use to check the security of your SQL Server:

List all logins:

SELECT name, type_desc, create_date, modify_date
FROM sys.server_principals
WHERE type IN ('U', 'G', 'S')
ORDER BY type, name;

List all users in a database:

SELECT name, type_desc, create_date, modify_date
FROM sys.database_principals
WHERE type IN ('U', 'G', 'S')
ORDER BY type, name;

List all server roles:

SELECT name, create_date, modify_date
FROM sys.server_principals
WHERE type = 'R';

List all database roles:

SELECT name, create_date, modify_date
FROM sys.database_principals
WHERE type = 'R';

List all users and their server roles:

SELECT p.name AS UserName, r.name AS RoleName
FROM sys.server_role_members m
JOIN sys.server_principals r ON m.role_principal_id = r.principal_id
JOIN sys.server_principals p ON m.member_principal_id = p.principal_id
WHERE r.type = 'R'
ORDER BY UserName;

List all users and their database roles:

SELECT p.name AS UserName, r.name AS RoleName
FROM sys.database_role_members m
JOIN sys.database_principals r ON m.role_principal_id = r.principal_id
JOIN sys.database_principals p ON m.member_principal_id = p.principal_id
WHERE r.type = 'R'
ORDER BY UserName;

List all users and their permissions on a specific object:

SELECT USER_NAME(p.grantee_principal_id) AS UserName, p.permission_name, p.state_desc
FROM sys.database_permissions p
WHERE p.major_id = OBJECT_ID('<schema name>.<table name>')
ORDER BY UserName;

Note: Replace <schema name> and <table name> with the schema name and table name of the object you want to check.

List all logins that have not been used in a specified number of days:

-- sys.server_principals does not store a last-login date; as an approximation,
-- check sys.dm_exec_sessions, which only covers sessions since the last service restart
SELECT sp.name, sp.create_date, sp.modify_date, MAX(s.login_time) AS last_login_time
FROM sys.server_principals sp
LEFT JOIN sys.dm_exec_sessions s ON s.login_name = sp.name
WHERE sp.type IN ('U', 'G', 'S')
    AND sp.name NOT LIKE '##%'
    AND sp.name NOT LIKE 'NT AUTHORITY\%'
GROUP BY sp.name, sp.create_date, sp.modify_date
HAVING MAX(s.login_time) IS NULL
    OR DATEDIFF(DAY, MAX(s.login_time), GETDATE()) >= <number of days>
ORDER BY sp.name;

These queries can help you identify security-related issues in your SQL Server environment and take appropriate action to address them. You can use them in combination with other security tools and best practices to ensure the security and integrity of your data.

Auto Growth Settings:

This setting determines how database files grow when they fill up. You should ensure that the auto growth settings are configured correctly to prevent the database from running out of space. This includes setting the growth increment and, where appropriate, the maximum size of the files.
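
For example (the database and file names are placeholders), fixed-size growth increments are generally preferable to percentage growth:

-- Grow the data file in fixed 256 MB steps rather than by a percentage
ALTER DATABASE [YourDatabase]
MODIFY FILE (NAME = N'YourDatabase_Data', FILEGROWTH = 256MB);
GO

-- Grow the log file in fixed 128 MB steps
ALTER DATABASE [YourDatabase]
MODIFY FILE (NAME = N'YourDatabase_Log', FILEGROWTH = 128MB);
GO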

Filegroups: Filegroups are used to group database objects and data files. You should ensure that the filegroups are configured correctly based on the size and type of data that is stored in the database. For example, you may want to create separate filegroups for frequently accessed data to improve performance.

Index Settings: Indexes are used to improve query performance by allowing SQL Server to locate data quickly. You should ensure that the index settings are configured correctly based on the database schema and query patterns. This includes creating appropriate indexes, setting fill factor, and updating statistics.
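
As a brief sketch (the table, column, and index names are placeholders):

-- Create a nonclustered index with an explicit fill factor
CREATE NONCLUSTERED INDEX IX_Orders_CustomerID
ON dbo.Orders (CustomerID)
WITH (FILLFACTOR = 90);
GO

-- Refresh statistics on the table so the optimizer has current information
UPDATE STATISTICS dbo.Orders WITH FULLSCAN;
GO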

Locking and Concurrency Settings: Locking and concurrency settings determine how SQL Server manages database transactions and concurrency. You should ensure that the locking and concurrency settings are configured correctly to prevent blocking and deadlocks. This includes setting the isolation level, configuring lock escalation, and optimizing query performance.
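
One common adjustment, shown here as an illustrative sketch (the database name is a placeholder), is enabling read committed snapshot isolation to reduce reader/writer blocking:

-- Enable read committed snapshot isolation (needs brief exclusive access;
-- WITH ROLLBACK IMMEDIATE disconnects other sessions to get it)
ALTER DATABASE [YourDatabase]
SET READ_COMMITTED_SNAPSHOT ON
WITH ROLLBACK IMMEDIATE;
GO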

Query Store Settings:

The Query Store is a feature in SQL Server that allows you to track query performance over time. You should ensure that the Query Store settings are configured correctly to capture query performance data and improve query performance. This includes enabling the Query Store, configuring data retention, and using the Query Store reports to analyze query performance.
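
A minimal example of enabling it with basic retention settings (the database name and values are placeholders):

-- Turn on the Query Store and keep roughly 30 days of data
ALTER DATABASE [YourDatabase]
SET QUERY_STORE = ON
    (OPERATION_MODE = READ_WRITE,
     CLEANUP_POLICY = (STALE_QUERY_THRESHOLD_DAYS = 30),
     MAX_STORAGE_SIZE_MB = 1024);
GO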


Database Owner:

The database owner is the security principal that owns the database. You should ensure that the database owner is set correctly and that appropriate permissions are assigned to the database owner.

Mike's Take: Set the database owner to sa, unless there is a reason not to.
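
Checking and changing the owner is straightforward (the database name is a placeholder):

-- See who currently owns each database
SELECT name, SUSER_SNAME(owner_sid) AS owner_name
FROM sys.databases;
GO

-- Set the owner to sa
ALTER AUTHORIZATION ON DATABASE::[YourDatabase] TO [sa];
GO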

SQL Server Agent Settings:

SQL Server Agent is a service that is used to automate tasks such as backups, maintenance, and notifications. You should ensure that the SQL Server Agent settings are configured correctly to ensure that jobs are executed successfully.


Here are some SQL Agent settings to check when installing a new instance:

  • Service Account: Make sure that the SQL Agent service is running under a domain account with appropriate permissions. It is recommended to use a separate domain account for each SQL Server instance.

  • Startup Type: Configure the SQL Agent service to start automatically when the server starts.

  • Error Log: Check the SQL Agent error log to make sure that there are no errors related to the service.

  • Database Mail: Configure Database Mail to enable alerts and notifications from SQL Agent jobs.

  • Proxy Accounts: Create proxy accounts for SQL Agent jobs that require elevated privileges. Use the principle of least privilege when assigning permissions to proxy accounts.

  • Job Schedules: Configure job schedules to run at appropriate times and intervals. Avoid running jobs during peak hours or when system resources are limited.

  • Alerts: Configure alerts to notify you when specific events occur, such as job failures or high CPU usage.

  • Operators: Configure operators to receive notifications from SQL Agent jobs. You can create multiple operators and assign them to different groups or job categories.

  • Backup Jobs: Create backup jobs to ensure that your databases are backed up regularly. Schedule the backup jobs to run at appropriate intervals and verify that the backups are completed successfully. A rough T-SQL sketch of such a job follows this list.
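
As a rough sketch of a backup job (the job name, schedule, database, and path are placeholders; many shops use a maintenance solution instead):

USE msdb;
GO
-- Create the job
EXEC dbo.sp_add_job @job_name = N'Nightly Full Backups';

-- Add a step that runs the backup command
EXEC dbo.sp_add_jobstep
    @job_name = N'Nightly Full Backups',
    @step_name = N'Backup YourDatabase',
    @subsystem = N'TSQL',
    @command = N'BACKUP DATABASE [YourDatabase] TO DISK = N''D:\Backups\YourDatabase_FULL.bak'' WITH CHECKSUM, COMPRESSION;';

-- Schedule it to run daily at 01:00 (HHMMSS)
EXEC dbo.sp_add_schedule
    @schedule_name = N'Daily 1 AM',
    @freq_type = 4,        -- daily
    @freq_interval = 1,
    @active_start_time = 010000;
EXEC dbo.sp_attach_schedule
    @job_name = N'Nightly Full Backups',
    @schedule_name = N'Daily 1 AM';

-- Target the local server
EXEC dbo.sp_add_jobserver @job_name = N'Nightly Full Backups';
GO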

Database Mail Settings:

Database Mail is a feature in SQL Server that allows you to send email notifications from the database engine. You should ensure that the Database Mail settings are configured correctly to enable email notifications for important events such as failed backups or database corruption.


Here are the steps to set up Database Mail in SQL Server using T-SQL scripts:


Enable Database Mail: Run the following command to enable Database Mail:

sp_configure 'show advanced options', 1;
GO
RECONFIGURE;
GO
sp_configure 'Database Mail XPs', 1;
GO
RECONFIGURE
GO

Create a Mail Profile: Run the following script to create a mail profile:

EXECUTE msdb.dbo.sysmail_add_profile_sp
    @profile_name = 'YourMailProfileName',
    @description = 'Description of your mail profile';
-- Note: profile access (public or private) is granted separately with
-- msdb.dbo.sysmail_add_principalprofile_sp
GO

Add Mail Accounts: Run the following script to add a mail account:

EXECUTE msdb.dbo.sysmail_add_account_sp
    @account_name = 'YourMailAccountName',
    @email_address = 'YourEmailAddress',
    @display_name = 'YourDisplayName',
    @mailserver_name = 'YourMailServerName',
    @port = YourMailServerPortNumber,
    @username = 'YourMailServerUsername',
    @password = 'YourMailServerPassword',
    @enable_ssl = 0;
GO

Note: Set the @enable_ssl parameter to 1 if your mail server requires SSL/TLS encryption.


Add Mail Profile to Account: Run the following script to add the mail profile to the mail account:

EXECUTE msdb.dbo.sysmail_add_profileaccount_sp
    @profile_name = 'YourMailProfileName',
    @account_name = 'YourMailAccountName',
    @sequence_number = 1;
GO

Add Operators: Run the following script to add operators:

EXECUTE msdb.dbo.sp_add_operator
    @name = 'YourOperatorName',
    @enabled = 1,
    @email_address = 'YourEmailAddress';
GO

Add Alerts: Run the following script to add alerts:

EXECUTE msdb.dbo.sp_add_alert
    @name = 'YourAlertName',
    @message_id = 0,
    @severity = 16,  -- sp_add_alert needs a non-zero message_id or severity; 16 covers general user errors
    @enabled = 1,
    @delay_between_responses = 0,
    @include_event_description_in = 1,
    @category_name = 'YourCategoryName';
GO

Add Alert Notifications: Run the following script to notify an operator when the alert fires:

EXECUTE msdb.dbo.sp_add_notification
    @alert_name = 'YourAlertName',
    @operator_name = 'YourOperatorName',
    @notification_method = 1;  -- 1 = email
GO

Once you have executed these scripts, Database Mail will be set up in SQL Server, and you can use it to send email notifications from SQL Server.
