LearnSmart
Practice Exams
Audiobooks
Exam Manuals
1-800-418-6789
Table of Contents
Abstract
What to Know
Tips
Domain 1: Installing and Configuring SQL Server 2008
  Hardware and Software Requirements for Installing Microsoft SQL Server 2008
  Installing SQL Server 2008
  SQL Server Instances
  Selecting SQL Server Features
  SQL Server Security Considerations During Install
  FILESTREAM Configuration
  Data Directories in SQL Server Installation
  Command Line Setup
  Implementing Database Mail
  SQL Server Configuration Manager
Domain 2: Maintaining SQL Server Instances
  Understanding System Stored Procedure: sp_configure
  Manage SQL Server Agent Jobs
  Manage SQL Server Agent Alerts
  Manage SQL Server Agent Operators
  The SQL Server Declarative Management Framework (DMF)
  Backing Up SQL Server 2008
Domain 3: Managing SQL Server Security
  SQL Server Logins and Server Roles
  Password Policies
  SQL Server-Level Roles and Database-Level Roles
  The Guest, Public, and DBO Principals
  SQL Server Instance Permissions
  Logon Triggers
  Impersonation
  Cross-Database Ownership Chaining
  Managing Schema Permissions
  Auditing SQL Server
  DDL Triggers
  C2 Audit Configuration
  Common Criteria
  Transparent Data Encryption
  SQL Server Surface Area
Domain 4: Maintaining a SQL Server Database
  Database Management and Configuration
  Database Files
  Database Filegroups
  Range of Database Options
  Recovery Models
  Detaching and Attaching Databases
  Database Backups
  Backup Type Details
  Verifying Backups
  Restoring Databases
  Restore Types and Strategies
  Database Snapshots
  Database Integrity
  DBCC CHECKDB
  Database Maintenance Plans
Domain 5: Data Management Tasks
  Data Export and Import
  Methods of Data Import and Export
  BCP
  BULK INSERT
  OPENROWSET
  SQL Server Management Studio GUI Data Import and Export Capability
  Data Partitions
  Data Compression
  Maintaining Indexes
Domain 6: SQL Server 2008 Monitoring and Troubleshooting
  SQL Server Service Problems
  Concurrency Problems
  SQL Server Agent Problems
Abstract
This LearnSmart exam manual will help you prepare for SQL Server 2008 administration as tested in Microsoft Exam 70-432. The manual provides practical examples, test tips, and guidance on the Microsoft best practices that pervade the questions in Exam 70-432.
What to Know
Students should study all facets of installing SQL Server, installing SQL Server instances, and adjusting SQL Server installations via SQL Server Management Studio, SQL PowerShell, and the command line. Students should also study SQL query syntax and understand the T-SQL commands related to the installation, security, and management of SQL Server.
Tips
It is important to have hands-on experience with SQL Server 2008 and to try the query syntax and configuration examples demonstrated in this manual. The wide availability of virtual server/PC products today and Microsoft's 180-day free trial of SQL Server 2008 should give you plenty of chances to work with the product before taking the exam.
Nickname or Common Reference    Example Operating Systems
x86        Microsoft Windows Enterprise 2003 R2 x86; Microsoft Windows 2008 R2 Enterprise x86
x64        Microsoft Windows Server 2003 Enterprise R2 x64; Microsoft Windows 7 Ultimate x64
Itanium    Microsoft Windows 2008 Server Enterprise R2 Itanium IA 64; Microsoft Windows 2003 R2 64-bit Itanium Datacenter
[Table: SQL Server 2008 hardware requirements, with Required RAM and Required Processors columns; the row data did not survive extraction.]
Account provisioning is a critical area during SQL Server installation. SQL Server provides two authentication modes: Windows authentication mode and Mixed Mode authentication, which allows SQL Server to maintain its own accounts for SQL Server access. Today's security best practice is not to configure any SQL Server in Mixed Mode, reducing the attack surface by not allowing authentication by the SQL Server itself. In some cases a legacy application or support consideration will require accounts in SQL Server; for the exam, keep in mind that using another instance of SQL Server to accommodate different security needs is the preferred approach. Account provisioning is also the stage of the installation when you decide who will administer the SQL Server. In SQL Server 2008 you must add at least one administrator during setup so the product can be managed once installed. This information should be prepared in advance of the installation; for example, a group in Active Directory may contain the designated SQL Server administrators, or a local group may be used when Active Directory is not in use.
Test Tip: Make sure you completely understand the ramifications of using Mixed Mode SQL security! In most cases, don't do it!
FILESTREAM Configuration
An important new feature in SQL Server 2008 is FILESTREAM support. FILESTREAM supports the storage of large images and other BLOB-type data over 1 MB, such as scanned images or JPEGs of receipts, and provides fast read access to the stored objects. The feature can be configured during installation or afterward with SQL Server Configuration Manager. FILESTREAM enables SQL Server to quickly access objects stored in the server file system rather than in a database table. Items stored in file system storage use the varbinary(max) data type, and the configuration requires the name of a Windows share.
Test Tip: The following T-SQL statement enables the FILESTREAM capability after the installation of SQL Server. The option 0 disables FILESTREAM, 1 enables FILESTREAM for T-SQL access, and 2 also allows FILESTREAM access for Win32 applications.
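The statement referenced by the tip above takes the form below (a minimal sketch; the access level value follows the 0/1/2 scheme just described):

```sql
-- Enable FILESTREAM for both T-SQL and Win32 streaming access (level 2)
EXEC sp_configure filestream_access_level, 2;
RECONFIGURE;
```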
The SQL Server Agent can alert via email, the SQL event log, or WMI. Windows Management Instrumentation (WMI) alerts are passed to a system running WMI-based agents, such as Microsoft System Center, and can use the rich programming capability of the WMI model. Alerts can also use Windows performance objects contained in the performance monitoring counter sets. If an alert should monitor SQL Server disk space, for example, it can use the Windows performance counter object LogicalDisk with the % Free Space counter. Combining SQL Server Agent performance condition alerts with the underlying Windows Server operating system performance objects is very powerful.
Figure 11: SQL Server Agent Alert for Database Disk Size > 80%
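An alert like the one shown in Figure 11 could also be created in T-SQL with sp_add_alert. The alert name, database name, and counter below are illustrative assumptions, but the 'object|counter|instance|comparator|value' format of the performance condition is the documented one:

```sql
-- Sketch: raise an alert when the database log is more than 80% used
EXEC msdb.dbo.sp_add_alert
    @name = N'Database log over 80 percent',
    @performance_condition = N'SQLServer:Databases|Percent Log Used|ponypowersalesDB|>|80';
```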
SQL Server Agent alerts work in conjunction with other SQL Server notification features. For example, if you have stored procedures running, ensure that they raise errors with RAISERROR ... WITH LOG so that the errors reach the configured SQL Server Agent alert.
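The WITH LOG clause writes the raised error to the Windows Application event log, which is what lets an Agent alert fire on it. A minimal example (message text and severity are illustrative):

```sql
-- Severity 16 user error, written to the Application event log
RAISERROR (N'Nightly load procedure failed', 16, 1) WITH LOG;
```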
The ubiquity of email-enabled cell phones today makes email notification a viable and effective SQL Server alert method. Emails configured as alerts can notify for both performance and event alerts. This configuration also builds on the Database Mail settings covered in Domain 1 of this manual.
Test Tip: SQL Server Agent uses the Windows Application event log to raise SQL Server Agent alerts. If the Application event log is configured not to grow and requires manual intervention to clear, then SQL Server Agent alerts will fail once the log is full.
Once a condition is selected when building the expression, the options for that field become available. In this example, setting a database policy for the recovery model, selecting the field @RecoveryModel makes the options Full, Simple, and Bulk-Logged available. Conditions can be combined to form multiple-element policies, and a policy can apply to all databases or only to selected databases. Policy configuration can also be restricted through policy facets; a facet can determine exclusions or other considerations for policies.
Policy Management contains an evaluation feature that allows evaluation of policies prior to production implementation. Facets of a database are its feature settings, and SQL Server policies set and manage those facets. Policies can evaluate the conditions they are going to employ, and different evaluation modes exist, including:
- On change: log, which logs compliance with the policy as it would have done in an evaluation mode.
- On change: prevent, which prevents the change from completing if the policy evaluation fails.
Test Tip: Any DMF policy left in the on demand evaluation mode will not enforce the policy; the policy must be set to on schedule to ensure it executes. For example, if an auditing policy does not allow SQLMail but SQLMail is running, the reason may be that the policy is set to on demand. When policies execute on schedule, they run under the sysadmin security role, which ensures that a proper security context is in use. SQL administrators can log on and evaluate a policy manually, but this uses the logged-on user's security context, which may not have all the permissions required to run the policy successfully.
- Full Backup – a complete backup of a database. This type of backup represents the database only at the time of the backup.
- Differential Backup – a backup that adds only the data changed since the previous full or differential backup.
- Transaction Log Backup – a complete backup of a transaction log. Transaction log backups require the full or bulk-logged recovery model. The full recovery model allows recovery back to a point in time by restoring backed-up transaction logs; the bulk-logged recovery model is for temporary use during a large bulk operation.
- Partial Backup – a backup that addresses only the full data in the primary filegroup and any specified filegroups.
- Differential Partial Backup – a backup that contains only the data extents modified since the most recent partial backup.
Establishing a SQL backup is best handled through the DMF's Maintenance Plan Wizard. The wizard is launched from the Maintenance Plans node under Management in SQL Server Management Studio. It allows the setup of multiple schedules when using full and differential backups, and different times for a transaction log backup. You can select the Backup Database (Full) option or others as required, and you can also select other common tasks, such as Shrink Database. The wizard allows configuration of the backup location, which can be disk or tape; normally, backups on disk are then backed up to tape by another product, such as VERITAS Backup Exec. Compression is also an option to ensure that data storage is optimized. For full backups, the option exists to append to the designated backup location or to overwrite it. In most cases you will append the backup data so that a point in time is selectable. The result of the backup job can be emailed or written to a file.
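The same full and differential backups the wizard schedules can be expressed directly in T-SQL. Database name and file paths below are illustrative assumptions:

```sql
-- Full backup, compressed, appended to the media set (NOINIT)
BACKUP DATABASE SalesDB
TO DISK = N'D:\Backups\SalesDB_full.bak'
WITH COMPRESSION, NOINIT;

-- Differential backup: only extents changed since the last full backup
BACKUP DATABASE SalesDB
TO DISK = N'D:\Backups\SalesDB_diff.bak'
WITH DIFFERENTIAL, COMPRESSION;
```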
The exam will challenge you to conduct a database restore for a problem with ponypowersalesDB. Pay close attention to the order of the backups and the time at which the exam indicates the problem occurred. In this example, a failure occurs at 3:00 PM (1500 hours). The restore order is (1) the full database backup from 2300; (2) the 1400 differential backup; and (3) each transaction log backup taken after the differential and before 3:00 PM. This restores the database as quickly as possible.
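The restore sequence above can be sketched in T-SQL as follows. The backup file names are illustrative; the key point is NORECOVERY on every step except the last:

```sql
-- 1. Full backup from 2300, left in the restoring state
RESTORE DATABASE ponypowersalesDB
FROM DISK = N'D:\Backups\full_2300.bak' WITH NORECOVERY;

-- 2. Differential backup from 1400
RESTORE DATABASE ponypowersalesDB
FROM DISK = N'D:\Backups\diff_1400.bak' WITH NORECOVERY;

-- 3. Final log backup before the failure, then recover
RESTORE LOG ponypowersalesDB
FROM DISK = N'D:\Backups\log_1430.trn' WITH RECOVERY;
```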
In some recovery cases, the restore will be of a database that has already been restored. The Full recovery model allows you to restore a database using the WITH RECOVERY option and allows the use of transaction log backups, while the Simple recovery model does not. Restoring a backed-up database without recovery leaves the database in a restoring or standby mode, after which the final restore WITH RECOVERY described above brings it online.
The COPY_ONLY backup option is also covered on the exam; it backs up data from a database or log file without affecting the backup sequence. If you recovered a file within a database from a previous backup and want to ensure playback of the transaction log backups for true point-in-time recovery, restore whatever part of the database file or filegroup is required, then back up the transaction log with the COPY_ONLY option, restore it WITH NORECOVERY, and then restore it again WITH RECOVERY. Using WITH RECOVERY as the last option in the restore brings the file back online. This is known as a tail-log transaction log backup: when the failure of the database is reported or known, the quick backup of the transaction log captures the transactions between the last log backup and the failure. In many cases this will make you the SQL administrator hero of the day, because normally no data is lost, especially if best practices are followed and the transaction logs are on different disks than the database files.
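A COPY_ONLY log backup of the kind described above might look like this (database name and path are illustrative):

```sql
-- Log backup that does not disturb the normal log backup chain
BACKUP LOG ponypowersalesDB
TO DISK = N'D:\Backups\tail_copyonly.trn'
WITH COPY_ONLY;
```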
Another option the exam will cover is the SQL Server 2008 backup option READ_WRITE_FILEGROUPS. In database design, read-only data is usually placed in separate filegroups within the database structure. Backups using this option are quicker because the read-only filegroups are skipped.
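A minimal sketch of such a backup (names and path illustrative):

```sql
-- Back up only the primary and read-write filegroups
BACKUP DATABASE SalesDB
READ_WRITE_FILEGROUPS
TO DISK = N'D:\Backups\SalesDB_rw.bak';
```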
Password Policies
SQL Server password policies provide the same robust security enforcement as the complex password enforcement in Windows 2003 and Windows 2008 Active Directory. The parameters include enforcing password complexity, enforcing password expiration, and enforcing the policy itself. The policy applies ONLY to SQL-based logins, not to Windows Active Directory accounts.
Test Tip: Use the ALTER LOGIN T-SQL command with CHECK_POLICY = ON to enforce the password policy.
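For example (the login name is illustrative):

```sql
-- Enforce the Windows password policy and expiration for a SQL login
ALTER LOGIN AppUser WITH CHECK_POLICY = ON, CHECK_EXPIRATION = ON;
```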
Fixed Server Roles and Definitions (only the SetupAdmin definition survived in the source; the others are restored from the standard SQL Server role definitions):
- SysAdmin – members can perform any activity in the server.
- ServerAdmin – members can change server-wide configuration options and shut down the server.
- SetupAdmin – any member can manage linked servers and SQL Server startup options and tasks.
- SecurityAdmin – members can manage logins and their properties and can GRANT, DENY, and REVOKE permissions.
- ProcessAdmin – members can end (KILL) processes running in an instance.
- DBCreator – members can create, alter, drop, and restore any database.
- DiskAdmin – members can manage disk files.
- BulkAdmin – members can run the BULK INSERT statement.
- Public – the default role to which every login belongs.
Predefined Database Roles and Definitions (the definitions did not survive in the source and are restored from the standard SQL Server role definitions):
- Db_owner – members can perform all configuration and maintenance activities on the database.
- Db_accessadmin – members can add or remove database access for logins.
- Db_datareader – members can read all data from all user tables.
- Db_datawriter – members can add, delete, or change data in all user tables.
- Db_ddladmin – members can run any DDL command in the database.
- Db_securityadmin – members can modify role membership and manage permissions.
- Db_backupoperator – members can back up the database.
- Db_denydatawriter – members cannot add, modify, or delete any data in the user tables.
- Db_denydatareader – members cannot read any data in the user tables.
User mapping is also important to complete the setup of database roles. When setting up a user, administrators have the option of mapping the user to a database and selecting a role. Again, best practice is to grant the least permissions necessary to maintain a secure audit posture. Logins not mapped to a database have no access to it.
Logon Triggers
Logon triggers execute stored procedure-like code in response to LOGON events in SQL Server. Logon triggers are covered on the exam in the audit and server control portions of the questions; when audit-type questions describe methods to track user activity at login, the answer is the logon trigger. The trigger runs for access to the SQL Server instance. Logon triggers can activate stored procedures and can enforce and prevent actions by checking conditions about the user session; for example, a logon trigger can ensure that a login is not running more than two sessions against an instance at any one time.
Test Tip: Exam questions that ask about restricting user sessions or checking the number of sessions are asking about the use of a logon trigger. The logon trigger is created in the master database to ensure instance-wide enforcement.
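The two-session limit described above can be sketched as a logon trigger. The trigger and login names are illustrative; the pattern follows the documented logon trigger example:

```sql
-- Roll back logons for AppUser once it already holds two user sessions
CREATE TRIGGER limit_connections
ON ALL SERVER
FOR LOGON
AS
BEGIN
    IF ORIGINAL_LOGIN() = N'AppUser' AND
       (SELECT COUNT(*) FROM sys.dm_exec_sessions
        WHERE is_user_process = 1
          AND original_login_name = N'AppUser') > 2
        ROLLBACK;  -- cancels the connection attempt
END;
```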
Impersonation
SQL Server impersonates credentials that originate from a Windows user when communicating within SQL Server or its services. When SQL Server runs a process, it uses the security context of the login, and this context carries over to other functions, processes, linked SQL Servers, and database access.
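Within T-SQL, this kind of context switching can be done explicitly with EXECUTE AS (the login name below is illustrative):

```sql
-- Switch execution context to another login, then switch back
EXECUTE AS LOGIN = N'CONTOSO\ReportUser';
SELECT SUSER_SNAME();  -- statements here run as the impersonated login
REVERT;
```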
SQL Server has the Common Language Runtime (CLR) built in to execute managed code within the SQL Server memory space. SqlContext.WindowsIdentity in the CLR is used to impersonate a Windows account with permission to access the required databases and code. This call sets the context under which the code executes, which allows the creation of Windows accounts with the appropriate permissions to run in the CLR space.
Test Tip: Exam questions that offer answers for impersonation by adding the Windows account to the
Active Directory Domain Admins group or local server Administrators group are wrong answers.
The enforcement of security always dictates the least amount of privilege to execute successfully.
Always evaluate exam questions in the security arena with a focus on least permissions.
The following are examples of server-level audit action groups:
- SUCCESSFUL_LOGIN_GROUP
- LOGOUT_GROUP
- FAILED_LOGIN_GROUP
- AUDIT_CHANGE_GROUP
- SERVER_PERMISSION_CHANGE_GROUP

The following are examples of database-specific audit specifications and should be applied to all databases:
- DATABASE_CHANGE_GROUP
- DATABASE_PERMISSION_CHANGE_GROUP
- DATABASE_OBJECT_ACCESS_GROUP

The following are database table-specific audit specifications that ensure auditing is applied at the lowest point of the database object:
- SELECT (this audit feature captures the SELECT statement, not the data!)
- INSERT
- UPDATE
- DELETE
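Audit action groups like those above are attached to a server audit. A minimal sketch (audit names and file path are illustrative):

```sql
-- Create the audit target and turn it on
CREATE SERVER AUDIT Audit1
TO FILE (FILEPATH = N'D:\Audits\');
ALTER SERVER AUDIT Audit1 WITH (STATE = ON);

-- Attach server-level action groups to the audit
CREATE SERVER AUDIT SPECIFICATION AuditLogins
FOR SERVER AUDIT Audit1
    ADD (FAILED_LOGIN_GROUP),
    ADD (SUCCESSFUL_LOGIN_GROUP)
WITH (STATE = ON);
```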
Protection is also required for the SQL Server audit files. Ensure that you are familiar with auditing the
reading and changing of the produced SQL Server audit files. An audit set on master.sys.fn_get_audit_file
records any login that reads the audit file or attempts to change it.
Each database has a property called Change Tracking, which is enabled at the database level and is then available to any table within the database. It tracks all changes to data in the tables for which it is enabled and allows you to specify an interval for retaining the change data.
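Enabling Change Tracking at both levels might look like this (database and table names are illustrative):

```sql
-- Enable change tracking for the database with a 2-day retention window
ALTER DATABASE SalesDB
SET CHANGE_TRACKING = ON
    (CHANGE_RETENTION = 2 DAYS, AUTO_CLEANUP = ON);

-- Then enable it per table
ALTER TABLE dbo.Orders
ENABLE CHANGE_TRACKING;
```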
DDL Triggers
DDL triggers are similar to logon triggers in that they execute in response to defined events. DDL triggers provide the power to enforce SQL Server auditing by ensuring code execution for defined events and alerting when required. DDL triggers track specific events within databases; for example, if a SQL administrator wanted an event written every time a database object is deleted, a DDL trigger would satisfy the requirement.
DDL triggers help collect audit activity in databases. For example, a SQL administrator always has the option of creating an audit table based on an XML column and inserting any DDL trigger events into the table for simple parsing and reporting later.
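The XML audit-table pattern just described can be sketched as follows (table and trigger names are illustrative):

```sql
-- Audit table holding the XML event payload
CREATE TABLE dbo.DdlAudit (EventData XML, LoggedAt DATETIME DEFAULT GETDATE());
GO
-- Log every DROP TABLE in this database
CREATE TRIGGER audit_drops
ON DATABASE
FOR DROP_TABLE
AS
    INSERT INTO dbo.DdlAudit (EventData) VALUES (EVENTDATA());
```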
A very powerful feature of DDL triggers is the ability to act on CREATE_USER actions and then execute procedures to enforce permissions management. For example, if you wanted every new user of a database to receive db_datawriter membership on creation, the DDL trigger can extract the user name from the XML-based event data and then form a SQL statement to execute the requirement.
Test Tip: When an exam question asks how to audit an event every time it occurs and write an event, the answer should focus on the DDL trigger capability.
C2 Audit Configuration
C2 audit mode is an older SQL Server audit posture that logs both failed and successful attempts to access statements and objects. This mode is easily turned on in SQL Server Management Studio (Security properties) or by using sp_configure. All C2 information is stored in the sys.traces catalog view.
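Enabling it with sp_configure looks like this ('c2 audit mode' is an advanced option, so advanced options must be shown first):

```sql
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'c2 audit mode', 1;
RECONFIGURE;
```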
Test Tip: C2 auditing collects a lot of data. If the file location where the logging is saved runs out of disk space, SQL Server will shut down. To bring SQL Server back up you must free up disk space, or, for the exam, know that you can start SQL Server and bypass logging with the command sqlservr -f (minimal configuration).
Note that the C2 audit standard has been superseded in SQL Server (and the industry) by Common Criteria.
Common Criteria
Common Criteria is an industry security certification standard promulgated through the NSA DBMS Protection Profile. SQL Server 2008 is compliant at EAL1+ when the Common Criteria compliance option is enabled.
Common Criteria addresses the following elements. Residual Information Protection (RIP) ensures the overwriting of bits in memory with a known pattern before the memory is reused by another process; this prevents buffer and other types of data harvesting from leftover processes in memory. Common Criteria supports the viewing of all login statistics, including successful logins, unsuccessful logins, the number of login attempts, and login times; these statistics are viewed in the sys.dm_exec_sessions Dynamic Management View (DMV). The last element of Common Criteria is that a DENY on a table takes precedence over a GRANT on a column of that table.
Common Criteria audit options are enabled via SQL Server Management Studio or with sp_configure. You must also download a script from the Microsoft SQL Server Common Criteria website to complete the configuration and then restart the SQL Server service.
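The sp_configure form is (an advanced option, and a service restart is still required as noted above):

```sql
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'common criteria compliance enabled', 1;
RECONFIGURE;
```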
Test Tip: When exam questions focus on logging all login attempts, both successful and unsuccessful, the answer will include enabling Common Criteria compliance. When you review the answers to this type of question, note that logon triggers execute only AFTER logons have occurred and capture only successful login activity.
The use of TDE requires underlying key configuration for SQL Server. You will need to create a master key, which is accomplished via the CREATE MASTER KEY statement. The next requirement is a certificate, which, like all certificate operations, can be generated by the Active Directory Certificate Authority, the local server certificate authority, or a third party such as VeriSign or a similar accredited certificate issuer. Creating the database encryption key (DEK) required for TDE is contingent upon these two steps.
Figure 23: SQL Server policy condition using the Surface Area Configuration facet to disable Service Broker
Once a condition is created based on a preconfigured facet, the condition rolls up to a policy on the server: for example, a combined security policy that includes the defined conditions, in this case those related to surface area configuration.
Figure 24: Adding the configured condition to a policy for Surface Area Configuration
Test Tip: In this area, the exam focuses on reusing policies for other SQL Server instances. SQL Server Management Studio allows the export of a facet as a policy to a file. On a new SQL Server instance, select the exported policy file, evaluate it, and apply it to correct any violations. sp_configure is the answer when exam questions ask you to evaluate a surface area solution for a single SQL Server instance rather than multiple instances.
Database Files
The fundamental building blocks of SQL Server are the collection of files that make up the complete database object. Anytime a database is created, two files are created by default. A database consists of a primary data file, with the extension *.mdf; optional secondary data files, with the extension *.ndf, organized into filegroups; and a log file, with the extension *.ldf. A database object always has a primary file, and any additional data files are secondary. Within each SQL database object are the various elements: tables, indexes, and schemas.
Figure 25: Database Files with extensions, Primary, Filegroups, and Logs
SQL database objects can contain multiple log files. When one log file fills up, the next log file is used.
This ensures that the database object stays online and does not suffer downtime.
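A database with a primary file, a secondary file, and two log files can be sketched as follows; the database name, logical names, and paths are hypothetical:

```sql
CREATE DATABASE SalesDB
ON PRIMARY
    ( NAME = SalesDB_Data,  FILENAME = 'D:\Data\SalesDB.mdf' ),
    ( NAME = SalesDB_Data2, FILENAME = 'D:\Data\SalesDB2.ndf' )
LOG ON
    ( NAME = SalesDB_Log,  FILENAME = 'E:\Logs\SalesDB.ldf' ),
    ( NAME = SalesDB_Log2, FILENAME = 'E:\Logs\SalesDB2.ldf' );
```

When the first log file fills, SQL Server moves on to the second, keeping the database online.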
SQL Server database files should be stored on separate disks or arrays to achieve fast throughput. The primary *.mdf and secondary *.ndf files should be on different disks or arrays than the *.ldf files. This supports the goal of disk read and write operations employing different I/O paths to different physical disks. Ideally, each disk subsystem path will use a different disk controller or Host Bus Adapter (HBA). The *.ndf files in a SQL Server database filegroup can also gain performance by being stored on different disk subsystems. In exam questions, note that RAID 1 is the best disk RAID type to support a log file; this also provides fault tolerance. Performance experts do not agree about the use of large RAID 5 volumes as locations for SQL Server *.mdf and *.ndf files, but this does provide a solid fault tolerance solution. In most cases, high-transaction SQL Server and OLAP implementations use RAID 10 (RAID 1 + 0) SAN configurations to combine the fault tolerance of mirroring (RAID 1) with the read and write speed of striping (RAID 0).
SQL Server 2008 stores data on pages. This is a common term in backup and restore scenarios because at times the restore problem is simply to restore a corrupted page holding some data within a row on that page. A standard page is 8 KB. A group of 8 contiguous pages in a SQL Server database file forms an extent, which is 64 KB. The exam expects an understanding of SQL Server database structures to this level, and questions emerge on the smallest elements of a row and table.
Database Filegroups
SQL Server database filegroups build on the common database file structures listed above and expand
the options for addressing data segregation, performance, and recoverability.
A filegroup stores multiple SQL Server database files. The creation of a database establishes the first file in the group as the PRIMARY. SQL Server administrators, in the design process, can create multiple filegroup objects, which hold the *.ndf files listed in the definition of database files. Filegroups allow the spreading of tables and indexes across physical files. Exam questions from the SQL Server Performance section (Domain 7) have filegroups as the basis for achieving great optimization in very large database implementations. You can create multiple filegroups and place them on different volumes to achieve better I/O. When queries execute against tables spread between filegroups, performance is increased.
Filegroups also provide the foundation for good database maintenance and backup planning.
Read-only data is placed into a read-only filegroup, which reduces backup time. One strategy is to place
large data in different filegroups and then use full backup or differential backup at different times to
protect the data. For example, a filegroup named PARTS is backed up fully on Monday and Thursday,
and a filegroup named SALES is backed up fully on Tuesday and Friday. This strategy saves time because
only portions of the database are backed up and on different days to avoid congestion when backing up
large amounts of data.
Test Tip: For exam purposes note that a transaction log can never be in a filegroup!
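The PARTS/SALES strategy above can be sketched as follows; the database name and file paths are hypothetical:

```sql
-- Add a filegroup and place a secondary data file in it
ALTER DATABASE 302Info ADD FILEGROUP PARTS;
ALTER DATABASE 302Info
    ADD FILE ( NAME = PartsData, FILENAME = 'D:\Data\302Info_Parts.ndf' )
    TO FILEGROUP PARTS;

-- Back up just that filegroup on its scheduled day
BACKUP DATABASE 302Info
    FILEGROUP = 'PARTS'
    TO DISK = 'F:\Backups\302Info_Parts.bak';
```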
The optionspec argument of ALTER DATABASE covers the following option groups: Auto_option, Change_tracking_option, Cursor_option, Database_mirroring_option, Date_correlation_optimization_option, Db_encryption_option, Db_state_option, Db_update_option, Db_user_access_option, External_access_option, Parameterization_option, Recovery_option, Service_broker_option, Snapshot_option, Sql_option, and Termination.
Figure 27: ALTER DATABASE command in action (changing recovery model to full)
Recovery Models
The type of recovery model used in the SQL Server database configuration will drive the restore methods available for SQL Server databases. SQL Server has three recovery models: full, simple, and bulk-logged. Each of these models has a different effect on how the database's transaction logs are handled. All of the recovery models are set in the Database Properties\Options node. Right-click a database in SQL Server Management Studio and select Properties.
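The same setting can also be changed with ALTER DATABASE; the database name is hypothetical:

```sql
-- Set each of the three recovery models with T-SQL
ALTER DATABASE SalesDB SET RECOVERY FULL;
ALTER DATABASE SalesDB SET RECOVERY SIMPLE;
ALTER DATABASE SalesDB SET RECOVERY BULK_LOGGED;
```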
The simple recovery model dictates that transaction logs are not kept. When database data is changed and committed, the transaction log in simple recovery mode is also cleared, or in SQL Server terms, truncated. Once a transaction log is truncated, it is no longer useful in the restore of a database. When a database is in simple recovery mode, transaction log backups are not available; only full and differential backups can be used. An SQL Server administrator supporting production databases would typically not use this recovery model. If a database is backed up each hour over six hours and a failure occurs at 5 hours and 55 minutes, the data loss window is 55 minutes, as the administrator would restore the backup taken at hour 5.
The full recovery model does not automatically truncate the transaction log at checkpoints; the log is only truncated when it is backed up. The full recovery model allows the introduction of two new backup and restore functions: differentials and transaction logs. In the full recovery model an administrator can restore one file, without the entire database, or one or more pages plus a transaction log. The full recovery model enables full point-in-time recovery for a database for any time covered by the backed-up transaction logs.
The bulk-logged recovery model is like the full recovery model except that it supports bulk operations and is designed to supplement the full recovery model. This model is a performance-enhancing mode for intensive database operations such as bulk imports of data or index creation. Bulk logging reduces the log space required. However, one key difference from the full recovery model is that the bulk-logged model does not support point-in-time recovery!
The company does not have much available storage space for backups. Some of the answers will offer the differential backup as an option. This option, however, uses more disk space because it backs up everything changed since the last full backup. Answers will also offer different time intervals. Always ensure you match your answer with the solution; in this case the operative timeframe is 10 minutes.
Database Backups
Database backups and restores occupy a large portion of an SQL administrator's time during support of production database operations. This area is also crucial to successfully passing MS Exam 70-432. Domain 4 adds to the basics of backups and restores covered in Domain 2. In this section we will investigate more scenarios for backup and restore questions, versus core definitions and how to set up and conduct these services.
The business division, which uses the 302Info database for corporate sales, has requested that you devise a backup implementation that achieves the company's business goals. The database contains the following filegroups:

- PowerAccessoriesSales
- Parts
- CustomerInfo

Test Tip: The above scenario demonstrates a method you can use to evaluate the requirements of a question on the exam, and how to eliminate wrong answers and isolate the correct one.
Verifying Backups
Verifying backups in SQL Server is an important administrator task, and one that should happen often to ensure that when disaster strikes you are ready. The exam will expect you to know how to verify a backup during the backup process and also manually.
The Verify backup option is available when setting up a backup job through the SQL Server Maintenance Plan interface. This option will ensure that the backup is complete and the volumes are readable. Another verification method is the option of performing a checksum before writing to media. With this option, the backup process checks the media, and on finding an error it continues but does not use that portion of the backup volume.
Figure 29: SQL Server Backup options for reliability and verification
Restoring Databases
The restore process for SQL Server databases takes several routes. Keep in mind for the exam that the SQL Server database recovery model dictates the route for the restore operation. If the recovery model is set to simple, then a transaction log restore is not possible because it was not an option in the backup configuration.
SQL administrators will excel on the exam by understanding the following restore methods, and this will translate to successfully supporting corporate SQL database operations. The first time you conduct a restore from well-managed backup options that save a company money and time, you will be the hero!
- Complete database restore:
  - Simple recovery model: restores the entire database with no transaction log capability.
  - Full recovery model: restores the entire database, and allows the restore of any subsequent log backups taken after the full backup. As with all full recovery model restores, the last log is restored with the RESTORE WITH RECOVERY option.
- Restore of a data file, which can include different filegroup database files and read-only files:
  - Simple recovery model: restore individual files only if the database has at least one read-only filegroup.
  - Full recovery model: restore individual files without restoring the entire database. Filegroups being restored are always offline.
- Page restore:
  - Full recovery model: page restoration relies on having an unbroken chain of log backups up to the current log file.
- Piecemeal restore: recovers databases in stages at the filegroup level, starting with the primary and working through the remaining read/write or secondary filegroups.
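A full recovery model restore sequence can be sketched as follows; only the final restore uses WITH RECOVERY. Database, file, and path names are hypothetical:

```sql
-- Restore the full backup, leaving the database in a restoring state
RESTORE DATABASE SalesDB
    FROM DISK = 'F:\Backups\SalesDB_Full.bak'
    WITH NORECOVERY;

-- Apply the log chain in order
RESTORE LOG SalesDB
    FROM DISK = 'F:\Backups\SalesDB_Log1.trn'
    WITH NORECOVERY;

-- The last log backup brings the database online
RESTORE LOG SalesDB
    FROM DISK = 'F:\Backups\SalesDB_Log2.trn'
    WITH RECOVERY;
```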
Backup Type           Time
Full Backup           2300 nightly
Differential Backup   1000 daily
Differential Backup   1400 daily
At 1310 on a Tuesday, the SAN array hosting the database file abends (abnormal end to operations) and goes offline. How will you quickly recover the database?
When thinking about the solution above, using these steps you can successfully restore the database back to its state at 1300, for a data loss window of 10 minutes.
The backup of the transaction log is detailed below in the Transaction Log Tail Restore section. This last transaction log backup, taken prior to the restore, allows the recovery process to restore even some of the transactions between 1300 and the time of the failure at 1310. Because a differential backup is conducted at 1000 daily, the restore takes less time, as there is no need to restore logs going back to 2300 the night before.
Rule out answers that do not back up the log as the first step of the restore process. Beware of answers that suggest that a restore of the latest transaction log is the only requirement for logs. Also note answers that restore transaction logs from the original full backup time and do not incorporate the differential backup.
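The sequence described above, starting with the tail-log backup, can be sketched as follows; database, file, and path names are hypothetical:

```sql
-- Step 1: back up the tail of the log before restoring anything
BACKUP LOG SalesDB
    TO DISK = 'F:\Backups\SalesDB_Tail.trn'
    WITH NO_TRUNCATE;

-- Step 2: restore the last full backup, then the latest differential
RESTORE DATABASE SalesDB
    FROM DISK = 'F:\Backups\SalesDB_Full.bak' WITH NORECOVERY;
RESTORE DATABASE SalesDB
    FROM DISK = 'F:\Backups\SalesDB_Diff.bak' WITH NORECOVERY;

-- Step 3: apply log backups, finishing with the tail
RESTORE LOG SalesDB
    FROM DISK = 'F:\Backups\SalesDB_Tail.trn' WITH RECOVERY;
```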
Database Snapshots
Database snapshots provide another level of recoverability and fault tolerance in protecting SQL Server. Microsoft's database snapshot technology builds on its expertise with virtual drive snapshots for Windows operating system products, but adds the important SQL-unique snapshot requirement of ensuring that the database snapshot is readable by SQL Server. A snapshot is a point-in-time picture of the state of a database and all of its objects. Database snapshots are extremely useful for code updates that affect table structures or other database stored procedures. When a code update goes awry, or some new process removes a table or index by mistake, reverting to the previous database snapshot is faster than a restore and ensures rapid recovery capability.
Test Tip: There is no capability within the SQL Server Management Studio graphical tools to create a database snapshot.
The database snapshot is read-only. However, the contents of the created snapshot are usable with T-SQL commands for restore back into the source database.
Snapshots are created with the CREATE DATABASE command. An example is shown below.
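A sketch of such an example follows; database, snapshot, and file names are hypothetical, and the logical NAME must match that of the source data file:

```sql
-- Create a snapshot of the SalesDB database
CREATE DATABASE SalesDB_Snapshot_1300 ON
    ( NAME = SalesDB_Data,
      FILENAME = 'D:\Snapshots\SalesDB_1300.ss' )
AS SNAPSHOT OF SalesDB;

-- Revert the source database to the snapshot if needed
RESTORE DATABASE SalesDB
    FROM DATABASE_SNAPSHOT = 'SalesDB_Snapshot_1300';
```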
Database Integrity
Database integrity refers to the usability of SQL Server databases. Are they fit? Are they corrupted? The primary tool in your SQL administrative toolkit for analyzing database integrity is the DBCC command. Common SQL Server challenges for databases include torn pages, which affect performance and data retrieval. Abrupt power outages can contribute to database damage. The DBCC command and its options allow a range of checks and repair activity.
DBCC CHECKDB
Database Console Commands or DBCC are used to conduct maintenance and integrity checks
on databases. DBCC has a set of statements related to validation, which help ensure database integrity.
The primary statement used for validation is DBCC CHECKDB. This check will ensure correct allocation and
structural integrity of all objects in a database. It will identify and repair a wide range of possible errors.
Exam Prep: Using DBCC CHECKDB
You are responsible for a database named 302Info on a SQL Server 2008 instance. 302Info has six filegroups, with tables located over several filegroups. A problem has occurred, and a developer has indicated a check of the database's integrity is needed as quickly as possible.
What will you do?
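A hedged sketch of the two common options: a full check, and a faster PHYSICAL_ONLY check that skips logical validation when speed is the priority:

```sql
-- Full allocation, structural, and logical integrity check
DBCC CHECKDB ('302Info');

-- Faster physical-structure-only check for a quick answer
DBCC CHECKDB ('302Info') WITH PHYSICAL_ONLY;
```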
Maintenance Plans are built from the Maintenance Plans node in SQL Server Management Studio. The wizard supports rapid point-and-click creation of maintenance plans with all required options, which include time, schedule, databases to affect, types of backups, and notifications via text files and email reports. The large number of predefined maintenance tasks facilitates the quick build of multi-step plans and allows individual step schedules.
The Maintenance Plan designer allows the manual creation of database maintenance plans and the editing of created plans.
Never reuse a data file name for a BCP export if the name is already in use; this will OVERWRITE the current exported data in the file.
The SELECT permission is the minimum security requirement to export data.
When exporting data, SQL Server 2008 uses parallel scans for export retrieval, i.e., data is not retrieved in any specific order. When the exam mentions a specific order for an export, always use the queryout option explained below, with an ORDER BY clause.
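An ordered export can be sketched with queryout; the server, database, column, and file names are hypothetical:

```
bcp "SELECT PartID, PartName FROM 302Info.dbo.Parts ORDER BY PartID" queryout parts.dat -c -T -S MyServer
```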
Import

- The destination table for the data import must already exist.
- The target table must have a corresponding column for the imported data.
- Use XML formats with a format file to import fixed-length fields.
- CSV files are not supported for bulk import operations; however, if data fields do not contain a field terminator (normally the comma in a CSV file), or none of the data is in quotation marks, then CSV-based data may import.
BCP
The BCP command can export or import data from a SQL database object. The BCP utility is command-line based, and accessed via bcp.exe. This utility is beneficial in that it runs in a small memory space and is fast. When exam questions focus on quick, out-of-band SQL Server data import or export--think BCP! It can be accessed via the Windows command line, in a script, or with SQL Server PowerShell.
The most commonly used switches for BCP are listed below:

BCP Setting        Explanation
-a packetsize      Specifies the network packet size, in bytes, used for the transfer.
-b batchsize       Sets the number of rows imported per batch.
-c                 Specifies the use of character mode for data transfers. This supports a non-SQL destination and will output data in ASCII text.
-C codepage        Sets the relevant code page to use in the transfer (relevant only when the data contains character values greater than 127 or less than 32).
-e errfile         Names the error file that receives rows BCP could not transfer.
-E                 Keeps identity values from the data file.
-F firstrow        Sets the first row to import or export.
-f formatfile      A very important component of any BCP command syntax. The format file is a file with an extension *.fmt and contains key information for the import or export job.
-h loadhints       Supplies load hints (such as TABLOCK or ORDER) for a bulk import.
-i inputfile       Names a response file containing replies to interactive prompts.
-k                 Keeps nulls; empty columns retain NULL instead of default values.
-L lastrow         Sets the last row to import or export.
-m maxerrors       Determines the number of errors which can occur before stopping the BCP.
-N                 Uses native mode for noncharacter data and Unicode for character data.
-n                 Uses native (database) data types for the transfer.
-o outfile         Names the file that receives BCP output messages.
-P password        Specifies the login password.
-r rowterminator   Sets the row terminator (newline by default).
-S servername      Specifies the SQL Server instance to connect to.
-T                 Uses a trusted connection (Windows authentication).
-U username        Specifies the login ID.
-w                 Uses Unicode character mode for the transfer.
-x                 Used with -f to generate an XML-based format file.
A sample export command with BCP using a trusted connection is listed below. The data exported comes from the Headers table in the Parts schema of the 302info database. The -T switch ensures a trusted connection at the command line, and -c specifies character mode.
bcp 302info.parts.headers out headers.dat -T -c
Exam Prep: Using BCP
You manage several databases, one of which is called Accounts. This database has several tables whose data customers require monthly, and the data is very large. When exporting this data using SQL Server-based SSIS packages, you notice the export takes a very long time and affects the capacity of your SQL Server. Since your operations are 24/7, any latency-inducing operations are not good. What actions should you take?
Solution: Using BCP
The answers on the exam will focus on many methods for exporting and importing, and appear to always mix in an XML file to help control data that is not formatted correctly. Other web service terms such as XSLT and XSD do not APPLY to data import and export with BCP. The focus must be on time and efficiency, which BCP adheres to since it runs in a separate memory space, has the capability of running queries for data, and is fast. As you evaluate the possible answers, remember that BCP is always good for offsite movement of data because its modes can support character-based export, since the target system may not be a SQL Server. Never forget that BULK INSERT is as advertised; it can only import data. BCP is very scriptable in CScript, PowerShell, and SQL Agent jobs.
BULK INSERT
The BULK INSERT command is very similar in syntax to BCP but only addresses the import, not the export, of data. Note for the exam that BULK INSERT capability is related to setting the BULK_LOGGED recovery model. Use this recovery model during BULK INSERT to speed up the process with fewer transaction log writes. Database recovery models are flexible and can shift between BULK_LOGGED and FULL with no issues.
A sample BULK INSERT is listed below, where table data is inserted into the Parts table object of the headers database from a file named lineitem.tbl, with a field terminator switch and a row terminator switch.
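A sketch of the described statement; the file path and the terminator values are assumptions:

```sql
BULK INSERT headers.dbo.Parts
FROM 'C:\Data\lineitem.tbl'
WITH (
    FIELDTERMINATOR = '|',   -- assumed field terminator
    ROWTERMINATOR   = '\n'   -- assumed row terminator
);
```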
The advantages of BULK INSERT are that it is great for importing large amounts of data, and its T-SQL syntax switches are very similar to BCP's. BULK INSERT has a corresponding recovery model to enhance the insertion of very large amounts of data: BULK_LOGGED. This command also works very well with constraints that include NULL values, clustered indexes, and foreign key constraints.
OPENROWSET
OPENROWSET provides another SQL Server 2008 method for data import. OPENROWSET is a function used among SQL Servers for one-time, ad hoc data access. It is an OLE DB function called within the FROM clause of SELECT, INSERT, UPDATE, and DELETE T-SQL commands. OPENROWSET can also read bulk data from a file and then return it as a row set. This command allows rapid return of rows from a source and insertion of the data into the destination.
The following example demonstrates the use of OPENROWSET with a SELECT statement and SQL Server OLE DB as the provider. Different data providers may be invoked as required. The provider is listed first, then the connection information, followed by the SELECT syntax, the FROM syntax, and additional commands as required. One implementation note for using OPENROWSET is that Ad Hoc Distributed Queries must be configured to on, or a state of 1; this is disabled by default.
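A sketch of such an example; the remote server, database, and table names are hypothetical:

```sql
-- Enable ad hoc distributed queries (disabled by default)
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'Ad Hoc Distributed Queries', 1;
RECONFIGURE;

-- Ad hoc query against a remote server via the SQL Server OLE DB provider
SELECT p.*
FROM OPENROWSET('SQLNCLI',
                'Server=RemoteServer;Trusted_Connection=yes;',
                'SELECT PartID, PartName FROM 302Info.dbo.Parts') AS p;
```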
SQL Server Management Studio GUI Data Import and Export Capability
SQL Server 2008 provides a rich point-and-click wizard to set up data import and export packages via tasks established in SQL Server Management Studio. The graphical Import and Export Wizard creates packages that can import and export data in an ad hoc, as-required manner or as a scheduled job. The packages the wizard creates are SSIS packages and can be managed in SQL Server Business Intelligence Development Studio. The wizard is accessed by right-clicking a database in SQL Server Management Studio and selecting Import or Export under Tasks.
Figure 37: SQL Server GUI Data Import and Export Wizard
The point-and-click wizard allows setting all package elements for the data import or export activity. Key settings created in the wizard include:

- Data source (SQL Server, OLE DB, Microsoft Excel, or Microsoft Access)
- Authentication methods
- Destination server
- Data copy options, to include one or more tables or views, or a query to establish content
- Table mappings
- Preview of results
- The ability to run the package immediately or save it for future use as an SSIS package
- The ability to send exported data to multiple formats, to include flat files, Microsoft Excel, and Microsoft Access formats
Test Tip: Exam questions that describe regularly scheduled table data copies and import or export of data
from Microsoft Excel and Microsoft Access to SQL Server are indicating the use of SSIS.
Data Partitions
SQL Server 2008 tables and indexes can exist in more than one partition. This is termed horizontal data partitioning because groups of rows are organized into individual partitions based on an identified column. This type of table organization is critical to large database efficiency and performance. Partitioning considerations include whether the table will host data used in different ways, and any query or other database operations that are latent or not executing properly. Additionally, you can partition a table using the concept of vertical partitioning, which spreads the columns of a table across multiple tables rather than the rows.
If a large table for auto parts exists called PARTS, and the primary database operations against it are INSERT or DELETE (some of the data is only used for SELECT queries), a partition could occur on the PARTS types, isolating the items used in SELECT queries from the ones used in INSERT and DELETE functions. Another good example would be an order table in which the current month's data is used primarily for INSERT or DELETE operations. The previous months' data, which is mostly used for SELECT queries, represents a target for partitioned tables. This ensures database functions do not scan or query an entire large table.
The following commands support the partition function: CREATE PARTITION FUNCTION, ALTER PARTITION FUNCTION, and DROP PARTITION FUNCTION. The exam focuses on an understanding of these commands and when to use them to manage a data partition. Each new partition must have a range, called the split range, and the values must be unique from the existing table. If a partition function takes in the year 2006 for orders, then the new range must start with a different value, 2007, for example. The merging of data into the new partition is accomplished once a range is created.
The steps for creating a table partition are:

1. Create or use an existing partition function that sets the correct range boundaries (CREATE PARTITION FUNCTION).
2. Create or use an existing partition scheme tied to the partition function.
3. Create the table using the partition scheme.
This sample function demonstrates the creation of four partitions, one per year range, with an int column.
CREATE PARTITION FUNCTION myDateRangeFP1 (int)
AS RANGE LEFT FOR VALUES (2006, 2007, 2008)
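Steps 2 and 3 of the creation steps above can be sketched as follows; the filegroup, table, and column names are hypothetical. RANGE LEFT with three boundary values produces four partitions, so the scheme maps to four filegroups:

```sql
CREATE PARTITION SCHEME myDateRangePS1
AS PARTITION myDateRangeFP1
TO (FG2006, FG2007, FG2008, FG2009);

CREATE TABLE dbo.Orders
(
    OrderID   int NOT NULL,
    OrderYear int NOT NULL
)
ON myDateRangePS1 (OrderYear);
```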
This sample function demonstrates the alteration of an existing partitioned table in which orders received
in 2009 would split out.
ALTER PARTITION FUNCTION myDateRangeFP1 ()
SPLIT RANGE (2009)
A data partition is easily dropped with the following command:
DROP PARTITION FUNCTION myDateRangeFP1
Exam Prep: Altering a Data Partition
You manage a large database named PARTS hosted on SQL Server 2008. The PARTS database has a table called INVENTORY, which is partitioned on its ReceivedDate column using the PFReceivedDate function. This partition uses one-month ranges.
You need to write a script that moves data more than one year old into an empty partition of the table partitioned by PFReceivedDate. Which code should you use?
Exam Solution: Altering a Data Partition
This example requires the ALTER PARTITION FUNCTION command with the appropriate syntax and switches. You will run ALTER PARTITION FUNCTION PFReceivedDate() SPLIT RANGE (datelastyear). This will split out the data from last year. Then you can merge the data into the empty partition, and add to it as needed.
Test Tip: The partitioned table feature is only available in SQL Server 2008 Enterprise and
Developer versions, not Standard.
Data Compression
SQL Server 2008 has several methods for data compression within the database object itself. The exam expects an understanding of when to employ data compression based on the situation, and how it will affect the parameters of the database object.
SQL Server 2008 row compression allows the storage of fixed-length data in a variable-length format, and page compression attacks data redundancy.
SQL Server 2008, unlike SQL Server 2005, extends the capability of row compression to all fixed-length data types, to include integer, char, and float data types. This support allows data compression without requiring any change in the application using the data. The goal is to reduce byte storage consumption. If a common integer can fit in 1 byte, there is no reason to use 4 bytes to store it; by using a variable-length storage format, 3 bytes are saved. Variable characters are now a standard option on column definitions.
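Row and page compression can be applied with ALTER TABLE; the table name is hypothetical:

```sql
-- Rebuild a table with row compression
ALTER TABLE dbo.Parts REBUILD WITH (DATA_COMPRESSION = ROW);

-- Or with page compression, which also attacks data redundancy
ALTER TABLE dbo.Parts REBUILD WITH (DATA_COMPRESSION = PAGE);
```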
Maintaining Indexes
SQL Server administrators must understand all types of indexing for database tables to ensure accuracy when answering exam questions. The types of SQL Server indexes include:

- Clustered indexes
- Non-clustered indexes
- Unique indexes
- Composite indexes
- XML indexes
Certain index types are linked to the column type in use. A spatial index can only be used with a spatial data type. A spatial index, as with other indexes, requires a primary key prior to creation. A spatial index supports unique SQL Server data type features in the area of geometry and geography, which use the spatial data types. Indexes are an underlying performance enhancement, supporting rapid queries and speed of data access. SQL Server database tables can contain clustered or non-clustered indexes.
A clustered index keeps the actual data rows at the leaf level of the index. Normally, they are in ascending or descending order. Only one clustered index can exist on any one table or view. The term heap refers to a table that has no clustered index (think slow or small!). Non-clustered indexes differ from clustered indexes in that they don't actually contain the data in the rows, but only pointers to the data. Non-clustered indexes can't be sorted, but you can create multiple non-clustered indexes per table or view; SQL Server 2008 supports 999 non-clustered indexes per table. The composite index and the unique index are two more forms of SQL Server table indexing. A composite index contains multiple columns, up to 16. A unique index, as you may have guessed, is based on a unique table parameter such as a primary key. For example, when you create a primary key in a table, an automatic unique clustered index is created. XML columns in tables require an XML index, which creates an indexed view of all the XML tags, values, and paths of the XML data.
As an SQL administrator you will need to understand the commands available in SQL Server 2008 to change, delete, or add indexes. These tasks are always included in other SQL Server maintenance operations. They are also included in performance tuning, as dropping and rebuilding indexes is a fairly common SQL Server administrator task. The exam will expect fluency with the following T-SQL commands:

- CREATE INDEX
- ALTER INDEX
- DROP INDEX
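The commands can be sketched together as follows; the index, table, and column names are hypothetical:

```sql
-- Create a non-clustered index on the Type column
CREATE NONCLUSTERED INDEX IX_CARTYPE ON dbo.Cars (Type);

-- Rebuild it online to address fragmentation
ALTER INDEX IX_CARTYPE ON dbo.Cars REBUILD WITH (ONLINE = ON);

-- Remove it
DROP INDEX IX_CARTYPE ON dbo.Cars;
```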
Column Name   Type of Data   Data In Indexes
CarID         Char(10)       IX_PRIMARY, IX_CARTYPE, IX_CARPRICE
Type          Varchar(25)    IX_CARTYPE
Description   XML            None
Value         Decimal        IX_CARPRICE
Performance testing indicates you need to add the Description column to the IX_CARTYPE index as an included column. Table downtime should be minimized. What is the solution?
Exam Solution: Alter Index
You would use CREATE INDEX WITH DROP_EXISTING and perform this command offline. The requirement to rebuild this index offline is due to adding an XML column, which is a large data type; you can rebuild the index online if a large data type is not included. Anytime a column is added to an index you must drop the existing index, and in this case ALTER INDEX will not work because you can't add a column with this command. In exam questions that focus on solving slow query times due to index fragmentation, ALTER INDEX is a solution with the REBUILD WITH (ONLINE = ON) option.
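The solution can be sketched as follows, assuming a hypothetical table name of dbo.Cars:

```sql
CREATE NONCLUSTERED INDEX IX_CARTYPE
ON dbo.Cars (Type)
INCLUDE (Description)
WITH (DROP_EXISTING = ON, ONLINE = OFF);
```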
Test Tip: Exam questions that focus on index management will always use DBCC commands in
possible answers. Take note that these commands, while good for database maintenance, do not
address most index management issues.
C2 audit mode and Common Criteria, along with other logging choices, can impact disk space and cause SQL Server failures. When logging this much data, disk space is consumed quickly. If smaller drives are used as the location for audit log files and the physical disk fills up, SQL Server will not start. The solution on the exam in this case can range from starting SQL Server with the -f switch to moving the audit files to another drive with more space.
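Starting the service with minimal configuration can be sketched as follows, assuming the default instance name:

```
net start MSSQLSERVER /f
```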
The account under which an SQL Server service runs also presents an area for troubleshooting. If you use Active Directory domain accounts as the service accounts for SQL Server services, and any Active Directory group policy requires those accounts to change their passwords (or the passwords expire), the SQL Server service using that Active Directory account will fail.
Normally, you can use PowerShell scripts or other management scripts to change SQL Server Active Directory account passwords within the required time, which ensures that the account changes passwords per company policy or best practice. However, when an Active Directory account expires, locks out, or requires a password change, the SQL Server service in question can switch to Local System to get SQL Server started again immediately while troubleshooting. Again, for exam purposes, remember that SQL Server Configuration Manager supports rapid changes of SQL Server service account information.
Exam Prep: Troubleshooting SQL Server Services
You have two Microsoft SQL Servers on your network. One is an SQL Server 2005 instance, and the other
is an SQL Server 2008 instance. SQL Server is configured with default connection information. The SQL
Server Agent runs on both SQL instances.
A new corporate application will access both SQL Servers for its data. You are working with the application
developers to troubleshoot any connectivity problems that may arise.
The developers report that they can only find connections to the SQL Server 2005 instance, but are forced
to specify a port when connecting to the SQL Server 2008 instance.
You are required to ensure both SQL Servers can return connection information automatically without
specifying a port. The solution should not change the established network security. What is the solution?
Exam Solution: Troubleshooting SQL Server Services
In this type of exam question the challenge of connecting to SQL Server 2008 is presented. The service
in question should immediately come to mind: the SQL Server Browser service. It is not configured
to run automatically in SQL Server 2008, which explains why the developers cannot enumerate or find
the instance automatically. The simple solution is the correct one: start the SQL Server Browser service
and configure the service to start automatically. Answers that focus on changing an Active Directory
service account are not correct. Other answers on the exam for this scenario tend to focus on the SQL
Server Agent service, which again is not related to this issue.
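The fix described above can be sketched from an elevated command prompt; SQLBrowser is the standard service name, but verify it on the target host:

```shell
:: Configure the SQL Server Browser service to start automatically, then start it
sc config SQLBrowser start= auto
net start SQLBrowser
```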
Concurrency Problems
SQL Server operations sometimes suffer latency or dropped transactions due to blocks, deadlocks, or locks.
These are problems that require rapid identification with the SQL Server 2008 tools available.
As SQL Server reads and writes data to a database, locking occurs to support concurrent operations. This is
SQL Server's method of ensuring that the same data is not written to the same row/page at the same time.
This is also the heart of shared database access by users and applications.
The exam focuses on the tools available to troubleshoot concurrency problems and when to use them.
The SQL Server tools for diagnosing concurrency problems are SQL Server Profiler, Dynamic
Management Views (DMVs), and Activity Monitor.
Blocking occurs when a lock is created and then escalated based on the size of the scan. For example, a
scan of data in a table that exceeds 5,000 locks will cause a lock escalation and ultimately block the next
process requesting the reading or writing of data. Blocks by default do not time out. You can monitor SQL
Server for block conditions using SQL Profiler with a filter on the Lock:Escalation event,
the Dynamic Management Views (DMVs), or regular polling of system information for lock conditions.
These tools will help gather information on blocks, such as the block and lock time per object,
which will lead to a plan to remediate the condition. SQL Profiler is available from SQL Server
Management Studio and allows you to record information over time for a deeper analysis of the
blocking issue. The results are recorded to a trace file and are available for examination.
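A quick DMV check for live blocking can be sketched as follows; the view and columns are standard system objects:

```sql
-- Sessions currently blocked, the session blocking them, and the wait involved
SELECT r.session_id, r.blocking_session_id, r.wait_type, r.wait_time
FROM sys.dm_exec_requests AS r
WHERE r.blocking_session_id <> 0;
```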
Test Tip: On the exam, questions focusing on time as a component of troubleshooting the lock issue
suggest the use of SQL Profiler.
Figure 40: SQL Server Profile Lock Events for Tracking Block Conditions
Deadlocks occur when several tasks within the same SQL Server memory space block each other, each
holding a lock on a resource that the other is trying to lock. This situation can involve
a variety of resources, including locks, worker threads, memory, parallel query execution, and multiple
active result sets. SQL Server is configured to automatically search for deadlocks every 5 seconds with
the database engine's deadlock detection scheme. The database engine resolves a deadlock by choosing
one thread to continue. This results in error 1205, which is prevalent in
the exam questions on how to track deadlock conditions. Additionally, trace flags 1204 and 1222 exist
to document deadlock conditions that have occurred. You can turn on trace flag 1204 as a standard event
to write to the SQL Server error log, which will allow parsing of this error by your management activities
and response to the occurrence. The flags list all relevant information for the deadlock, including what
process was running, the SPID, the mode, the row in question, the object, the key, and other metadata
that is available. When using SQL Profiler to track deadlock activity, ensure you select the Deadlock Graph
event as an option.
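Enabling the trace flags mentioned above can be sketched in T-SQL; the -1 argument applies them globally rather than to the current session only:

```sql
-- Write deadlock details to the SQL Server error log for all sessions
DBCC TRACEON (1204, -1);
DBCC TRACEON (1222, -1);
```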
The EXECUTE AS command also provides a security context to support SQL Server Agent job execution. In some
cases the proxy account or other security context may not have the correct permissions to execute a
T-SQL job step. The EXECUTE AS command allows the use of an alternate security context to complete the T-SQL step.
A SQL Server Agent proxy can't be used with a T-SQL job step because a proxy maps to a Windows user account.
The stored procedure sp_enum_proxy_for_subsystem returns which proxies have
permission for which SQL Server Agent job subsystems. This is another useful tool in troubleshooting
permission problems.
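A minimal sketch of switching context inside a T-SQL step; the user name here is a hypothetical database user, not one from the text:

```sql
-- Run the step's statements as a specific database user, then revert
EXECUTE AS USER = 'ReportUser';  -- hypothetical user with the needed permissions
SELECT USER_NAME();              -- confirms the impersonated context
REVERT;
```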
Another area of focus for the successful execution of SQL Server Agent jobs is the ability to connect to
the required SQL instances. In multi-instance SQL Server configurations the SQL Server Agent can serve in a
master role and then target other SQL Server instances for job execution. However, to ensure that jobs
work without SQL Server port or instance connectivity issues, make sure you create a SQL Server alias for
the target servers. Additionally, during the creation of the Master/Target SQL Server job setup you can
specify the servers for connections.
Helpful diagnostic information for the status of steps in SQL Server Agent jobs can be obtained by
configuring each job step to log detailed information to a table. The contents of this table can be accessed
quickly with the stored procedure sp_help_jobsteplog. The SQL Server Agent core options
support settings for the maximum number of rows entered for each job and the maximum size of the
job log.
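Retrieving that step log can be sketched as follows; the job name is an assumption:

```sql
-- List the logged output of every step in a job
EXEC msdb.dbo.sp_help_jobsteplog @job_name = N'NightlyImport';
```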
Figure 43: SQL Server 2008 Combined Log View with Filter Settings
Custom error information can be written to the SQL Server logs or Windows event logs. The management
stored procedure xp_logevent writes user-defined messages to a log destination. Stored procedures in
use in SQL Server can harness this capability to ensure database errors are meaningful and assist in the
troubleshooting process. User-defined error messages (added with sp_addmessage) are stored
in the sys.messages catalog view; a query against this view returns every defined message
regardless of its severity level.
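Writing a custom entry with xp_logevent can be sketched as follows; the message number (which must be greater than 50000) and message text are assumptions:

```sql
-- Log a user-defined warning to the SQL Server error log
-- and the Windows application event log
EXEC master.dbo.xp_logevent 60001, N'Nightly load completed with warnings', 'WARNING';
```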
Exam Prep: Finding SQL Error Information for Job Execution
You manage 10 instances of Microsoft SQL Server 2008. One instance has a problem every
hour with the SQL Server Agent service. Troubleshooting this problem requires detailed information about
the job execution. Execution trace messages should write to the SQL Server Agent error log. How will you
achieve this in the quickest manner?
Exam Solution: Finding SQL Error Information for Job Execution
This exam question requires SQL Server Management Studio. A quick right-click on the SQL Server Agent
node will make this change easily. This type of question will present solutions that focus
on triggers, which do not address this logging question. Other solutions suggest using DBCC to
turn trace flags on. Again, this does not address configuring the SQL Server Agent to write
execution trace messages.
Figure 44: Setting the Execution Trace Messages Option for SQL Server Agent
An illustrative example involves preventing users in a group called MassiveQueryFun from using over
40% of the SQL Server's CPU time when running queries. The group in question is restricted much like
a restrictor plate on an engine in NASCAR racing. The restriction or policy element is the classifier that
activates the thresholds.
The Resource Governor is accessed via the Management node of SQL Server Management Studio, and
setup can also occur via T-SQL statements. The elements of defining a resource with limits are the resource
pool, the workload group settings, and then the creation of classification functions for the resource pool.
Once the resource classification is registered with the Resource Governor, enable it and monitor the
performance. Here the DMVs allow quick queries for information about resource use. The example below
establishes three workload groups under the default system resource pool and assigns the SQL Server
login power to the group, while establishing a classifier and then enabling it.
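A minimal Resource Governor sketch along these lines; the pool name, function name, and login name are assumptions (run in the master database):

```sql
-- Cap members of a workload group at 40% CPU
CREATE RESOURCE POOL FunPool WITH (MAX_CPU_PERCENT = 40);
CREATE WORKLOAD GROUP MassiveQueryFun USING FunPool;
GO
-- Classifier: route the hypothetical login 'power' into the restricted group
CREATE FUNCTION dbo.fnClassifier() RETURNS sysname
WITH SCHEMABINDING
AS
BEGIN
    IF SUSER_SNAME() = N'power'
        RETURN N'MassiveQueryFun';
    RETURN N'default';
END;
GO
-- Register the classifier and enable the Resource Governor
ALTER RESOURCE GOVERNOR WITH (CLASSIFIER_FUNCTION = dbo.fnClassifier);
ALTER RESOURCE GOVERNOR RECONFIGURE;
```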
Conducts analysis of the possible changes, in effect a query test for index use, spanning the
query across tables, and actual query performance.
1. Create a workload for the database tuning activity, either as a T-SQL script, by collecting a trace file,
or by collecting trace data to a table. This should represent the work conducted by the database
you want to tune.
2. Determine what you want to tune: indexes, indexed views, or partitioning best practices.
3. The Database Engine Tuning Advisor output is in the form of logs, summaries, and recommendations
rolled up into reports.
4. If necessary, you can use the results to perform hypothetical modeling by changing parameters
and analyzing them.
5. Once you agree with the tuning recommendations, you can apply them to the database in question
or schedule them for a later time.
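The same analysis can be scheduled from the command line with the dta.exe utility; server, database, and file names below are assumptions:

```shell
:: Run a tuning session against a captured trace workload and
:: write the recommendations to a script file
dta -S MyServer -E -D Sales -if workload.trc -s NightlySession -of recommendations.sql
```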
Figure 46: SQL Server Profiler Trace File Using Database Tuning Template
Figure 48: SQL Server Profiler XML Extract Screen for Deadlock Events
DMV categories:
Security
Transactions
Operating System
Extended Events
Service Broker
Resource Governor
Replication
Query Notifications
Objects
I/O
Indexes
Full-Text Search
Execution
Database Mirroring
CLR
Change Data/Audit
When using DMVs, make sure you understand that there are thousands of possible T-SQL
combinations of DMV syntax (this will be covered on the exam). However, always prefix the
name of the view or function with the sys schema, then add the syntax for the function you query.
Security considerations require a minimum of SELECT permission on the object and VIEW SERVER STATE or
VIEW DATABASE STATE permission.
A sample DMV query with results is shown below.
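A representative sketch of such a query; the view and columns are standard system objects:

```sql
-- Active user sessions with their accumulated CPU time
SELECT session_id, login_name, cpu_time
FROM sys.dm_exec_sessions
WHERE is_user_process = 1;
```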
Minimum    Maximum
2 GB       3 GB
90         97
12 GB      16 GB
Database Mirroring
SQL 2008 database mirroring is a database availability solution that literally mirrors all data from one
server database to another, provided each database is on a different SQL Server instance. One server
holds the primary copy, the one in use, and the other is termed the standby server. The SQL Server 2008 database
mirroring software and administration functions provide for the synchronization of data between the
two databases in such a way that no transaction is ever dropped during the failover to the secondary
standby database. The database mirror can also use a standby database copy without up-to-date
synchronization, which can result in some data loss, but still provides a server ready to pick up
database operations in the event of a failure of the primary.
1. Two SQL instances are identified, and then a database is selected for mirroring (in this case
the db Parts).
2. The mirroring wizard is run via SQL Server Management Studio, or you can use ALTER DATABASE to
set up mirroring via T-SQL.
3. A mode is selected for the mirroring operation. In most cases, for complete recoverability, use the
High Safety mode with automatic failover, which requires a witness server.
4. The witness function works much like the witness server in Windows 2008 Server clustering and
the quorum drive concept:
If the witness server loses connection to the primary mirror server (Headers01) but is still
connected to Headers02, a failover will occur (i.e., the second database server in the
mirroring configuration will serve as the primary database instance).
When the server Headers01 comes back online, it will receive a message from the
witness server that Headers02 is now the primary server in the mirror setup, and this
will cause Headers01 to become the mirror database and to synchronize transactions.
This is a very elegant solution, and one that in production situations induces only 2-5 seconds of connectivity
interruption for applications such as .NET web applications.
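The T-SQL route mentioned in the steps above can be sketched as follows; the endpoint addresses and listener port are assumptions:

```sql
-- On the mirror instance: point the mirror at the principal
ALTER DATABASE Parts SET PARTNER = 'TCP://Headers01.corp.local:5022';
-- On the principal instance: point the principal at the mirror,
-- then add the witness for automatic failover
ALTER DATABASE Parts SET PARTNER = 'TCP://Headers02.corp.local:5022';
ALTER DATABASE Parts SET WITNESS = 'TCP://Witness01.corp.local:5022';
```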
Exam Prep: SQL Server Database Mirroring
You manage multiple SQL Server 2008 servers for your company. You have set up database mirroring with
automatic failover. An unexplained error occurs in the primary database of the mirrored pair.
The current database state is Synchronizing, and the witness is not working. You need to act quickly to
fail the mirror over, and data loss is acceptable. What is the solution?
Exam Solution: SQL Server Database Mirroring
This exam-style question should immediately prompt you that the method of failover is now the
forced method. The forced method forces failover but does not guarantee that data loss will
not occur. This method of forcing a service change is normally used in a disaster recovery situation,
where the primary member of the mirror may never come back online. One other clue in the question
pointing to the direness of the situation is that the current database state is Synchronizing, which means
the databases are no longer synchronized.
Manual failovers require both the primary and mirror server to be online and the database to
be synchronized. Replacing the mirror server is not a good solution at any point in SQL Server database
mirroring operations.
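The forced failover described above is issued from the mirror server; the database name follows the earlier example:

```sql
-- Force service to the mirror, accepting possible data loss
ALTER DATABASE Parts SET PARTNER FORCE_SERVICE_ALLOW_DATA_LOSS;
```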
There are many security considerations with SQL Server database mirroring. One upper-level concern is that if
users are going to log in to SQL Server databases with SQL credentials, you must ensure you create
all logins and database mappings on the mirror target SQL Server, because you cannot mirror the master
database, which houses all logins and mappings, from the primary mirror server. Database object users and
assignments will exist on the mirror due to mirroring.
Figure 51 shows a simple Windows 2008 R2 SQL cluster with an active/passive configuration. The Windows
cluster is composed of two nodes: Cluster01 and Cluster02. Once Windows 2008 R2 clustering is
configured, a third virtual or cluster node is created, for example, ClusterNode. It is, however, composed of
the two physical servers listed above. This cluster can now provide failover for standard Windows services:
DHCP, file sharing, disk drives, and other cluster-available resources.
Once a cluster is established and functional, a SQL Server installation can detect
the Windows cluster and run the setup path that establishes a SQL cluster instance.
This creates the virtual SQL cluster SQL01 with its own IP address. The SQL cluster instance will
have several prerequisite resources to launch, the primary one being disk drives that are cluster
accessible and meet the definition of a shared cluster resource. The virtual SQL cluster instance named
SQL01 is what clients and applications would target in their connection string definitions and for
other database operations.
The failover process can identify which cluster nodes are available as possible owners of the cluster
SQL instance. If Cluster02 was a failover host for Cluster01, and Cluster01 suffered a bluescreen/memory
dump issue, then a failover of all SQL instances and services would occur to Cluster02.
This is amazing technology and represents a top-level solution for protecting SQL Server availability.
The failover process also provides quite a bit of management support; you can update and patch
servers in a cluster, as required, without downtime as you failover resources to allow for the updates
to occur. Server reboots and other types of maintenance such as a RAM addition can also occur during
the production cycle of a day due to failover capability.
Apply any uncommitted backups from the backup file share established for log shipping.
Recover the secondary database to place it in a consistent state, and bring it online.
Advise clients of the database instance name where the previously primary database
is now running.
If desired, set up other secondary log shipping targets for the new primary database.
SQL Replication
SQL Replication provides SQL Server the capability of pushing data to multiple endpoints in a
controlled and synchronized fashion. SQL Replication can replicate data between SQL Server and a
client, such as a remote inventory system or mobile notebook users. In the high availability space, SQL
Replication can replicate data from SQL instance to SQL instance. This can also include supporting data
warehousing or reporting.
SQL Server replication uses a push or pull model to replicate data, predicated on the standard
taxonomy of publisher, distributor, subscriber, publications, articles, and subscriptions. The publisher is a source
database pushing data out to other instances. The distributor is a database store that houses transaction
data about the replication process, including replication status and metadata, and in some cases it queues
data for subscribers. The publisher and distributor can be the same database. Subscribers are databases
that receive the publisher's data. A subscriber can be bidirectional in that it passes data back and forth
with publishers. An article is a database object in a publication; this can be a view, table, or index.
Data published as articles supports SQL Replication filtering, which is important for determining the
composition of the data that is moved. A publication is a grouping of articles, which can come from more than
one database. Lastly, the subscription is a request for data from a subscriber. The subscription determines
what will be received, when, and where. Subscriptions are push or pull type replication.
Merge Publication - A snapshot of the data occurs and then changes are tracked
with triggers. When subscribers connect for the changes, all changes since the previous
replication are published.
Transactional Publication/Transactional Publication with Updatable Subscriptions -
A snapshot of the data occurs at the publication database and then any changes are delivered to
the subscribers as they occur. With updatable subscriptions, changes made at the subscriber are
applied back to the publisher.
Snapshot Publication - Snapshots are point-in-time replications and no updates are tracked.
Practice Questions
Chapter 1
1.
You work as the DBA for a medium-sized organization. One of the employees in the Accounting
department has asked for your assistance. She needs a solution that will allow her to transfer
data from a MySQL database into a SQL Server database. Additionally, the data must be modified
in several ways during the process. Which SQL Server add-on component will meet her needs?
Select the best answer.
A. SSAS
B. SSRS
C. SSIS
D. Database Mirroring
2.
You are configuring database mail and have been asked to ensure that fault tolerance is
provided in order to accommodate for random mail server outages. Which of the following
actions can provide this fault tolerance? Select the best answer.
A. Set the logging level to verbose
B. Configure multiple mail profiles
C. Configure the account retry delay to 90 seconds
D. Set the database executable minimum lifetime to 900 seconds
3.
You are using a mail server to send mail from SQL Server 2008's database mail feature. The mail
server has intermittent problems that may cause it to be unavailable for 30 to 90 seconds.
You cannot use any other mail servers. What database mail configuration parameters may
help ensure that email messages are delivered? Choose TWO.
A. Account retry delay
B. Create additional profiles
C. Logging level
D. Account retry attempts
Chapter 2
1.
When you configure an alert in the SQL Server Agent, only two response options are provided.
What are these TWO options? Choose TWO.
A. Execute job
B. Notify operator
C. Shutdown server
D. Check the integrity of a database
2.
You are configuring an alert. Based on the screen in Exhibit 1, what is the value of buffer cache
hit ratio that would fire the alert? Select the best answer.
A. Less than 7.8 percent
B. Less than 78 percent
C. Greater than 78 percent
D. Greater than 7.8 percent
Exhibit(s):
3.
You manage a single SQL Server 2008 instance with a single database called Engineering that
is in the Simple recovery mode. You have been creating Full backups every night for
several months. Now, the database has grown so large that the full backups take more than
two hours. You want to perform Full backups every weekend and a different type of
backup nightly. You must use the built-in backup feature of SQL Server 2008. What kind of
backup can you perform that cooperates with the weekend Full backups, but requires less time?
Select the best answer.
A. Incremental
B. Modified
C. Transaction Logs
D. Differential
Chapter 3
1.
You are the DBA for a large SQL Server 2008 database application. When you arrived at work this
morning, you had several messages waiting from users. The users were complaining that they
cannot access the data. They are receiving access denied errors. All users could access the data
the previous day. At the end of the day yesterday, you performed the following tasks:
- Manual backup of the database
- Manual snapshot of the database
- Modified the ownership of the call_routing table
- Added seven new users
Which action is the likely cause of the problem? Select the best answer.
A. Manual backup of the database
B. Manual snapshot of the database
C. Added seven new users
D. Modified the ownership of the call_routing table
2.
You are running SQL Server 2008 Enterprise Edition. You want to enable a server-level audit
procedure using SQL Server Audit. What THREE steps are required to implement a server-level
audit procedure? Choose THREE.
A. Create a server audit specification
B. Enable the audit
C. Create an audit object
D. Create the server-level permission for the audit
3.
You are installing a test server that will run SQL Server 2008 Enterprise Edition. The installation
OS will be Windows Server 2008. You want to ensure that remote connections are not allowed.
How will you configure the installation so that only local connections are allowed but outgoing
network connections can be made? Select the best answer.
A. Unplug the network cable
B. Disable the TCP/IP provider
C. Block TCP port 21 in the firewall
D. Use the sp_configure Block Net-Conn, 1 command
Chapter 4
1.
You are creating a backup plan. Part of the plan includes the backup of the transaction logs for
your databases. All databases run on SQL Server 2008 Standard Edition and the OS used is
Windows Server 2008. You plan to use the following code to back up the transaction logs:
BACKUP LOG database_name TO DISK = 'logback.bak';
In addition to backing up the log, what would this code do? Select the best answer.
A. Truncate the log
B. Backup the database
C. Restore the database
D. Truncate the database
2.
You are the DBA for a small company in Florida. You must restore a database named Sales to 3:17
yesterday afternoon. The current date is Wednesday, June 3, 2009. A full backup is made every
night to a file named day-fback.bak. The transaction log is backed up each day at 12 PM and is
backed up to a file named day-log.bak. Each week, the files are overwritten. Which THREE of the
following four commands would accurately restore the database to the desired point-in-time
if run in the proper order? Choose THREE answers.
A. RESTORE LOG Sales FROM DISK = 'TUESDAY-log.bak' WITH NORECOVERY, STOPAT = 'Jun 2,
2009 3:17 PM';
B. RESTORE DATABASE Sales WITH RECOVERY;
C. RESTORE DATABASE Sales FROM DISK = 'TUESDAY-fback.bak' WITH NORECOVERY;
D. RESTORE LOG Sales FROM DISK = 'TUESDAY-log.bak' WITH NORECOVERY, UNTIL = 'Jun 2,
2009 3:17 PM';
3.
You are creating a Maintenance Plan using the wizard. You want to make sure the database is not
corrupt and perform a complete backup of the database. In Exhibit 1, what options should
you check? Choose TWO.
A. Rebuild Index
B. Back Up Database (Full)
C. Shrink Database
D. Check Database Integrity
Exhibit(s):
Chapter 5
1.
You are the DBA for a carefully planned database. In spite of the intensive planning, the data
files are still growing very large. You've been asked to look for a solution that reduces the storage
space consumed by the database. The performance of the database is not an issue, as the users
have been very happy with the performance; however, space is quickly running out on the
server and the server administrator has indicated that more drive space cannot be purchased
until the next budget cycle. What feature can be used to conserve storage space?
Select the best answer.
A. Data compression
B. Transparent data encryption
C. Table partitioning
D. Filestreams
2.
3.
You are attempting to create an index with the WITH ONLINE = ON clause. However, it is
not working. You are running the SQL Server instance on Windows Server 2008 Standard Edition
and the SQL Server instance is running the Standard Edition of SQL Server 2008. Service pack
1 has not been applied to the Windows Server. Several users are connected to the database.
Why is the index command not working? Select the best answer.
A. Because Windows Server 2008 Enterprise Edition is required
B. Because users are connected
C. Because service pack 1 has not been applied
D. Because SQL Server 2008 Enterprise Edition is required
4.
Chapter 6
1.
You must configure a SQL Server Agent job to use a proxy account for access to the Windows
file system. The job runs on a SQL Server 2008 Standard Edition server. You have taken the
following steps:
1. Right-click on the Credentials folder and select New Credential.
2. In the New Credential window, enter a Credential name of OSCommand, select a user with
appropriate permissions as the identity, and provide a password.
3. Click OK to create the Credential.
When you attempt to create the job step and use the proxy account, you see the Run As option
shown in Exhibit 1. What steps do you need to take to use the proxy account based on the
OSCommand credential? Select the best answer.
A. You must use the CREATE JOB STEP command since proxies are not available in
the GUI interface.
B. Proxies are only supported on Enterprise Edition.
C. Create a new Operating System (CmdExec) proxy in the Proxies container within the SQL
Server Agent node.
D. The proxy account is configured as the owner of the job and not using the Run As
parameter of a step.
Exhibit(s):
2.
You are creating a proxy object for use with a SQL Server Agent job. You want a single object to
provide all needed access for the job. The job runs several steps including the following actions:
- PowerShell
- Batch files
- VBScript scripts
- Transact-SQL
What SQL Server Agent subsystems should be enabled for the proxy object? Choose all that apply.
A. SQL
B. Operating System (CmdExec)
C. ActiveX Script
D. PowerShell
3.
Which of the following tools can be used to see the most recent run of a SQL Server Agent job?
Choose all that apply.
A. Configuration Manager
B. Job Activity Monitor
C. Job Manager
D. Log File Viewer
Chapter 7
1.
Which of the following are valid recommendations that DTA can make for performance
improvements after a workload analysis? Choose all that apply.
A. Adding indexes
B. Reserved memory recommendations
C. Deleting indexes
D. Partitioning tables
2.
You want to schedule a Database Engine Tuning Advisor analysis to run during the night.
You plan to implement a SQL Server Agent job. On what command line tool must the job call to
perform the analysis? Select the best answer.
A. DETA.EXE
B. DTA.VBS
C. DETA.VBS
D. DTA.EXE
3.
Which one of the following is NOT an item that you define within a SQL Server Profiler trace?
Select the best answer.
A. Events
B. Stored Procedure
C. Data columns
D. Filters
4.
You want to evaluate fragmentation statistics for each index in a database. Which DMV will
you query? Select the best answer.
A. sys.dm_db_index_operational_stats
B. sys.dm_db_index_fragments
C. sys.dm_db_index_physical_stats
D. sys.dm_db_index_physical_fragments
Chapter 8
1.
You are planning a database mirroring implementation. Which roles, related to mirroring, must
be configured on separate SQL Server instances? Choose all that apply.
A. Primary
B. Witness
C. Mirror
D. Reflection
2.
You are implementing replication. High availability is important. You've chosen to implement
replication with the following options:
- Immediate Updating Subscribers
- Publisher
- Distributor
- Subscriber
Which one should be changed or removed to allow for high availability? Select the best answer.
A. Publisher
B. Distributor
C. Immediate Updating Subscribers
D. Subscriber
2. Answer: B
Explanation A. Incorrect. Configuring the logging level to verbose will provide more information about a
failure, but it does not provide for fault tolerance in any way.
Explanation B. Correct. Database mail allows for the configuration of multiple mail profiles so that a
secondary or tertiary profile may be used if the first mail profile is not functioning.
Explanation C. Incorrect. Setting the account retry delay to 90 seconds will potentially give the mail server
more time to become available again, but it does not create a fault tolerant scenario.
Explanation D. Incorrect. This setting determines how long the database mail send utility will remain
in memory. Keeping it in memory longer prevents reloading it again and again, but it does not improve
fault tolerance - even though it may improve performance.
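The failover behavior described in Explanation B can be sketched with the Database Mail stored procedures in msdb. This is a minimal sketch; the account, profile, and SMTP server names are illustrative.

```sql
-- Two accounts attached to one profile; sequence number 1 is tried first,
-- and Database Mail fails over to sequence number 2 if delivery fails.
EXEC msdb.dbo.sysmail_add_account_sp
    @account_name    = 'PrimaryAccount',        -- hypothetical name
    @email_address   = 'sql@example.com',
    @mailserver_name = 'smtp1.example.com';

EXEC msdb.dbo.sysmail_add_account_sp
    @account_name    = 'BackupAccount',         -- hypothetical name
    @email_address   = 'sql@example.com',
    @mailserver_name = 'smtp2.example.com';

EXEC msdb.dbo.sysmail_add_profile_sp
    @profile_name = 'FaultTolerantProfile';

EXEC msdb.dbo.sysmail_add_profileaccount_sp
    @profile_name    = 'FaultTolerantProfile',
    @account_name    = 'PrimaryAccount',
    @sequence_number = 1;

EXEC msdb.dbo.sysmail_add_profileaccount_sp
    @profile_name    = 'FaultTolerantProfile',
    @account_name    = 'BackupAccount',
    @sequence_number = 2;
```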
3. Answers: A, D
Explanation A. Correct. The default for this setting is 60 seconds. In this scenario, setting the value to 90
seconds would increase the likelihood that the email could be sent on the second attempt.
Explanation B. Incorrect. Creating additional profiles will not help since you can only use the
one mail server.
Explanation C. Incorrect. Configuring the logging level only determines the detail of the logs.
Explanation D. Correct. By default the account will be retried only once. By increasing this value, you
ensure that the email will get out when the mail server becomes available.
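The retry settings discussed in Explanations A and D map to the AccountRetryDelay and AccountRetryAttempts parameters of sysmail_configure_sp. A minimal sketch, with illustrative values:

```sql
-- Raise the retry delay to 90 seconds and allow several retries,
-- so a briefly unavailable mail server gets additional delivery attempts.
EXEC msdb.dbo.sysmail_configure_sp
    @parameter_name  = 'AccountRetryDelay',
    @parameter_value = '90';

EXEC msdb.dbo.sysmail_configure_sp
    @parameter_name  = 'AccountRetryAttempts',
    @parameter_value = '3';
```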
Chapter 2
1. Answers: A, B
Explanation A. Correct. You can execute any pre-configured job when an alert fires.
Explanation B. Correct. You may notify an operator when an alert fires.
Explanation C. Incorrect. You may be able to shutdown the server through a job, but the alert can only call
on the job.
Explanation D. Incorrect. You can check database integrity through a job, but the alert can only call
on the job.
2. Answer: B
Explanation A. Incorrect. The correct response is less than 78 percent.
Explanation B. Correct. Anything less than 78 percent would fire this alert.
Explanation C. Incorrect. The correct response is less than 78 percent.
Explanation D. Incorrect. The correct response is less than 78 percent.
3. Answer: D
Explanation A. Incorrect. Incremental backups are not supported by the internal SQL Server backup tools.
Explanation B. Incorrect. No such backup feature exists in SQL Server.
Explanation C. Incorrect. Backing up the transaction logs for a database in the simple recovery mode
would not provide full recoverability.
Explanation D. Correct. Differential backups are used to back up only changed data pages in the database.
Chapter 3
1. Answer: D
Explanation A. Incorrect. Backing up the database should not impact permissions.
Explanation B. Incorrect. Creating a snapshot should not impact permissions.
Explanation C. Incorrect. Adding users should not cause existing users to lose access.
Explanation D. Correct. The change in ownership has likely resulted in a break in the ownership chain.
2. Answers: A, B, C
Explanation A. Correct. The server audit specification will specify the item to audit and map it to
the audit object.
Explanation B. Correct. The final step is to enable the audit.
Explanation C. Correct. The audit object will define the target of the audit.
Explanation D. Incorrect. Permissions are not required since the audit procedure runs within the SQL
Server database engine service.
3. Answer: B
Explanation A. Incorrect. This action would prevent outgoing network connections as well.
Explanation B. Correct. The TCP/IP provider enables remote connections. You can disable them by turning
off the provider in the SQL Server Configuration Manager.
Explanation C. Incorrect. While this might prevent FTP traffic, it will do nothing to prevent SQL Server
access on port 1433.
Explanation D. Incorrect. No such command exists.
Chapter 4
1. Answer: A
Explanation A. Correct. By default, unless the NO_TRUNCATE option is used, the log file is truncated after the backup completes. This truncation means that the backed-up portion of the log is emptied for reuse.
Explanation B. Incorrect. The BACKUP LOG command never backs up the database.
Explanation C. Incorrect. The RESTORE DATABASE command is used to restore a database.
Explanation D. Incorrect. While the command will truncate the log file, it will not do so to the database.
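A minimal sketch of the behavior described above; the database name and backup path are illustrative.

```sql
-- Back up the transaction log; the inactive portion of the log is
-- truncated (marked for reuse) once the backup completes.
BACKUP LOG AdventureWorks                        -- example database name
TO DISK = 'D:\Backups\AdventureWorks_log.bak';
```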
2. Answers: A, B, C
Explanation A. Correct. This is the second step in the process.
Explanation B. Correct. This is the third step in the process.
Explanation C. Correct. This is the first step in the process.
Explanation D. Incorrect. The proper WITH parameter is STOPAT and not UNTIL.
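The restore-to-a-point-in-time sequence using STOPAT can be sketched as follows; the database name, file paths, and timestamp are illustrative.

```sql
-- Restore the full backup without recovery, then roll the log
-- forward to a specific point in time with STOPAT.
RESTORE DATABASE AdventureWorks
FROM DISK = 'D:\Backups\AdventureWorks_full.bak'
WITH NORECOVERY;

RESTORE LOG AdventureWorks
FROM DISK = 'D:\Backups\AdventureWorks_log.bak'
WITH STOPAT = '2009-06-01 14:30:00', RECOVERY;
```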
3. Answers: B, D
Explanation A. Incorrect. Rebuild Index will drop the indexes and regenerate them.
Explanation B. Correct. Back Up Database (Full) will perform a complete backup of the database.
Explanation C. Incorrect. Shrink Database will shrink the physical size of the database file based on free
space within the file.
Explanation D. Correct. Check Database Integrity will perform a DBCC CHECKDB operation
against selected databases.
Chapter 5
1. Answer: A
Explanation A. Correct. As long as the server is running Enterprise Edition, data compression can be used.
Data compression is not available in Standard Edition.
Explanation B. Incorrect. TDE does not conserve storage space.
Explanation C. Incorrect. Partitioning does not conserve storage space.
Explanation D. Incorrect. Filestreams do not conserve storage space.
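A minimal sketch of enabling data compression on a table, assuming Enterprise Edition; the table name is hypothetical.

```sql
-- Rebuild a table with page compression to conserve storage space
-- (data compression requires SQL Server 2008 Enterprise Edition).
ALTER TABLE dbo.SalesHistory                     -- hypothetical table
REBUILD WITH (DATA_COMPRESSION = PAGE);
```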
2. Answers: A, B, C
Explanation A. Correct. The intermediate pages contain branches leading to other intermediate pages
or leaf pages.
Explanation B. Correct. The leaf pages contain the actual data stored in the index.
Explanation C. Correct. Each index contains one and only one root page.
Explanation D. Incorrect. While a B-tree structure is used, the pages are not called branch pages.
3. Answer: D
Explanation A. Incorrect. SQL Server 2008 Enterprise Edition is required, but Windows Server 2008
Standard Edition can run the Enterprise Edition of SQL Server.
Explanation B. Incorrect. If SQL Server 2008 Enterprise Edition were in use, the users could be connected
and the command could run.
Explanation C. Incorrect. SP1 is not needed for online operations; however, the Enterprise Edition of SQL
Server 2008 is required for online operations.
Explanation D. Correct. Online index operations require SQL Server 2008 Enterprise Edition.
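An online index operation of the kind described above looks like the following sketch; the index and table names are hypothetical.

```sql
-- Rebuild an index online so users can keep querying the table;
-- ONLINE = ON requires SQL Server 2008 Enterprise Edition.
ALTER INDEX IX_SalesHistory_OrderDate            -- hypothetical index
ON dbo.SalesHistory
REBUILD WITH (ONLINE = ON);
```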
4. Answers: A, B, C
Explanation A. Correct. The CI stands for case insensitivity.
Explanation B. Correct. The AI stands for accent insensitivity.
Explanation C. Correct. The word French clearly denotes the French language.
Explanation D. Incorrect. AI does not stand for Albanian icons, it stands for accent insensitivity.
Chapter 6
1. Answer: C
Explanation A. Incorrect. A proxy object must be created that is based on the credentials configured in
these three steps.
Explanation B. Incorrect. Proxies are supported on both Enterprise and Standard editions.
Explanation C. Correct. Proxies are based on credentials. Creating the credential is only the first step.
Next, you need to expand the SQL Server Agent node and then the Proxies node. Now, right-click the
Operating System (CmdExec) node and select New Proxy. Enter the proxy name and the credential name
of OSCommand, and check Operating System (CmdExec) in the Subsystem section. Click OK to
create the proxy.
Explanation D. Incorrect. The proxy account is configured using the Run As parameter; however,
a credential is the foundation for a proxy object. The proxy object must be created in the Proxies node of
the SQL Server Agent node.
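The credential-then-proxy sequence described above can also be scripted; the Windows account, password, and proxy name below are illustrative.

```sql
-- 1. Create the credential that stores the Windows account.
CREATE CREDENTIAL OSCommand
WITH IDENTITY = 'DOMAIN\AgentCmdUser',           -- hypothetical account
     SECRET   = 'P@ssw0rd!';                     -- hypothetical password

-- 2. Create a proxy based on that credential.
EXEC msdb.dbo.sp_add_proxy
    @proxy_name      = 'OSCommandProxy',
    @credential_name = 'OSCommand';

-- 3. Grant the proxy access to the CmdExec subsystem (subsystem id 3).
EXEC msdb.dbo.sp_grant_proxy_to_subsystem
    @proxy_name   = 'OSCommandProxy',
    @subsystem_id = 3;
```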
2. Answers: B, C, D
Explanation A. Incorrect. Transact-SQL commands are handled internally without the need for a proxy.
Explanation B. Correct. The batch files will require CmdExec to operate.
Explanation C. Correct. The VBScript code will require the ActiveX Script subsystem to function.
Explanation D. Correct. The PowerShell subsystem is used to execute PowerShell commands.
3. Answers: B, D
Explanation A. Incorrect. The SSCM is used to configure SQL Server services and it does not show
information about job histories.
Explanation B. Correct. The Job Activity Monitor can be used to see the last run and whether it was
successful or not.
Explanation C. Incorrect. No such utility exists in SQL Server.
Explanation D. Correct. The Log File Viewer is displayed when you right-click a job and select View History.
Chapter 7
1. Answers: A, C, D
Explanation A. Correct. One of the simplest ways to improve database performance is to add
appropriate indexes. DTA can recommend additional indexes.
Explanation B. Incorrect. DTA makes no recommendations related to instance configuration.
Explanation C. Correct. You can often improve performance by deleting indexes that are not
being utilized. DTA can recommend dropping indexes.
Explanation D. Correct. Table partitioning can be recommended by DTA, though this option is not enabled by default.
2. Answer: D
Explanation A. Incorrect. The acronym for the Database Engine Tuning Advisor is DTA and not DETA.
Therefore the command is DTA.EXE.
Explanation B. Incorrect. The command is an executable (DTA.EXE) and not a script.
Explanation C. Incorrect. The command is an executable (DTA.EXE) and not a script.
Explanation D. Correct. With DTA.EXE you can configure an analysis process within a job.
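Scheduling the analysis as described can be done with a CmdExec job step that calls the dta utility; the job, server, database, session, and file names below are illustrative, and the job itself is assumed to already exist.

```sql
-- A CmdExec job step that runs dta against a trace-file workload:
-- -S server, -E trusted connection, -D database to tune,
-- -if input workload file, -s session name, -of output script file.
EXEC msdb.dbo.sp_add_jobstep
    @job_name  = N'Nightly DTA Analysis',        -- hypothetical job
    @step_name = N'Run DTA',
    @subsystem = N'CmdExec',
    @command   = N'dta -S MyServer -E -D AdventureWorks -if D:\Traces\workload.trc -s NightlySession -of D:\Traces\recommendations.sql';
```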
3. Answer: B
Explanation A. Incorrect. You select events when building the trace.
Explanation B. Correct. You can capture information about stored procedures, but they are not part of the
trace configuration.
Explanation C. Incorrect. You can select specific data columns related to the events when
building the trace.
Explanation D. Incorrect. You can create filters when building the trace.
4. Answer: C
Explanation A. Incorrect. This DMV reports locking, latching and access stats for the indexes
and not fragmentation.
Explanation B. Incorrect. No such DMV exists.
Explanation C. Correct. This DMV will report fragmentation statistics.
Explanation D. Incorrect. No such DMV exists.
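A query against this DMV for every index in the current database might look like the following sketch.

```sql
-- Average fragmentation for every index in the current database;
-- 'LIMITED' scans only the parent level of the B-tree, the fastest mode.
SELECT OBJECT_NAME(ips.object_id) AS table_name,
       ips.index_id,
       ips.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
ORDER BY ips.avg_fragmentation_in_percent DESC;
```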
Chapter 8
1. Answers: A, B, C
Explanation A. Correct. The Primary role must exist in a separate instance from the Mirror or Witness roles.
Explanation B. Correct. The Witness role must exist in a separate instance from the Primary
and Mirror roles.
Explanation C. Correct. The Mirror role must exist in a separate instance from the Primary
and Witness roles.
Explanation D. Incorrect. No such role exists in a mirror relationship.
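The role separation above shows up directly in the mirroring setup commands. A minimal sketch, assuming the mirroring endpoints already exist; the host names and port are illustrative.

```sql
-- Run on the mirror instance first, pointing at the principal:
ALTER DATABASE AdventureWorks
SET PARTNER = 'TCP://principal.example.com:5022';

-- Then on the principal instance, pointing at the mirror:
ALTER DATABASE AdventureWorks
SET PARTNER = 'TCP://mirror.example.com:5022';

-- Optionally, still on the principal, add a witness for automatic failover:
ALTER DATABASE AdventureWorks
SET WITNESS = 'TCP://witness.example.com:5022';
```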
2. Answer: C
Explanation A. Incorrect. The publisher is required. Without it, you will have no replication at all.
Explanation B. Incorrect. The distributor is required. It may be the same server as the publisher,
but it is required.
Explanation C. Correct. If high availability is important, you should use queued updating subscribers,
which can process transactions even if the publisher is unavailable.
Explanation D. Incorrect. If you have no subscribers, you have no need for replication.