Showing posts with label DBA - Sql Server. Show all posts

Friday, October 21, 2011

Finding a particular text in a stored procedure.

SELECT OBJECT_NAME(id)
    FROM syscomments
    WHERE [text] LIKE '%Decode%'
    AND OBJECTPROPERTY(id, 'IsProcedure') = 1
    GROUP BY OBJECT_NAME(id)
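Note that syscomments splits long object definitions into 4000-character rows, so a search string that spans a row boundary can be missed. On SQL Server 2005 and later, the same search can be written against sys.sql_modules instead; a possible sketch (the search string 'Decode' is carried over from the query above):

```sql
-- sys.sql_modules holds the full definition in a single row per object,
-- so matches cannot be split across 4000-character boundaries.
SELECT OBJECT_NAME(object_id) AS ProcedureName
FROM sys.sql_modules
WHERE definition LIKE '%Decode%'
  AND OBJECTPROPERTY(object_id, 'IsProcedure') = 1;
```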

Thursday, September 15, 2011

Error: The backup set holds a backup of a database other than the existing database

USE [master]

RESTORE DATABASE Ambio360SProd
FROM DISK = 'C:\Ambio360S-15092011-0222am.bak'
WITH REPLACE

Wednesday, June 29, 2011

SQL SERVER – 2008 – Find If Index is Being Used in Database

I am often asked how to find out whether the indexes in a database are actually being used. If a database has many indexes and not all of them are used, performance can suffer: a higher number of indexes slows down INSERT / UPDATE / DELETE operations, while well-chosen indexes speed up SELECT operations. It is recommended to drop any unused indexes from a table to improve performance.

Before dropping an index it is important to check whether it is being used. I have written a quick script which can find out whether an index is used or not. SQL Server 2005 and later editions have Dynamic Management Views (DMVs) which can be queried to retrieve the necessary information.

We will run a SELECT on the Employee table of the AdventureWorks database and check whether it uses indexes. All the information about index usage is stored in the DMV sys.dm_db_index_usage_stats. Let us run the following query first and save the results.

USE AdventureWorks
GO
SELECT DISTINCT OBJECT_NAME(sis.OBJECT_ID) TableName, si.name AS IndexName, sc.Name AS ColumnName,
sic.Index_ID, sis.user_seeks, sis.user_scans, sis.user_lookups, sis.user_updates
FROM sys.dm_db_index_usage_stats sis
INNER JOIN sys.indexes si ON sis.OBJECT_ID = si.OBJECT_ID AND sis.Index_ID = si.Index_ID
INNER JOIN sys.index_columns sic ON sis.OBJECT_ID = sic.OBJECT_ID AND sic.Index_ID = si.Index_ID
INNER JOIN sys.columns sc ON sis.OBJECT_ID = sc.OBJECT_ID AND sic.Column_ID = sc.Column_ID
WHERE sis.Database_ID = DB_ID('AdventureWorks') AND sis.OBJECT_ID = OBJECT_ID('HumanResources.Employee');
GO
This query retrieves the hits on the indexes of a particular table. Whenever a query touches the table through a particular index, sys.dm_db_index_usage_stats is updated automatically. The column that concerns us is user_seeks, which is incremented each time a particular index is used for a seek.

The whole process above shows that all index usage is recorded in sys.dm_db_index_usage_stats. This DMV accumulates usage since SQL Server was last restarted; once the SQL Server service restarts, sys.dm_db_index_usage_stats is reset to zero, and over time it builds the values in its columns back up.

If the SQL Server service has not been restarted for a reasonable amount of time and no usage is recorded for an index, the index is a candidate for dropping. Again, make sure you test your performance after dropping the index. If it gets worse, put that index back and continue the exercise.
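As a rough helper (my own sketch, not part of the original tip), the following lists nonclustered, non-constraint indexes in the current database with no recorded seeks, scans, or lookups since the last restart:

```sql
-- Candidate unused indexes: no seeks/scans/lookups recorded since restart.
-- LEFT JOIN keeps indexes that have never appeared in the usage DMV at all.
SELECT OBJECT_NAME(si.object_id) AS TableName,
       si.name                   AS IndexName,
       sis.user_updates          AS UpdatesMaintained
FROM sys.indexes si
LEFT JOIN sys.dm_db_index_usage_stats sis
       ON sis.object_id   = si.object_id
      AND sis.index_id    = si.index_id
      AND sis.database_id = DB_ID()
WHERE OBJECTPROPERTY(si.object_id, 'IsUserTable') = 1
  AND si.index_id > 1                 -- skip heaps and clustered indexes
  AND si.is_primary_key = 0
  AND si.is_unique_constraint = 0
  AND ISNULL(sis.user_seeks, 0) + ISNULL(sis.user_scans, 0)
      + ISNULL(sis.user_lookups, 0) = 0;
```

Treat the output as a review list, not a drop list: remember the DMV is reset on every service restart.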
 

Tuesday, June 21, 2011

Dropping multiple stored procedures in SQL SERVER

DROP PROCEDURE uspGetAddress, uspInsertAddress, uspDeleteAddress

Monday, May 9, 2011

How to restore a database when the database is in use??

Usually, when we restore a database on a server while the database is in use, SQL Server throws a pop-up error saying the database is in use. To fix that error, do the following:

ALTER DATABASE <DBNAME> 
SET RESTRICTED_USER WITH ROLLBACK IMMEDIATE
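A fuller sketch of the whole sequence (the database name and backup path here are placeholders, not from the original post). SINGLE_USER works as well as RESTRICTED_USER for this purpose, and the database should be reopened afterwards:

```sql
-- Kick out active connections, rolling back their open transactions.
ALTER DATABASE [MyDatabase]
SET SINGLE_USER WITH ROLLBACK IMMEDIATE;

-- Restore over the existing database.
RESTORE DATABASE [MyDatabase]
FROM DISK = 'C:\Backups\MyDatabase.bak'
WITH REPLACE;

-- Reopen the database for normal use.
ALTER DATABASE [MyDatabase]
SET MULTI_USER;
```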

Thursday, April 7, 2011

Clean unused space when a variable-length column is dropped from a SQL Server table or indexed view

Problem

Whenever rows or columns are deleted from a table, SQL Server reuses space to accommodate new data.  However, this is not the case when a variable length column is deleted.  The space used by this type of column is not reclaimed automatically.  Variable length columns can be of data types varchar, nvarchar, varchar(max), nvarchar(max), varbinary, varbinary(max), text, ntext, image, sql_variant, and xml.  In this tip, I have tried to put forth a solution to reclaim this space and this should work for SQL 2005 and later versions. 

--script 1
--Create Table

CREATE TABLE testfreespace
( column1 INT
,column2 CHAR(20)
,column3 VARCHAR(8000) )

--Populate with test data
DECLARE @count INT ;
SET @count = 0 ;
WHILE @count < 3000
BEGIN
SELECT
@count = @count + 1 ;
INSERT into testfreespace VALUES ( @count,'test row # ' + CAST(@count AS VARCHAR(10)),REPLICATE('TestData', 3000)) ;
END 



Script to test the free space

-- script 2
SELECT
alloc_unit_type_desc,
page_count,
avg_page_space_used_in_percent,
record_count
FROM sys.dm_db_index_physical_stats( DB_ID('Test'),OBJECT_ID(N'Testfreespace'),NULL,NULL,'Detailed') 

  
--script 3
ALTER TABLE Testfreespace DROP COLUMN Column3 

When we run script 3 and then re-run script 2, we can see that the free space held by the variable-length column has not been released.


In order to reclaim this space we have to use DBCC CLEANTABLE command.
Syntax: DBCC CLEANTABLE( Databasename, "Table_name or View_name")
In our case the command would be as follow.

-- script 4
DBCC CLEANTABLE (Test,"Testfreespace")
 
 
During the CLEANTABLE task the table is exclusively locked, and if the dropped column's data was large it can take a considerable amount of time. Also, the minimum privilege required to execute this command is table ownership; at the server level, only a DDL admin and/or sysadmin can do so.

FYI
  • The same outcome can be achieved by rebuilding the clustered index, but that may be more resource intensive.
  • Developers can be advised to run this command after any variable-length column is dropped from a table; this reclaims the existing disk space.
  • For huge tables it might use a lot of transaction log space, so you may want to divide the work into smaller batches.
  • DBCC CLEANTABLE is not supported on system tables or temporary tables.
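The batching suggestion above corresponds to DBCC CLEANTABLE's optional third argument, batch_size. For example:

```sql
-- Process 5,000 rows per transaction instead of the whole table at once,
-- limiting transaction log growth on large tables.
DBCC CLEANTABLE (Test, 'Testfreespace', 5000)
```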
Resource:
http://www.mssqltips.com/tip.asp?tip=2348

Monday, April 4, 2011

Database Design

Examples: Sales

Transformation of System : Office -> Excel -> SQL-Server.


Tables:

Product Table -> ProductNo(int,pk) / Description nvarchar(50) / UnitPrice money / Quantity int (Quantity will be moved to the ReceiptProduct table due to the many-to-many relationship).

Customer Table -> CustomerNo(int,pk) / Name nvarchar(50) / Address nvarchar(100) / Town nvarchar(50).

Receipt Table -> ReceiptNo(int, pk) / ProductNo(int)/CustomerNo(int) / Date datetime.

ReceiptProduct (Bridge Table) ->  ProductNo / ReceiptNo / Quantity.





Normalization is the key concept in database design.

Split the table into multiple tables to match the real-world entities.

The main theme of normalization is to use more than one table, with each table representing a single type of real-world thing. This means recognising that receipts, customers, and product items are fundamentally different things in a sales process, and each should be stored in its own table.

The idea of breaking up one large table into several to reduce duplication is called normalization.

Identification of Rows

There should be a key to identify each row; that is the purpose of the primary key or a unique key.

Dependencies on Keys

An important part of table design is to ensure that each column in a table states a single fact about the primary key. A product table contains fields like ProductNo(pk), Description, UnitPrice, and Quantity. Here Description and UnitPrice depend on the primary key ProductNo, but Quantity depends on both ProductNo and the ReceiptNo from the Receipt table. So why can't we have the ReceiptNo in the Product table? Because then the other columns (Description and UnitPrice) would depend on just part of the new key. It is essential that each column in a table depend on the key, the whole key, and nothing but the key.

Many-Many Relationship

The next task is linking the Receipts and Products tables. The link between these tables is a bit different because each product can appear on many different receipts, and each receipt can contain many different products. This is known as a many-to-many relationship.

One might think of adding foreign keys to each table to represent this kind of link, but each table would then need to hold a variable number of foreign keys in each row, which is not what we want.

To model this relationship correctly, we need to add another table to the design. It is called a bridge table, and it sits between the Products and Receipts tables.

A bridge table is necessary whenever two tables have a many-to-many logical relationship. Because the new table models the link between the Receipts and Products tables, its primary key is a combination of the primary keys of the related tables (called a compound key).



Enforce one-to-many relationships using foreign keys and constraints.

Model many-to-many relationships using bridge tables
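The design above can be sketched in DDL. This is my own interpretation of the table list earlier (constraint layout assumed; ProductNo moves out of Receipt into the bridge table along with Quantity):

```sql
CREATE TABLE Product (
    ProductNo   INT          NOT NULL PRIMARY KEY,
    Description NVARCHAR(50) NULL,
    UnitPrice   MONEY        NULL
);

CREATE TABLE Customer (
    CustomerNo INT           NOT NULL PRIMARY KEY,
    Name       NVARCHAR(50)  NULL,
    Address    NVARCHAR(100) NULL,
    Town       NVARCHAR(50)  NULL
);

-- One-to-many: each receipt belongs to exactly one customer.
CREATE TABLE Receipt (
    ReceiptNo  INT      NOT NULL PRIMARY KEY,
    CustomerNo INT      NOT NULL REFERENCES Customer (CustomerNo),
    [Date]     DATETIME NULL
);

-- Bridge table: the compound primary key resolves the many-to-many
-- relationship between receipts and products; Quantity lives here
-- because it depends on the whole key.
CREATE TABLE ReceiptProduct (
    ReceiptNo INT NOT NULL REFERENCES Receipt (ReceiptNo),
    ProductNo INT NOT NULL REFERENCES Product (ProductNo),
    Quantity  INT NOT NULL,
    PRIMARY KEY (ReceiptNo, ProductNo)
);
```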


Resource: http://www.sqlservercentral.com/articles/Database+Design/72054/



Wednesday, January 19, 2011

How to centralize your SQL Server Event Logs


SQL Server running on Windows provides a wonderful feature many DBAs overlook: the Event Log. SQL Server automatically records all sorts of info about what it is doing into the Event Log. I am sure we have all read the Event Log looking for things like why a server rebooted unexpectedly, or to try to figure out why something is not working. But there is a lot of info in there that we may overlook, especially since most of it does not really apply to SQL.
There are many modern day tools that will make this function very easy, but they typically cost a lot of money. Not that I am opposed to spending the boss's money, but sometimes the boss will not let me. So I came up with a process that will get some important info about SQL into my hands without having to spend time on every single instance setting up individual alerts and SQL Agent jobs to find this information.

Step 1 - create the database

USE [MASTER]
GO
CREATE DATABASE [DBA]
GO

Step 2 - Create the Table

USE [DBA]
GO
-----------------------------------
SET ANSI_NULLS ON
GO
-----------------------------------
SET QUOTED_IDENTIFIER ON
GO
-----------------------------------
CREATE TABLE [dbo].[EventLogStaging](
[RecordNumber] [int] NOT NULL,
[Category] [int] NOT NULL,
[ComputerName] [nvarchar](250) NOT NULL,
[EventCode] [int] NOT NULL,
[EventType] [int] NOT NULL,
[Message] [nvarchar](4000) NULL,
[SourceName] [nvarchar](250) NOT NULL,
[TimeGenerated] [datetime] NULL,
[TimeWritten] [datetime] NOT NULL
) ON [PRIMARY]
-----------------------------------
GO

Step 3 - Create a VB script as EventLog2DB.vbs

strComputer = "."
Set objConn = CreateObject("ADODB.Connection")
Set objRS = CreateObject("ADODB.Recordset")
objConn.Open "Provider=SQLOLEDB.1;Data Source=.;" & _
    "Initial Catalog=DBA;Integrated Security=SSPI"

objRS.CursorLocation = 3
objRS.Open "SELECT * FROM EventLogStaging", objConn, 3, 3
' Get to the Event Log
Set objWMIService = GetObject("winmgmts:" _
    & "{impersonationLevel=impersonate}!\\" & strComputer & "\root\cimv2")
' get the events we want
query = "Select * from __InstanceCreationEvent" _
    & " where TargetInstance isa 'Win32_NTLogEvent'" _
    & " and TargetInstance.Logfile = 'Application'" _
    & " and (TargetInstance.EventType = 1 or TargetInstance.EventType = 2)" _
    & " and (TargetInstance.SourceName like 'MSSQL%')"
' get ready to insert into our DBA table
Set colMonitoredEvents = objWMIService.ExecNotificationQuery(query)
Do
    Set objLatestEvent = colMonitoredEvents.NextEvent
    objRS.AddNew
    objRS("RecordNumber") = objLatestEvent.TargetInstance.RecordNumber
    objRS("Category") = objLatestEvent.TargetInstance.Category
    objRS("ComputerName") = objLatestEvent.TargetInstance.ComputerName
    objRS("EventCode") = objLatestEvent.TargetInstance.EventCode
    objRS("EventType") = objLatestEvent.TargetInstance.EventType
    objRS("Message") = objLatestEvent.TargetInstance.Message
    objRS("SourceName") = objLatestEvent.TargetInstance.SourceName
    objRS("TimeGenerated") = WMIDateStringToDate(objLatestEvent.TargetInstance.TimeGenerated)
    objRS("TimeWritten") = WMIDateStringToDate(objLatestEvent.TargetInstance.TimeWritten)
    objRS.Update
Loop
' if we ever finish, we close cleanly.
objRS.Close
objConn.Close
Set objRS = Nothing
Set objConn = Nothing
'******************************************************************************
'* This conversion is necessary because WMI uses a different date/time format *
'******************************************************************************
Function WMIDateStringToDate(dtmInstallDate)
    WMIDateStringToDate = CDate(Mid(dtmInstallDate, 5, 2) & "/" & _
        Mid(dtmInstallDate, 7, 2) & "/" & Left(dtmInstallDate, 4) _
        & " " & Mid(dtmInstallDate, 9, 2) & ":" & _
        Mid(dtmInstallDate, 11, 2) & ":" & Mid(dtmInstallDate, 13, 2))
End Function

Hey, what's this VB script doing to my system? Is it safe?

strComputer = "."
Set objConn = CreateObject("ADODB.Connection")
Set objRS = CreateObject("ADODB.Recordset")
objConn.Open "Provider=SQLOLEDB.1;Data Source=.;" & _
    "Initial Catalog=DBA;Integrated Security=SSPI"
Here we are simply making a connection to the local instance on the server where this script is run. You can see that we are using integrated security and the database we created called DBA. The strComputer = "." is a variable that we pass later on in the script.

objRS.CursorLocation = 3
objRS.Open "SELECT * FROM EventLogStaging" , objConn, 3, 3
' Get to the Event Log
Set objWMIService = GetObject("winmgmts:" _
    & "{impersonationLevel=impersonate}!\\" & strComputer & "\root\cimv2")
The connection we made in the previous step opens the EventLogStaging table we created earlier. It also connects to the local Event Log service.


Here you see we connect to and read from the Event Log. We are only interested in the Application log; that is where SQL likes to put its info. We are only interested in event types 1 and 2, which are warnings and errors. The source name of MSSQL% will guarantee that we get the event logs of all the instances installed.


' get ready to insert into our DBA table 
Set colMonitoredEvents = objWMIService.ExecNotificationQuery(query)
Do
Set objLatestEvent = colMonitoredEvents.NextEvent
objRS.AddNew
objRS("RecordNumber") = objLatestEvent.TargetInstance.RecordNumber
objRS("Category") = objLatestEvent.TargetInstance.Category
objRS("ComputerName") = objLatestEvent.TargetInstance.ComputerName
objRS("EventCode") = objLatestEvent.TargetInstance.EventCode
objRS("EventType") = objLatestEvent.TargetInstance.EventType
objRS("Message") = objLatestEvent.TargetInstance.Message
objRS("SourceName") = objLatestEvent.TargetInstance.SourceName
objRS("TimeGenerated") = WMIDateStringToDate(objLatestEvent.TargetInstance.TimeGenerated)
objRS("TimeWritten") = WMIDateStringToDate(objLatestEvent.TargetInstance.TimeWritten)
objRS.Update
Loop

Step 4 - Create the SQL agent job to insert records in real time.

Pay attention to the line that says @command=N'cscript "E:\Monitor\EventLog2DB.vbs"',
That location E:\Monitor is where I placed the VB script we made. You can put yours wherever you want, but change that line to where you put it. Like C:\Monitor.
Also, just in case you do not run your SQL Server in mixed mode authentication, you will have to change the line that says @owner_login_name=N'sa', to a user that can run the job.
USE [msdb]
GO
------------------------------------------------------
BEGIN TRANSACTION
DECLARE @ReturnCode INT
SELECT @ReturnCode = 0
------------------------------------------------------
IF NOT EXISTS (SELECT name FROM msdb.dbo.syscategories WHERE name=N'[Uncategorized (Local)]' AND category_class=1)
BEGIN
EXEC @ReturnCode = msdb.dbo.sp_add_category @class=N'JOB', @type=N'LOCAL', @name=N'[Uncategorized (Local)]'
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
------------------------------------------------------
END
------------------------------------------------------
DECLARE @jobId BINARY(16)
EXEC @ReturnCode = msdb.dbo.sp_add_job @job_name=N'Monitor Event Log',
@enabled=1,
@notify_level_eventlog=0,
@notify_level_email=0,
@notify_level_netsend=0,
@notify_level_page=0,
@delete_level=0,
@description=N'No description available.',
@category_name=N'[Uncategorized (Local)]',
@owner_login_name=N'sa', @job_id = @jobId OUTPUT
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
------------------------------------------------------
EXEC @ReturnCode = msdb.dbo.sp_add_jobstep @job_id=@jobId, @step_name=N'always running',
@step_id=1,
@cmdexec_success_code=0,
@on_success_action=1,
@on_success_step_id=0,
@on_fail_action=2,
@on_fail_step_id=0,
@retry_attempts=0,
@retry_interval=0,
@os_run_priority=0, @subsystem=N'CmdExec',
@command=N'cscript "E:\Monitor\EventLog2DB.vbs"', 
@flags=0
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
EXEC @ReturnCode = msdb.dbo.sp_update_job @job_id = @jobId, @start_step_id = 1
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
EXEC @ReturnCode = msdb.dbo.sp_add_jobschedule @job_id=@jobId, @name=N'always',
@enabled=1,
@freq_type=64,
@freq_interval=0,
@freq_subday_type=0,
@freq_subday_interval=0,
@freq_relative_interval=0,
@freq_recurrence_factor=0,
@active_start_date=20100831,
@active_end_date=99991231,
@active_start_time=0,
@active_end_time=235959
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
EXEC @ReturnCode = msdb.dbo.sp_add_jobschedule @job_id=@jobId, @name=N'check every 1 minute',
@enabled=1,
@freq_type=4,
@freq_interval=1,
@freq_subday_type=4,
@freq_subday_interval=1,
@freq_relative_interval=0,
@freq_recurrence_factor=0,
@active_start_date=20100901,
@active_end_date=99991231,
@active_start_time=0,
@active_end_time=235959
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
EXEC @ReturnCode = msdb.dbo.sp_add_jobserver @job_id = @jobId, @server_name = N'(local)'
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
COMMIT TRANSACTION
GOTO EndSave
QuitWithRollback:
IF (@@TRANCOUNT > 0) ROLLBACK TRANSACTION
EndSave:
---------------------------------------------
GO

What it creates is a new SQL Server Agent job called Monitor Event Log. It is set to start every time the SQL Agent starts, and every 1 minute after that. Of course, when the SQL Agent sees the previous run still going each minute, it will just skip that run. Basically, we want our little VB script to always be running, and this job will do just that.
Step 5 - Test that the events are being inserted.

Open SSMS and run a query on the instance you are monitoring. Copy and paste the code into your window.

raiserror ('working great',16,1) with log
The 'raiserror' command will write an event to the Windows Application log.
SELECT * FROM [DBA].[dbo].[EventLogStaging]
order by TimeWritten desc
Well, so it's working. Great! Now I have another place to see event log info. So why do I really care about that? Let's not forget about another feature SQL has had for many years called "SQL Server event forwarding". Consider that you might have 50 instances of SQL under your watchful eye. With a quick adjustment to the SQL Server Agent properties, you can forward SQL events from the Event Log to the server where this job and VB script are running.

(To enable that option, go to SQL Server Agent -> Advanced -> check 'Forward events to a different server', and select the server name where the logs need to be inserted.)

Now I have a table with ALL my events in one place. I have used this method on over 150 instances forwarding their events to a central monitor server.

(Ref http://www.sqlservercentral.com/articles/Monitoring/71390/)