Saturday, August 27, 2016

SQL Saturday in Charlotte, NC. Sessions I want to visit.

Usually, I do my blog posts after visiting SQL Saturday events, but I want to change that tradition and talk about them before the actual event.

In this post I want to talk about the sessions I have a personal interest in, and I understand that it might drive people away from other sessions. However, it will definitely drive some audience in, and I hope those people will thank me for the additional information/suggestions/guidance.

So, let's start:
SQL Saturday #560, Charlotte, NC

Event date: September 17th, 2016.
Session schedule: http://www.sqlsaturday.com/560/Sessions/Schedule.aspx

Slot #1. 9:00 A.M. - 10:00 A.M.

In that slot there is ONLY ONE session I'm going to visit. That is my own session "Inside of Indexes". I'll be talking about:
- What indexes are;
- How SQL Server is using different types of indexes to retrieve data in the best possible way;
- What Page Splits are and how they impact performance;
and some more.
I hope my session will be very interesting, but I'm afraid that the people interested in indexing will split between my session and Jeffrey Garbus' session "Indexing for performance".

If I didn't have my session in the very first slot, I'd definitely go to Aaron Nelson's session "Achieve More in Less Time by Using the New SQL PowerShell".

Slot #2. 10:15 A.M. - 11:15 A.M.

In the second slot, the winner, without a doubt, is Adam Machanic's "SQL Server 2016 Hidden Gems". The room has to be big enough to accommodate all attendees. If you desperately want to attend his session, I'd suggest coming to the previous session in that room, Angela Henry's "Lightning Talks", to secure your seat.

Even though Adam Machanic will be the shining star of that slot, I won't go to his session because I've already attended it at another event.

The second session in priority is Rick Heiges' "Winning with Columnstore - A Customer Story". Since Microsoft nailed down columnstores in SQL Server 2016, and because of Rick's HIP (High Influential Power), that would be the session to visit.

Despite all the advantages of Rick's presentation, I won't go there. The trick is: I want to visit his session two weeks later at SQL Saturday #545 in Pittsburgh.
So, I'll have a tough choice between Rushabh Mehta's "Mastering and Enhancing Company Data" and William Assaf's "SQL Server Permissions and Security Principals".

Slot #3. 11:30 A.M. - 12:30 P.M.

The best and "must go" session in the third slot is Geoff Hiten's "What's your Imperative doing in my Declarative?". The name is not very descriptive, but that session can have an inventor's light-bulb effect on any beginner SQL developer who just graduated from college with most of their experience in Java and C#. I guarantee there will be an "AHA!" moment during that session that will shift your programming paradigm for life.

So, because I already know all of that, I won't go to that session.
Next in line is Tom Norman's "Encrypting Data within SQL Server". I'm from the DC region, and "Security" here is always the #1 priority.

Slot #4. 1:45 P.M. - 2:45 P.M.

The after-lunch slot. There will be a lot of other performance tuning topics I like, but my choice will be Brian Carrig's session "Managing Very Large Databases with SQL Server".

Slot #5. 3:00 P.M. - 4:00 P.M.


Kevin Feasel's "APPLY Yourself", about the art of using the APPLY operator, is supposed to be a very good session.
Another wonderful one is Jonathan Buller's "Tips and Tricks for the SSIS Catalog", about the new abilities for managing SSIS packages directly from SQL. That topic is not fully covered in the SQL community yet, and I'm waiting for the time when we will discuss parallel execution of multiple queries using the SSIS catalog, and SSIS packages handling their own scheduling by manipulating SSISDB.

Because it will be the last slot of sessions and everybody, including me, will feel pretty tired, I'll probably go and relax listening to "Replication v. AlwaysOn, the great debate", hosted by Chris Skorlinski and Curt Mathews. They encourage you to bring your own questions to that discussion, so I assume several very good blog posts might be delivered based on that conversation alone.

See you in Charlotte!
And don't hesitate to smile when I take your picture.

Wednesday, August 24, 2016

Four types of SQL Server Authentication

There can be four types of user authentication in SQL Server.
To illustrate this, I'll run the following query on my test box:
SELECT authentication_type, name, type_desc, authentication_type_desc
FROM sys.database_principals
WHERE principal_id between 5 and 16383;

It gives me the following result:

So, there are four authentication types:
- 0 - NONE;
- 1 - INSTANCE;
- 2 - DATABASE;
- 3 - WINDOWS;

#1 - INSTANCE and #3 - WINDOWS are the most common authentication types: they are simply SQL and Windows authentications.

Let's review the more exotic ones: #0 - NONE and #2 - DATABASE.

SQL Server User with "NONE" authentication:

"TestUser0" with NONE authentication is just a database user without SQL Server Login.
That means that user can't authenticate into a SQL Server. Because of that it can't be assigned to any person or an application using SQL Server.

That user account can be used within "EXECUTE AS" in functions, stored procedures, and triggers.
It can own schemas and hold privileges and permissions, which allows a restricted user to access objects and perform actions, via stored procedures or functions, that she/he is not directly granted to see or perform.

For example, I can try to select from a newly created table as user "TestUser1".

USE TestDB;
GO
SELECT 1 as ID INTO tbl_TestTable;
GO
EXECUTE AS USER = 'TestUser1' ;
GO 
SELECT * FROM tbl_TestTable;
GO
REVERT ;
GO 

I get the following error:
Msg 229, Level 14, State 5, Line 417
The SELECT permission was denied on the object 'tbl_TestTable', database 'TestDB', schema 'dbo'.

However, if I create a stored procedure with EXECUTE AS permissions for user "TestUser0" and grant access to that procedure to "TestUser1", then that user will be able to successfully see the content of the new table:
GRANT SELECT ON tbl_TestTable TO TestUser0;
GO
CREATE PROCEDURE usp_ViewTestTable
WITH EXECUTE AS 'TestUser0' AS
SELECT * FROM tbl_TestTable;
GO
GRANT EXECUTE ON usp_ViewTestTable TO TestUser1;
GO
EXECUTE AS USER = 'TestUser1';
GO 
EXEC usp_ViewTestTable;
GO
REVERT ;
GO 

Creating a user with NONE authentication is pretty simple: you just specify that you want to create a user without a SQL Server login:

USE TestDB;
GO
CREATE USER [TestUser0]  WITHOUT LOGIN;
GO
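
For completeness, the EXECUTE AS example above also assumes a regular login-based user "TestUser1"; here is a minimal sketch of creating it (the password is illustrative):

USE master;
GO
CREATE LOGIN [TestUser1] WITH PASSWORD = N'TestUser1#2016';
GO
USE TestDB;
GO
CREATE USER [TestUser1] FOR LOGIN [TestUser1];
GO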

SQL Server User with "DATABASE" authentication:

That is an absolutely unique type of authentication.
At first, when I tried to connect to SQL Server using "TestUser2", I got an error:

Then I went to the additional connection "Options":

And explicitly specified the name of the database I want to connect to:

That did the trick. I was able to connect and see my database:

Wow! I can see ONLY the one database I'm assigned to.

Then I tried to see what else is visible:
SELECT * FROM sys.databases;

And here is the list of databases I can really see:
Along with "TestDB" I'm exposed to "master" and "tempdb".
I can see data from tables in "master", I can create "#Temp" tables, but can write directly only to "TestDB".
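
A quick sketch of that behavior, run while connected as "TestUser2" (the probe table name is illustrative):

-- Visible: only "master", "tempdb", and "TestDB"
SELECT name FROM sys.databases;
-- Allowed: temporary objects in "tempdb"
CREATE TABLE #Scratch (ID INT);
DROP TABLE #Scratch;
-- Not allowed: writing outside of "TestDB"; for example, this fails:
-- SELECT 1 AS ID INTO master.dbo.tbl_Probe;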

Now, here is how to set up a "DATABASE" authenticated user. It is pretty simple: just specify a user name with a password in the current database:
USE TestDB;
GO
CREATE USER [TestUser2]  WITH PASSWORD = 'TestUser2';
GO

Actually, that is not all. Before you are able to set up a database-authenticated user, you have to switch your database into partial containment mode, and to do so, you have to enable contained database authentication at the server level first:
USE master
GO
sp_configure 'show advanced options', 1;
GO
RECONFIGURE
GO
sp_configure 'CONTAINED DATABASE AUTHENTICATION', 1;
GO
RECONFIGURE
GO
sp_configure 'show advanced options', 0 ;
GO
RECONFIGURE
GO
ALTER DATABASE TestDB SET CONTAINMENT = PARTIAL;
GO

Looks cool?
Yep, it is really cool that you can COMPLETELY separate a user from ANY OTHER databases on your server!!!

BUT. There is always a "BUT". There is no "Silver Bullet" and no "Free Cheese".
Partially contained databases cannot use replication, change data capture, change tracking, and some other features.

The good news: you are good to go with AlwaysOn.

Conclusion:

Contained database users have been available since SQL Server 2012. Because they can easily coexist with all other types of users and are highly restricted within the boundaries of one database, they can be a very good addition to your security tool set.

MSDN links for further research:

Contained Databases: https://msdn.microsoft.com/en-us/library/ff929071.aspx
Contained Database Users: https://msdn.microsoft.com/en-us/library/ff929188.aspx
Create a Database User: https://msdn.microsoft.com/en-us/library/aa337545.aspx

Tuesday, August 23, 2016

Shrinking a database file with the help of a "Spatial Fragmentation Report"

There are more than "many" questions on the internet from people who can't reclaim free unused space from their data files.

In this post I'll cover that problem using a "Spatial Fragmentation Report".

First, let's identify the problem:
USE [TestFF];
GO
SELECT
Name, physical_name,
    size/128. as "File Size, Mb",
    FILEPROPERTY(name, 'SpaceUsed')/128. as "Space Used, Mb"
FROM sys.database_files
WHERE file_id = 1;

As you can see, my database is 10 Gb and only 1.6 Gb is in use.

Let's try to fix that problem the conventional way, by shrinking the database and asking SQL Server to leave only 20% free space in its files:
DBCC SHRINKDATABASE('TestFF',20);

As the result, I got the following message:

DBCC SHRINKDATABASE: File ID 1 of database ID 5 was skipped because the file does not have enough free space to reclaim.
DBCC SHRINKDATABASE: File ID 2 of database ID 5 was skipped because the file does not have enough free space to reclaim.
DBCC execution completed. If DBCC printed error messages, contact your system administrator.

What is that??? I have 85% of free space in my File 1 !!! What is the problem?

Let's target the individual file and reduce its size to ~2 Gb:
DBCC SHRINKFILE('TestFF',2000);

After waiting for almost 2 minutes, I got the following error:
Msg 0, Level 11, State 0, Line 144
A severe error occurred on the current command.  The results, if any, should be discarded.

Then I tried to reduce the file size by whatever it could give up:
DBCC SHRINKFILE('TestFF');
This time, almost immediately, I got the following result:

Checking the size again shows very strange behavior: the size of the file has not been reduced, but the amount of USED space dropped by 300 Mb:

Have you ever been in the same situation, when no matter what you do, you can't shrink your database?

So, now it's time to see WHY SQL Server can't shrink our file.
I use the following script to visualize object allocations/fragmentation in a data file:

IF Object_ID('tempdb..#Used_Pages_List') IS NOT NULL
 DROP TABLE #Used_Pages_List;
GO
-- Capture every allocated page in the current database (undocumented DMF)
SELECT object_id, allocated_page_page_id as page_id INTO #Used_Pages_List
FROM sys.dm_db_database_page_allocations(DB_ID(), NULL, NULL, NULL, NULL) as a
WHERE partition_id = 1
GO
CREATE CLUSTERED INDEX #CLIX_Used_Pages_List ON #Used_Pages_List(object_id,page_id)
GO
DECLARE @i INT = 0; -- current object_id in the loop below
DECLARE @m INT = (SELECT MAX(object_id) FROM #Used_Pages_List);
DECLARE @g TABLE(Alloc GEOMETRY, Table_Object SYSNAME, object_id INT);
-- Scale factors derived from the size of data file 1
DECLARE @LineMegabytes INT = (SELECT Size/(64000)+1 FROM sys.database_files WHERE file_id = 1);
DECLARE @d INT = @LineMegabytes; -- pages drawn per output row
DECLARE @DBSize INT = (SELECT Size FROM sys.database_files WHERE file_id = 1) / @d + 1
DECLARE @t TABLE(ID INT, MessageText VARCHAR(100));

-- Legend explaining how to read the picture
INSERT INTO @t(ID,MessageText) VALUES
(1,'Scale: 1 Square 10x10 = 1 Page (8 Kb)'),
(2,'Row: ' + CAST(@LineMegabytes as VARCHAR) + ' Mb'),
(3,'Vertical: 100 points = ' + CAST(10*@LineMegabytes as VARCHAR) + ' Mb')

SELECT * FROM @t ORDER BY ID;

-- For every object, draw each of its pages as a 10x10 square:
-- X = position of the page within its row, Y = row number (negative, so the file reads top-down)
WHILE @i < @m
BEGIN
 SELECT @i = MIN(object_id) FROM #Used_Pages_List
 WHERE @i < object_id

 PRINT CONVERT(VARCHAR,@i) 

 INSERT INTO @g(object_id, Table_Object, Alloc) 
 SELECT @i, '[' + Object_Schema_Name(@i)+ '].[' + Object_Name(@i) + ']'
  , CONVERT(GEOMETRY,'POLYGON(' + SUBSTRING( (
  SELECT 
   ',(' 
   + CONVERT(VARCHAR, (page_id % @d) * 10) + ' ' + CONVERT(VARCHAR, (page_id / -@d) * 10 ) + ','
   + CONVERT(VARCHAR, (page_id % @d+1) * 10) + ' ' + CONVERT(VARCHAR, (page_id / -@d) * 10 ) + ','
   + CONVERT(VARCHAR, (page_id % @d+1) * 10) + ' ' + CONVERT(VARCHAR, (page_id / -@d) * 10 - 10) + ','
   + CONVERT(VARCHAR, (page_id % @d) * 10) + ' ' + CONVERT(VARCHAR, (page_id / -@d) * 10 - 10) + ','
   + CONVERT(VARCHAR, (page_id % @d) * 10) + ' ' + CONVERT(VARCHAR, (page_id / -@d) * 10 ) + ')'
  FROM #Used_Pages_List
  WHERE @i = object_id
  FOR XML PATH ('')
 ),2,@@TEXTSIZE) + ')');
END

-- Output each object's allocation map plus a line marking the end of the file
SELECT object_id, Table_Object, Alloc FROM @g
UNION ALL
SELECT 0, 'Database Size', CONVERT(GEOMETRY,'LINESTRING(0 ' 
 + CONVERT(VARCHAR, @DBSize * -10) + ', ' + CONVERT(VARCHAR, @d * 10) + ' ' + CONVERT(VARCHAR, @DBSize * -10) + ')')
ORDER BY object_id
GO

Here is the result for my database:
As you can see, I definitely have a lot of free space, but my data is spread so widely across the file, all the way to its border, that there is no way to make the file smaller.

If we zoom in at the very tail, we can figure out the names of the tables at the very end of the file that prevent it from shrinking:

After I performed an index rebuild for all user tables, the page allocation picture changed to this:
As you can see, most of the tables were moved to the beginning of the database, but some are still wandering somewhere in the middle. Let's zoom in and figure out which tables might prevent shrinking of the file:
Unfortunately, it is a system table, and we can do nothing about it.
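
For reference, here is a minimal sketch of the rebuild step I used: it generates ALTER INDEX ... REBUILD for every indexed user table in the current database (tables with no indexes at all are skipped, since ALTER INDEX cannot rebuild a bare heap):

DECLARE @cmd NVARCHAR(MAX) = N'';
SELECT @cmd += N'ALTER INDEX ALL ON ' + QUOTENAME(s.name) + N'.' + QUOTENAME(t.name) + N' REBUILD;' + NCHAR(13)
FROM sys.tables AS t
INNER JOIN sys.schemas AS s ON s.schema_id = t.schema_id
WHERE t.is_ms_shipped = 0
  AND EXISTS (SELECT 1 FROM sys.indexes AS i WHERE i.object_id = t.object_id AND i.index_id > 0);
EXEC sp_executesql @cmd;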

However, let's try to shrink the file again:
DBCC SHRINKFILE('TestFF',2000);

This time I was luckier: not only did I not get an error, but my file actually shrank:


Here is how the fragmentation looked after the shrink:

The main conclusion of this exercise: if you struggle with shrinking your database, look at your index fragmentation; it can cause errors and the inability to shrink files.

Monday, August 22, 2016

How to Copy a file using SQL Server in three easy steps

It might happen that you can't access the SQL Server's shared drives, but you desperately need to copy a file from the server to another location.

Here is the solution I have:

Step 1. Reading a file.

First, you have to read the file you want to copy into SQL Server. You have to choose a database for that action: it can be a test database, a new database created just for this purpose, or even TempDB. There are only two requirements for the database:
- It must not be a production database;
- It should have enough space to accommodate the file you want to copy.

So, as the first step in copying, we create a table in TempDB and read the backup file into that table:

USE tempdb
GO
DROP TABLE IF EXISTS tbl_StoredFile;
GO
CREATE TABLE
tbl_StoredFile(StoredFile VARBINARY(MAX));
GO
INSERT INTO tbl_StoredFile(StoredFile) SELECT BulkColumn
FROM OPENROWSET(BULK 'C:\Backup\AdventureWorks2014.bak', SINGLE_BLOB) AS x;
GO
sp_spaceused tbl_StoredFile;
GO 


Step 2. Extract a File.

From this point you leave SSMS and run shell commands.
First press Win+R, type "CMD", and press Enter; that opens a new "CMD.EXE" window.
Choose a folder on your computer where you will extract your file.
You also have to provide the SQL Server name and credentials:
bcp "SELECT StoredFile FROM tempdb.dbo.tbl_StoredFile;" queryout "L:\Temp\AdventureWorks2014.bak" -S <ServerName> -U <UserName>
- After you fill in the server name and user name and press Enter, the BCP utility will ask for your password.
- Then it will ask "Enter prefix-length of field StoredFile [8]:" - the default is "8", but you have to enter "0" and press Enter.
- Then it will ask "Enter length of field StoredFile [0]:" - just press Enter;
- Then it will ask "Enter field terminator [none]:" - just press Enter;
- Then it will ask whether you want to save this format information - answer "n" and press Enter.

As a result, the file will be extracted to your file system.
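
If you need to run the export without answering the prompts (for instance, from a script), the same answers can be saved in a bcp non-XML format file. Here is a sketch mirroring the prompt answers above; the file name "blob.fmt" is illustrative, and the first line is the bcp version (e.g. 13.0 for SQL Server 2016):

13.0
1
1       SQLBINARY       0       0       ""       1     StoredFile       ""

The export then becomes a single command:
bcp "SELECT StoredFile FROM tempdb.dbo.tbl_StoredFile;" queryout "L:\Temp\AdventureWorks2014.bak" -S <ServerName> -U <UserName> -f blob.fmt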

Step 3. Cleanup.

Just cleanup the space in your database by dropping the table:
USE tempdb
GO
DROP TABLE tbl_StoredFile;

As you can see, it is extremely easy to copy files using SQL Server; you just need enough free space in a database to load the file.

Tuesday, August 16, 2016

Redirecting Linked Server

Have you ever been in a situation where all your SQL code refers to linked server XYZ and now you have to migrate it to linked server ABC?

What do you have to do? Rewrite all stored procedures, functions, views, and triggers that use the old server name?

Not necessarily. You can redirect your old linked server name to another location.

Say you have the following linked server:
SELECT * FROM sys.servers
WHERE server_id > 0;

In my example, the linked server is named by IP address, "192.168.1.103", and points to the same SQL Server host.

If we want to redirect that linked server to another SQL Server host, we can do the following:
EXEC sp_setnetname
    @server = N'SQL2016_01', -- Original Linked Server Name
    @netname = N'SQL2016_02'; -- Newly redirected SQL Server

After running that statement we will see the following picture:

That means ALL of our code that uses the OLD linked server name now points to the new SQL Server.
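
A quick way to verify where the name now points is to ask the remote side for its @@SERVERNAME through the redirected linked server (a sketch):

SELECT * FROM OPENQUERY([SQL2016_01], 'SELECT @@SERVERNAME AS ActualHost');
-- Expected result after the redirect: SQL2016_02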

Be very careful with that tool. You can potentially create a cross-reference like this:

Linked Server "SQL2016_01" is now referring to actual SQL Server "SQL2016_02" and Linked Server "SQL2016_02" is now referring to actual SQL Server "SQL2016_01".
 That is complete mess which can lead not only to loss/discrepancy of the data, but easy loss of a job.

Monday, August 15, 2016

Seven steps to Welcome a Consultant.

In this post I want to talk about the hard life of SQL Server consultants who are hired by companies to perform SQL Server performance troubleshooting, and about their interactions with the SQL DBA and the sysadmin.

This should serve as a guideline for SQL DBAs on how to set up access for temporarily hired contractors.

Step #1. Mr. Consultant comes to the office. The sysadmin creates an AD account so he can log in to the network. The SQL DBA creates a login for the consultant to log in to the server.

If SQL Server allows "SQL Server Authentication", the SQL DBA can create a local SQL login for the new user:
USE [master]
GO
CREATE LOGIN [MrConsultant] WITH PASSWORD=N'MrConsultant20160815' MUST_CHANGE,
DEFAULT_DATABASE=[tempdb], CHECK_EXPIRATION=ON, CHECK_POLICY=ON 
GO
Just note that the default database is "tempdb", not "master". That is a good practice for any new SQL login account.
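
If a login already exists with "master" as its default, that is a one-line fix (a sketch):

ALTER LOGIN [MrConsultant] WITH DEFAULT_DATABASE = [tempdb];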

If "SQL Server Authentication" is not allowed, then DBA can create new Login based on AD account:
USE [master]
GO
CREATE LOGIN [Domain\MrConsultant] FROM WINDOWS WITH DEFAULT_DATABASE=[tempdb]
GO

First, Mr. Consultant wants to look through the SQL Server error log:
EXEC sys.xp_readerrorlog;
However, it generates the following error:
Msg 229, Level 14, State 5, Procedure xp_readerrorlog, Line 1 [Batch Start Line 39]
The EXECUTE permission was denied on the object 'xp_readerrorlog', database 'mssqlsystemresource', schema 'sys'.


In order to grant the right to see the error log, the DBA has to create a user in the "master" database and grant the right to execute "xp_readerrorlog".

Step #2. Giving rights to execute "xp_readerrorlog".

USE [master]
GO
CREATE USER [MrConsultant] FOR LOGIN [MrConsultant]
GO
GRANT EXECUTE ON xp_readerrorlog TO [MrConsultant]
GO

Then the consultant tries to execute several commands to get an idea of what is going on on the server:
SELECT * FROM sys.dm_exec_connections
SELECT TOP 10 * FROM sys.dm_exec_cached_plans
SELECT TOP 10 * FROM sys.dm_exec_query_stats
SELECT * FROM sys.dm_io_virtual_file_stats(NULL, NULL)
SELECT * FROM sys.dm_server_services
SELECT * FROM sys.dm_os_sys_info
SELECT * FROM sys.dm_os_windows_info
SELECT * FROM sys.dm_os_buffer_descriptors
SELECT * FROM sys.dm_os_sys_memory
SELECT * FROM sys.dm_os_process_memory
SELECT * FROM sys.dm_os_performance_counters
However, he/she immediately gets the following error for each of these selects:
Msg 300, Level 14, State 1, Line 3
VIEW SERVER STATE permission was denied on object 'server', database 'master'.
Msg 297, Level 16, State 1, Line 3
The user does not have permission to perform this action.


So, SQL DBA has to solve the problem.

Step #3. Giving "VIEW SERVER STATE" permissions.

To give "VIEW SERVER STATE" permissions DBA has to execute following statement:
USE [master]
GO
GRANT VIEW SERVER STATE TO [MrConsultant];
GO

To see the list of databases, their sizes, file names, and locations, the consultant has to run the following query:

SELECT * FROM sys.master_files
However, nothing comes back.
The DBA has to grant permission to see the databases' definitions:

Step #4. Granting databases' definition access.

USE [master]
GO
GRANT VIEW ANY DEFINITION TO [MrConsultant];
GO

The main production database in the system is "AdventureWorks2014". The consultant tries to access that database, but gets the following error:
Msg 916, Level 14, State 1, Line 20
The server principal "MrConsultant" is not able to access the database "AdventureWorks2014" under the current security context.


So, the DBA has to permit the consultant to access the database.

Step #5. Giving database access.

The administrator has to create a user for the consultant in the database:

USE [AdventureWorks2014]
GO
CREATE USER [MrConsultant] FOR LOGIN [MrConsultant]
GO

In order to troubleshoot a query, Mr. Consultant runs the following statement:
DBCC SHOW_STATISTICS('Person.Person','PK_Person_BusinessEntityID');
And gets the following errors:
Msg 229, Level 14, State 5, Line 34
The SELECT permission was denied on the object 'Person', database 'AdventureWorks2014', schema 'Person'.
Msg 2557, Level 16, State 7, Line 34
User 'MrConsultant' does not have permission to run DBCC SHOW_STATISTICS for object 'Person.Person'.


The consultant needs to read objects within the database.

Step #6. Granting Selection access.

The administrator has to grant "SELECT" rights to the consultant in the database:
USE [AdventureWorks2014]
GO
GRANT SELECT TO [MrConsultant]
GO
SORRY, the contractor has to be able to see your underlying data. If you have to keep the data secured, encrypt it.

Now Mr. Consultant wants to see the actual query plan generated by a query:
SELECT * FROM Person.Person;

And gets an error:
Msg 262, Level 14, State 4, Line 30
SHOWPLAN permission denied in database 'AdventureWorks2014'.

The DBA has to allow him to see an execution plan.

Step #7. Granting Showplan access.

USE [AdventureWorks2014]
GO
GRANT SHOWPLAN TO [MrConsultant]
GO


From this point, the consultant has most of the rights needed for performance and error troubleshooting.
He/she does not have any rights to:
- Insert/Update/Delete any record;
- Create/Alter/Drop any database object;
- Create/Drop Login or User;
- Grant/Deny/Revoke any permissions;
- Run DBCC commands such as "CHECKDB", "FREEPROCCACHE", "DROPCLEANBUFFERS", "PAGE", Etc.

That is the minimal level of trust you can maintain as a DBA with an external contractor while still allowing him/her to perform the required research.
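
If you want to double-check what the consultant actually ended up with, here is a sketch using sys.fn_my_permissions while impersonating the new principal:

EXECUTE AS LOGIN = 'MrConsultant';
SELECT * FROM sys.fn_my_permissions(NULL, 'SERVER');    -- server-level permissions
REVERT;
GO
USE [AdventureWorks2014]
GO
EXECUTE AS USER = 'MrConsultant';
SELECT * FROM sys.fn_my_permissions(NULL, 'DATABASE');  -- database-level permissions
REVERT;
GO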

If the contractor is asked to make any changes, the DBA can assist with object creation or grant individual permissions on the specific objects.

At the end, let's combine all the scripts into one. If the contractor needs access to more than one database, repeat the last three steps for every database he/she needs to access:

USE [master]
GO
CREATE USER [MrConsultant] FOR LOGIN [MrConsultant];

GO
GRANT EXECUTE ON xp_readerrorlog TO [MrConsultant];

GO
GRANT VIEW SERVER STATE TO [MrConsultant];
GO
GRANT VIEW ANY DEFINITION TO [MrConsultant];
GO
USE [AdventureWorks2014]
GO
CREATE USER [MrConsultant] FOR LOGIN [MrConsultant];
GO
GRANT SELECT TO [MrConsultant];
GO
GRANT SHOWPLAN TO [MrConsultant];
GO

Step #Last. After the consultant is done...

After the consultant has helped your company solve a problem, do not forget to clean up after him/her:
USE [AdventureWorks2014]
GO
DROP USER [MrConsultant];
GO
USE [master]
GO
DROP USER [MrConsultant];
GO
DROP LOGIN [MrConsultant];
GO



Monday, August 8, 2016

Four ways to filter the results of the fn_dblog function.

Some of you might know/use the undocumented SQL Server function "fn_dblog" to look at the SQL Server database log.

The syntax of its usage is:

SELECT * FROM fn_dblog (NULL, NULL);

That is pretty easy to use if your database is in the SIMPLE recovery model and you are just doing some testing/research, but what if you are looking at a production database with a huge log file?

Here are 4 ways to filter the results to make research and troubleshooting more manageable:

Solution #1. Copy Log file content into tempdb table or into another database:

SELECT * INTO #tbl_DBLog
FROM fn_dblog (NULL, NULL);
GO
SELECT * INTO tempdb.dbo.tbl_DBLog
FROM fn_dblog (NULL, NULL);
GO
SELECT * INTO TestDB.dbo.tbl_DBLog
FROM fn_dblog (NULL, NULL);
GO

In this sample code I copy the content into a temp table, a permanent table in tempdb, and a permanent table in TestDB. When doing this, be aware of the following:
1. Never copy transaction log data into the same database you are researching. By doing that you will change the log.
2. Be aware of the size of the data. It might take a long while to extract the whole log, and the amount of space required for the table can exceed the space used by the log file itself.

If you go this route, you can easily select the needed data using a WHERE clause.
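
For example, once the log is copied, a filter like this runs against the staging table and no longer touches the live log (the operation and allocation unit values are illustrative):

SELECT [Current LSN], [Operation], [Context], [AllocUnitName], [Transaction Name]
FROM tempdb.dbo.tbl_DBLog
WHERE [Operation] = 'LOP_INSERT_ROWS'      -- only row inserts
  AND [AllocUnitName] LIKE 'dbo.%';        -- only objects in the dbo schema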

Solution #2. Extract only needed columns.

Function "fn_dblog" returns about 130 columns. It is obvious you do not need all of them.
You can limit an output by selecting only certain columns:
SELECT [Current LSN]
      ,[Operation]
      ,[Context]
      ,[AllocUnitName]
      ,[Transaction Name]
      ,[Description]
      ,[Begin Time]
      ,[End Time]
      ,[Prepare Time]
      ,[Lock Information]
      ,[Checkpoint Begin]
      ,[Checkpoint End]
      ,[RowLog Contents 0]
      ,[RowLog Contents 1]
      ,[RowLog Contents 2]
      ,[RowLog Contents 3]
      ,[RowLog Contents 4]
      ,[RowLog Contents 5] 
FROM fn_dblog (NULL, NULL);

However, if you have millions and millions of records in the log file, this still might not be an appropriate solution.

Solution #3. Search for particular operation.

For instance, if you want to search for any data changes in your database, you can run the following script:
SELECT [Current LSN]
      ,[Transaction Name]
      ,[Begin Time]
FROM fn_dblog (NULL, NULL) 
WHERE [Transaction Name] in ('INSERT','DELETE','UPDATE');

In my test DB I got the following:
In that smaller data set you can locate a possible transactional troublemaker, but in order to get the details, you would still need to extract the WHOLE log file content.

Here comes my last script, which extracts ONLY the needed LSNs:

Solution #4. Extract only needed LSNs.

Say we need to extract the information associated with an "UPDATE" for LSNs starting at "0000004f:00000087:0001". You can just specify the starting and ending LSNs as "fn_dblog" parameters:
DECLARE
    @SLSNA BIGINT = 0x4f,
    @SLSNB BIGINT = 0x87,
    @SLSNC BIGINT = 0x1,
    @ELSNA BIGINT = 0x4f,
    @ELSNB BIGINT = 0x88,
    @ELSNC BIGINT = 0x1
DECLARE
    @SLSN BIGINT = (@SLSNA * 10000000000 + @SLSNB) * 100000 + @SLSNC,
    @ELSN BIGINT = (@ELSNA * 10000000000 + @ELSNB) * 100000 + @ELSNC
SELECT [Current LSN]
      ,[Operation]
      ,[Context]
      ,[AllocUnitName]
      ,[Transaction Name]
      ,[Begin Time]
      ,[End Time]
      ,[Lock Information]
FROM fn_dblog (@SLSN,@ELSN);

That portion of code returns ONLY the log records between LSNs "0000004f:00000087:0001" and "0000004f:00000088:0001".
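
If you'd rather not convert the hex parts of an LSN by hand, here is a minimal sketch that parses a "Current LSN" string into the decimal form used above (it assumes SQL Server 2008 or later, where CONVERT style 1 understands "0x..." hex strings):

DECLARE @LSN VARCHAR(22) = '0000004f:00000087:0001';
DECLARE
    -- split the string on ':' and convert each hex part to a number
    @A BIGINT = CONVERT(BIGINT, CONVERT(VARBINARY(4), '0x' + PARSENAME(REPLACE(@LSN, ':', '.'), 3), 1)),
    @B BIGINT = CONVERT(BIGINT, CONVERT(VARBINARY(4), '0x' + PARSENAME(REPLACE(@LSN, ':', '.'), 2), 1)),
    @C BIGINT = CONVERT(BIGINT, CONVERT(VARBINARY(2), '0x' + PARSENAME(REPLACE(@LSN, ':', '.'), 1), 1));
-- same formula as in the script above; the result can be passed to fn_dblog directly
SELECT (@A * 10000000000 + @B) * 100000 + @C AS Decimal_LSN;  -- 79000000013500001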

That means you do not have to read the entire log into a temp or staging table, and you do not have to scan the entire log again and again to do your research.