LogMiner Benefits
1. Pinpointing when a logical corruption of a database, such as an error made at the application level, may have begun. Such errors might include deleting the wrong rows because of incorrect values in a WHERE clause, updating rows with incorrect values, dropping the wrong index, and so forth.
2. Determining what actions you would have to take to perform fine-grained recovery at the transaction level. If you fully understand and take into account existing dependencies, it may be possible to perform a table-specific undo operation to return the table to its original state. This is achieved by applying table-specific reconstructed SQL statements that LogMiner provides in the reverse order from which they were originally issued.
3. Performance tuning and capacity planning through trend analysis. You can determine which tables get the most updates and inserts.
4. Performing postauditing. LogMiner can be used to track any data manipulation language (DML) and data definition language (DDL) statements executed on the database, the order in which they were executed, and who executed them (see the example query after this list).
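For example, once a LogMiner session has been started (as shown in the configuration examples below), the reconstructed statements can be read from the V$LOGMNR_CONTENTS view: SQL_REDO shows each statement as it was executed and SQL_UNDO the statement that reverses it. A minimal sketch; the HR.EMPLOYEES filter is only an illustrative assumption:
SQL> -- Who changed HR.EMPLOYEES, what they ran, and the statement that undoes it
SQL> SELECT USERNAME, OPERATION, TIMESTAMP, SQL_REDO, SQL_UNDO
     FROM V$LOGMNR_CONTENTS
     WHERE SEG_OWNER = 'HR' AND TABLE_NAME = 'EMPLOYEES';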
LogMiner Configuration
1. The source database is the database that produces all the redo log files that you want LogMiner to analyze.
2. The mining database is the database that LogMiner uses when it performs the analysis.
3. The LogMiner dictionary allows LogMiner to provide table and column names, instead of internal object IDs, when it presents the redo log data that you request.
4. The redo log files contain the changes made to the database or database dictionary; a sketch of registering them with LogMiner follows the note below.
Note: Before you begin using LogMiner, it is important to understand how LogMiner works with the LogMiner dictionary file (or files) and redo log files.
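As a sketch of how the redo log files are handed to LogMiner, each file is registered with the DBMS_LOGMNR.ADD_LOGFILE procedure before the session is started; the file names below are illustrative assumptions only:
SQL> -- The first file starts a new list; subsequent files are added to it
SQL> EXECUTE DBMS_LOGMNR.ADD_LOGFILE(-
LOGFILENAME => 'c:\oracle\oradata\orcl\redo01.log', OPTIONS => DBMS_LOGMNR.NEW);
SQL> EXECUTE DBMS_LOGMNR.ADD_LOGFILE(-
LOGFILENAME => 'c:\oracle\oradata\orcl\redo02.log', OPTIONS => DBMS_LOGMNR.ADDFILE);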
LogMiner Dictionary Options
LogMiner gives you three options for supplying the dictionary:
1. Using the Online Catalog
Benefits
1. To direct LogMiner to use the dictionary currently in use for the database, specify the online catalog as your dictionary source.
2. In addition to using the online catalog to analyze online redo log files, you can use it to analyze archived redo log files, if you are on the same system that generated the archived redo log files.
Requirements
1. The online catalog option requires that the database be open.
Example:
SQL> EXECUTE DBMS_LOGMNR.START_LOGMNR(-
OPTIONS => DBMS_LOGMNR.DICT_FROM_ONLINE_CATALOG);
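When the analysis is finished, the LogMiner session is closed with DBMS_LOGMNR.END_LOGMNR (this applies to all of the dictionary options); a minimal sketch:
SQL> EXECUTE DBMS_LOGMNR.END_LOGMNR();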
2. Extracting a LogMiner Dictionary to the Redo Log Files
Requirements
1. To extract a LogMiner dictionary to the redo log files, the database must be open and in ARCHIVELOG mode and archiving must be enabled. While the dictionary is being extracted to the redo log stream, no DDL statements can be executed.
2. You can find out which redo log files contain the start and end of an extracted dictionary:
SQL> SELECT NAME FROM V$ARCHIVED_LOG WHERE DICTIONARY_BEGIN='YES';
SQL> SELECT NAME FROM V$ARCHIVED_LOG WHERE DICTIONARY_END='YES';
Example:
SQL> EXECUTE DBMS_LOGMNR_D.BUILD( -
OPTIONS=> DBMS_LOGMNR_D.STORE_IN_REDO_LOGS);
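When the session is later started, LogMiner can be directed to read the dictionary from the redo log files that have been registered with it; a minimal sketch:
SQL> EXECUTE DBMS_LOGMNR.START_LOGMNR(-
OPTIONS => DBMS_LOGMNR.DICT_FROM_REDO_LOGS);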
3. Extracting the LogMiner Dictionary to a Flat File
Requirements
1. Be sure that no DDL operations occur while the dictionary is being built.
2. Set the initialization parameter UTL_FILE_DIR in the initialization parameter file.
SQL> alter system set UTL_FILE_DIR = 'c:\logminer' scope=spfile;
System altered.
SQL> -- Physically create the "logminer" folder on the c:\ drive
SQL> startup force
ORACLE instance started.
Total System Global Area 285212672 bytes
Fixed Size 789060 bytes
Variable Size 95155644 bytes
Database Buffers 188743680 bytes
Redo Buffers 524288 bytes
Database mounted.
Database opened.
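With the UTL_FILE_DIR directory in place, the dictionary can be extracted to a flat file with DBMS_LOGMNR_D.BUILD; a minimal sketch, where the file name dictionary.ora is only an assumed example:
SQL> EXECUTE DBMS_LOGMNR_D.BUILD('dictionary.ora', -
'c:\logminer', -
DBMS_LOGMNR_D.STORE_IN_FLAT_FILE);
The resulting file is later supplied to DBMS_LOGMNR.START_LOGMNR through its DICTFILENAME parameter.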
Supplemental Logging
Redo log files are generally used for instance recovery and media recovery. The data needed for such operations is automatically recorded in the redo log files. However, a redo-based application may require that additional columns be logged in the redo log files. The process of logging these additional columns is called supplemental logging.
There are two levels of supplemental logging.
Database-Level Supplemental Logging
Minimal Supplemental Logging
SQL> ALTER DATABASE ADD SUPPLEMENTAL LOG DATA;
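Whether minimal supplemental logging is already enabled can be verified in V$DATABASE; a quick sketch:
SQL> SELECT SUPPLEMENTAL_LOG_DATA_MIN FROM V$DATABASE;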
Table-Level Supplemental Logging
Table-Level Identification Key Logging
SQL> ALTER TABLE HR.EMPLOYEES ADD SUPPLEMENTAL LOG DATA (ALL) COLUMNS;
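Identification key logging can also be limited to key columns rather than all columns, and the resulting log groups can be checked in DBA_LOG_GROUPS; a minimal sketch using the same HR.EMPLOYEES table:
SQL> ALTER TABLE HR.EMPLOYEES ADD SUPPLEMENTAL LOG DATA (PRIMARY KEY) COLUMNS;
SQL> SELECT LOG_GROUP_NAME, LOG_GROUP_TYPE FROM DBA_LOG_GROUPS WHERE TABLE_NAME = 'EMPLOYEES';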