Analyzing Oracle Archived Logs



Applicable scenario:

The database generated a large volume of archived logs during a certain period, and you want to find out exactly what operations were performed on the data during that time.
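
Before mining, it can help to confirm which hours actually produced the spike in archived logs. A minimal sketch, using only v$archived_log (the same view queried in step I below):

-- count of archived logs and total size per hour over the last two days
select trunc(first_time, 'HH24') as hour,
       count(*) as logs,
       round(sum(blocks * block_size) / 1024 / 1024) as mb
  from v$archived_log
 where first_time > sysdate - 2
 group by trunc(first_time, 'HH24')
 order by 1;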


I. Check whether archived logs exist for the target time window

set pagesize 900;

set linesize 900;

col name for a55;

ALTER SESSION SET NLS_DATE_FORMAT = 'YYYY-MM-DD HH24:MI:SS';

select name, first_time, next_time, sequence#
  from v$archived_log
 where first_time > to_date('2021-11-15 16:00:00', 'yyyy-mm-dd hh24:mi:ss')
   and first_time < to_date('2021-11-15 16:30:00', 'yyyy-mm-dd hh24:mi:ss');


II. Add the archived log files

EXECUTE DBMS_LOGMNR.ADD_LOGFILE('/home/u01/arch/1_253871_1013686939.dbf',dbms_logmnr.new);

EXECUTE DBMS_LOGMNR.ADD_LOGFILE('/home/u01/arch/1_253872_1013686939.dbf',dbms_logmnr.addfile);

EXECUTE DBMS_LOGMNR.ADD_LOGFILE('/home/u01/arch/1_253873_1013686939.dbf',dbms_logmnr.addfile);
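
When the window spans many logs, writing each ADD_LOGFILE call by hand is tedious (see the long list in the example in section VII). A sketch that generates the calls from v$archived_log for the same window; review the generated lines before running them, and switch the first call to dbms_logmnr.new:

select 'EXECUTE DBMS_LOGMNR.ADD_LOGFILE(''' || name || ''',dbms_logmnr.addfile);'
  from v$archived_log
 where first_time > to_date('2021-11-15 16:00:00', 'yyyy-mm-dd hh24:mi:ss')
   and first_time < to_date('2021-11-15 16:30:00', 'yyyy-mm-dd hh24:mi:ss')
 order by sequence#;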


III. Start LogMiner for analysis

execute dbms_logmnr.start_logmnr;

Other conditions can also be specified, such as a time range:

execute dbms_logmnr.start_logmnr(StartTime => to_date('2021-11-15 16:00:00','YYYY-MM-DD HH24:MI:SS'), EndTime => to_date('2021-11-15 16:30:00','YYYY-MM-DD HH24:MI:SS'));
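
Without a dictionary, SQL_REDO may show objects as OBJ# numbers and columns as raw values. If the source database is open, LogMiner can resolve names from the online catalog; a sketch (objects dropped or altered since the logs were written will still appear as IDs):

execute dbms_logmnr.start_logmnr(options => dbms_logmnr.dict_from_online_catalog);

The time and dictionary options can be combined in a single call if both are needed.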


IV. Output the results

The results can be spooled to a file:

spool test.txt       -- write the results to test.txt in the current directory

select sql_redo,info from v$logmnr_contents where UPPER(sql_redo) like '%UPDATE%' or UPPER(sql_redo) like '%INSERT%' or UPPER(sql_redo) like '%DELETE%';   -- extract DML statements

spool off;
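
Filtering on the SQL text can also match statements that merely contain one of the keywords (for example an UPDATE touching a column named DELETED_FLAG). V$LOGMNR_CONTENTS records the operation type directly, so an equivalent, more precise filter is:

select sql_redo, info
  from v$logmnr_contents
 where operation in ('INSERT', 'UPDATE', 'DELETE');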



The results can also be saved into a table. (If the log volume is large, do not create this temporary table.)

create table temp_result1 as  select * from v$logmnr_contents;
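
With the table in place, a quick aggregate shows which segments received the most DML; a sketch against the temp_result1 table created above:

select seg_owner, seg_name, operation, count(*) as cnt
  from temp_result1
 group by seg_owner, seg_name, operation
 order by cnt desc;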


V. End the LogMiner session

execute dbms_logmnr.end_logmnr;


VI. Analyze the results

Analyze the data in test.txt or in the temp_result1 table.

If the output shows an object ID but not the object name, the corresponding name can be looked up with:

select owner, object_name from dba_objects where object_id = xxxxx;   -- replace xxxxx with the object ID from the output
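
If many rows show unknown objects, the IDs can be resolved in bulk by joining the mined rows to dba_objects. This is only a sketch: it assumes the temp_result1 table from section IV was created, and it joins the DATA_OBJ# column of the mined rows to OBJECT_ID, which may need adjusting (e.g. to DATA_OBJECT_ID) depending on the database version:

select o.owner, o.object_name, t.operation, count(*) as cnt
  from temp_result1 t
  join dba_objects o
    on o.object_id = t.data_obj#
 group by o.owner, o.object_name, t.operation
 order by cnt desc;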


VII. Example

Suppose you want to know which DML operations the database performed between 2021-11-15 16:00 and 2021-11-15 16:30.

1. Check whether archived logs exist for the target time window

set pagesize 900;

set linesize 900;

col name for a55;

ALTER SESSION SET NLS_DATE_FORMAT = 'YYYY-MM-DD HH24:MI:SS';

select name, first_time, next_time, sequence#
  from v$archived_log
 where first_time > to_date('2021-11-15 16:00:00', 'yyyy-mm-dd hh24:mi:ss')
   and first_time < to_date('2021-11-15 16:30:00', 'yyyy-mm-dd hh24:mi:ss');

The query lists the archived log files covering this window; those files are added in the next step.


2. Add the log files for mining

EXECUTE DBMS_LOGMNR.ADD_LOGFILE('/home/u01/arch/1_253871_1013686939.dbf',dbms_logmnr.new);

EXECUTE DBMS_LOGMNR.ADD_LOGFILE('/home/u01/arch/1_253872_1013686939.dbf',dbms_logmnr.addfile);

EXECUTE DBMS_LOGMNR.ADD_LOGFILE('/home/u01/arch/1_253873_1013686939.dbf',dbms_logmnr.addfile);

EXECUTE DBMS_LOGMNR.ADD_LOGFILE('/home/u01/arch/1_253874_1013686939.dbf',dbms_logmnr.addfile);

EXECUTE DBMS_LOGMNR.ADD_LOGFILE('/home/u01/arch/1_253875_1013686939.dbf',dbms_logmnr.addfile);

EXECUTE DBMS_LOGMNR.ADD_LOGFILE('/home/u01/arch/1_253876_1013686939.dbf',dbms_logmnr.addfile);

EXECUTE DBMS_LOGMNR.ADD_LOGFILE('/home/u01/arch/1_253877_1013686939.dbf',dbms_logmnr.addfile);

EXECUTE DBMS_LOGMNR.ADD_LOGFILE('/home/u01/arch/1_253878_1013686939.dbf',dbms_logmnr.addfile);

EXECUTE DBMS_LOGMNR.ADD_LOGFILE('/home/u01/arch/1_253879_1013686939.dbf',dbms_logmnr.addfile);

EXECUTE DBMS_LOGMNR.ADD_LOGFILE('/home/u01/arch/1_253880_1013686939.dbf',dbms_logmnr.addfile);

EXECUTE DBMS_LOGMNR.ADD_LOGFILE('/home/u01/arch/1_253881_1013686939.dbf',dbms_logmnr.addfile);

EXECUTE DBMS_LOGMNR.ADD_LOGFILE('/home/u01/arch/1_253882_1013686939.dbf',dbms_logmnr.addfile);

EXECUTE DBMS_LOGMNR.ADD_LOGFILE('/home/u01/arch/1_253883_1013686939.dbf',dbms_logmnr.addfile);

EXECUTE DBMS_LOGMNR.ADD_LOGFILE('/home/u01/arch/1_253884_1013686939.dbf',dbms_logmnr.addfile);

EXECUTE DBMS_LOGMNR.ADD_LOGFILE('/home/u01/arch/1_253885_1013686939.dbf',dbms_logmnr.addfile);

EXECUTE DBMS_LOGMNR.ADD_LOGFILE('/home/u01/arch/1_253886_1013686939.dbf',dbms_logmnr.addfile);

EXECUTE DBMS_LOGMNR.ADD_LOGFILE('/home/u01/arch/1_253887_1013686939.dbf',dbms_logmnr.addfile);

EXECUTE DBMS_LOGMNR.ADD_LOGFILE('/home/u01/arch/1_253888_1013686939.dbf',dbms_logmnr.addfile);

EXECUTE DBMS_LOGMNR.ADD_LOGFILE('/home/u01/arch/1_253889_1013686939.dbf',dbms_logmnr.addfile);

EXECUTE DBMS_LOGMNR.ADD_LOGFILE('/home/u01/arch/1_253890_1013686939.dbf',dbms_logmnr.addfile);

EXECUTE DBMS_LOGMNR.ADD_LOGFILE('/home/u01/arch/1_253891_1013686939.dbf',dbms_logmnr.addfile);

EXECUTE DBMS_LOGMNR.ADD_LOGFILE('/home/u01/arch/1_253892_1013686939.dbf',dbms_logmnr.addfile);

EXECUTE DBMS_LOGMNR.ADD_LOGFILE('/home/u01/arch/1_253893_1013686939.dbf',dbms_logmnr.addfile);

EXECUTE DBMS_LOGMNR.ADD_LOGFILE('/home/u01/arch/1_253894_1013686939.dbf',dbms_logmnr.addfile);

EXECUTE DBMS_LOGMNR.ADD_LOGFILE('/home/u01/arch/1_253895_1013686939.dbf',dbms_logmnr.addfile);

EXECUTE DBMS_LOGMNR.ADD_LOGFILE('/home/u01/arch/1_253896_1013686939.dbf',dbms_logmnr.addfile);

EXECUTE DBMS_LOGMNR.ADD_LOGFILE('/home/u01/arch/1_253897_1013686939.dbf',dbms_logmnr.addfile);

EXECUTE DBMS_LOGMNR.ADD_LOGFILE('/home/u01/arch/1_253898_1013686939.dbf',dbms_logmnr.addfile);

EXECUTE DBMS_LOGMNR.ADD_LOGFILE('/home/u01/arch/1_253899_1013686939.dbf',dbms_logmnr.addfile);

EXECUTE DBMS_LOGMNR.ADD_LOGFILE('/home/u01/arch/1_253900_1013686939.dbf',dbms_logmnr.addfile);

EXECUTE DBMS_LOGMNR.ADD_LOGFILE('/home/u01/arch/1_253901_1013686939.dbf',dbms_logmnr.addfile);

EXECUTE DBMS_LOGMNR.ADD_LOGFILE('/home/u01/arch/1_253902_1013686939.dbf',dbms_logmnr.addfile);

EXECUTE DBMS_LOGMNR.ADD_LOGFILE('/home/u01/arch/1_253903_1013686939.dbf',dbms_logmnr.addfile);

EXECUTE DBMS_LOGMNR.ADD_LOGFILE('/home/u01/arch/1_253904_1013686939.dbf',dbms_logmnr.addfile);

EXECUTE DBMS_LOGMNR.ADD_LOGFILE('/home/u01/arch/1_253905_1013686939.dbf',dbms_logmnr.addfile);

EXECUTE DBMS_LOGMNR.ADD_LOGFILE('/home/u01/arch/1_253906_1013686939.dbf',dbms_logmnr.addfile);


3. Start LogMiner

PROMPT start LogMiner;

execute dbms_logmnr.start_logmnr;


4. Output the results

spool test.txt       -- write the results to test.txt in the current directory

select sql_redo,info from v$logmnr_contents where UPPER(sql_redo) like '%UPDATE%' or UPPER(sql_redo) like '%INSERT%' or UPPER(sql_redo) like '%DELETE%';   -- extract DML statements

spool off;


5. End LogMiner

EXECUTE DBMS_LOGMNR.END_LOGMNR();


6. Analyze the results

Examine test.txt to see exactly which operations were performed.



