Oct 21 11:19:18 [7663] bl460g1n6 corosync notice  [MAIN  ] main.c:main:1171 Corosync Cluster Engine ('2.3.2.4-805b3'): started and ready to provide service.
Oct 21 11:19:18 [7663] bl460g1n6 corosync info    [MAIN  ] main.c:main:1172 Corosync built-in features: watchdog upstart snmp pie relro bindnow
Oct 21 11:19:18 [7663] bl460g1n6 corosync notice  [SERV  ] service.c:corosync_service_link_and_init:174 Service engine loaded: corosync configuration map access [0]
Oct 21 11:19:18 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_service_init:865 Initializing IPC on cmap [0]
Oct 21 11:19:18 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_get_ipc_type:811 No configured qb.ipc_type. Using native ipc
Oct 21 11:19:18 [7663] bl460g1n6 corosync notice  [SERV  ] service.c:corosync_service_link_and_init:174 Service engine loaded: corosync configuration service [1]
Oct 21 11:19:18 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_service_init:865 Initializing IPC on cfg [1]
Oct 21 11:19:18 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_get_ipc_type:811 No configured qb.ipc_type. Using native ipc
Oct 21 11:19:18 [7663] bl460g1n6 corosync notice  [SERV  ] service.c:corosync_service_link_and_init:174 Service engine loaded: corosync cluster closed process group service v1.01 [2]
Oct 21 11:19:18 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_service_init:865 Initializing IPC on cpg [2]
Oct 21 11:19:18 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_get_ipc_type:811 No configured qb.ipc_type. Using native ipc
Oct 21 11:19:18 [7663] bl460g1n6 corosync notice  [SERV  ] service.c:corosync_service_link_and_init:174 Service engine loaded: corosync profile loading service [4]
Oct 21 11:19:18 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_service_init:851 NOT Initializing IPC on pload [4]
Oct 21 11:19:18 [7663] bl460g1n6 corosync info    [WD    ] wd.c:setup_watchdog:651 Watchdog is now been tickled by corosync.
Oct 21 11:19:18 [7663] bl460g1n6 corosync debug   [WD    ] wd.c:setup_watchdog:652 HP iLO2+ HW Watchdog Timer
Oct 21 11:19:18 [7663] bl460g1n6 corosync info    [WD    ] wd.c:wd_scan_resources:580 no resources configured.
Oct 21 11:19:18 [7663] bl460g1n6 corosync notice  [SERV  ] service.c:corosync_service_link_and_init:174 Service engine loaded: corosync watchdog service [7]
Oct 21 11:19:18 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_service_init:851 NOT Initializing IPC on wd [7]
Oct 21 11:19:18 [7663] bl460g1n6 corosync notice  [QUORUM] vsf_quorum.c:quorum_exec_init_fn:274 Using quorum provider corosync_votequorum
Oct 21 11:19:18 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:votequorum_readconfig:967 Reading configuration (runtime: 0)
Oct 21 11:19:18 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:votequorum_read_nodelist_configuration:886 No nodelist defined or our node is not in the nodelist
Oct 21 11:19:18 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:recalculate_quorum:851 total_votes=1, expected_votes=3
Oct 21 11:19:18 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:calculate_quorum:670 node 3232261592 state=1, votes=1, expected=3
Oct 21 11:19:18 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:decode_flags:587 flags: quorate: No Leaving: No WFA Status: No First: Yes Qdevice: No QdeviceAlive: No QdeviceCastVote: No QdeviceMasterWins: No
Oct 21 11:19:18 [7663] bl460g1n6 corosync notice  [SERV  ] service.c:corosync_service_link_and_init:174 Service engine loaded: corosync vote quorum service v1.0 [5]
Oct 21 11:19:18 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_service_init:865 Initializing IPC on votequorum [5]
Oct 21 11:19:18 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_get_ipc_type:811 No configured qb.ipc_type. Using native ipc
Oct 21 11:19:18 [7663] bl460g1n6 corosync notice  [SERV  ] service.c:corosync_service_link_and_init:174 Service engine loaded: corosync cluster quorum service v0.1 [3]
Oct 21 11:19:18 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_service_init:865 Initializing IPC on quorum [3]
Oct 21 11:19:18 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_get_ipc_type:811 No configured qb.ipc_type. Using native ipc
Oct 21 11:19:19 [7663] bl460g1n6 corosync debug   [MAIN  ] main.c:member_object_joined:333 Member joined: r(0) ip(192.168.101.216) r(1) ip(192.168.102.216) 
Oct 21 11:19:19 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:message_handler_req_exec_votequorum_nodeinfo:1604 got nodeinfo message from cluster node 3232261592
Oct 21 11:19:19 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:message_handler_req_exec_votequorum_nodeinfo:1609 nodeinfo message[3232261592]: votes: 1, expected: 3 flags: 8
Oct 21 11:19:19 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:decode_flags:587 flags: quorate: No Leaving: No WFA Status: No First: Yes Qdevice: No QdeviceAlive: No QdeviceCastVote: No QdeviceMasterWins: No
Oct 21 11:19:19 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:recalculate_quorum:851 total_votes=1, expected_votes=3
Oct 21 11:19:19 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:calculate_quorum:670 node 3232261592 state=1, votes=1, expected=3
Oct 21 11:19:19 [7663] bl460g1n6 corosync debug   [SYNC  ] sync.c:sync_barrier_handler:232 Committing synchronization for corosync configuration map access
Oct 21 11:19:19 [7663] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_sync_activate:386 Single node sync -> no action
Oct 21 11:19:19 [7663] bl460g1n6 corosync debug   [CPG   ] cpg.c:downlist_log:776 comparing: sender r(0) ip(192.168.101.216) r(1) ip(192.168.102.216) ; members(old:0 left:0)
Oct 21 11:19:19 [7663] bl460g1n6 corosync debug   [CPG   ] cpg.c:downlist_log:776 chosen downlist: sender r(0) ip(192.168.101.216) r(1) ip(192.168.102.216) ; members(old:0 left:0)
Oct 21 11:19:19 [7663] bl460g1n6 corosync debug   [SYNC  ] sync.c:sync_barrier_handler:232 Committing synchronization for corosync cluster closed process group service v1.01
Oct 21 11:19:19 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:decode_flags:587 flags: quorate: No Leaving: No WFA Status: No First: Yes Qdevice: No QdeviceAlive: No QdeviceCastVote: No QdeviceMasterWins: No
Oct 21 11:19:19 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:message_handler_req_exec_votequorum_nodeinfo:1604 got nodeinfo message from cluster node 3232261592
Oct 21 11:19:19 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:message_handler_req_exec_votequorum_nodeinfo:1609 nodeinfo message[3232261592]: votes: 1, expected: 3 flags: 8
Oct 21 11:19:19 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:decode_flags:587 flags: quorate: No Leaving: No WFA Status: No First: Yes Qdevice: No QdeviceAlive: No QdeviceCastVote: No QdeviceMasterWins: No
Oct 21 11:19:19 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:recalculate_quorum:851 total_votes=1, expected_votes=3
Oct 21 11:19:19 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:calculate_quorum:670 node 3232261592 state=1, votes=1, expected=3
Oct 21 11:19:19 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:message_handler_req_exec_votequorum_nodeinfo:1604 got nodeinfo message from cluster node 3232261592
Oct 21 11:19:19 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:message_handler_req_exec_votequorum_nodeinfo:1609 nodeinfo message[0]: votes: 0, expected: 0 flags: 0
Oct 21 11:19:19 [7663] bl460g1n6 corosync debug   [SYNC  ] sync.c:sync_barrier_handler:232 Committing synchronization for corosync vote quorum service v1.0
Oct 21 11:19:19 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:recalculate_quorum:851 total_votes=1, expected_votes=3
Oct 21 11:19:19 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:calculate_quorum:670 node 3232261592 state=1, votes=1, expected=3
Oct 21 11:19:19 [7663] bl460g1n6 corosync notice  [QUORUM] vsf_quorum.c:log_view_list:132 Members[1]: -1062705704
Oct 21 11:19:19 [7663] bl460g1n6 corosync debug   [QUORUM] vsf_quorum.c:send_library_notification:359 sending quorum notification to (nil), length = 52
Oct 21 11:19:19 [7663] bl460g1n6 corosync notice  [MAIN  ] main.c:corosync_sync_completed:276 Completed service synchronization, ready to provide service.
Oct 21 11:19:19 [7663] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [7672]
Oct 21 11:19:19 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:19 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:19 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:19 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Oct 21 11:19:19 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (7666-7672-26)
Oct 21 11:19:19 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(7666-7672-26) state:2
Oct 21 11:19:19 [7663] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Oct 21 11:19:19 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Oct 21 11:19:19 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Oct 21 11:19:19 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cfg-response-7666-7672-26-header
Oct 21 11:19:19 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cfg-event-7666-7672-26-header
Oct 21 11:19:19 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cfg-request-7666-7672-26-header
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:     info: crm_log_init: 	Changed active directory to /var/lib/heartbeat/cores/root
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: main: 	Checking for old instances of pacemakerd
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:     info: crm_ipc_connect: 	Could not establish pacemakerd connection: Connection refused (111)
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [7676]
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7fb148dc33b0
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: get_cluster_type: 	Testing with Corosync
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [7676]
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7fb148ec9a10
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-request-7666-7676-27-header
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-response-7666-7676-27-header
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-event-7666-7676-27-header
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:     info: get_cluster_type: 	Detected an active 'corosync' cluster
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:     info: mcp_read_config: 	Reading configure for stack: corosync
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (7666-7676-27)
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(7666-7676-27) state:2
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7fb148ec9a10
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-7666-7676-27-header
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-7666-7676-27-header
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-7666-7676-27-header
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:   notice: mcp_read_config: 	Configured corosync to accept connections from group 189: OK (1)
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-request-7666-7676-26-header
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-response-7666-7676-26-header
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (7666-7676-26)
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-event-7666-7676-26-header
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(7666-7676-26) state:2
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7fb148dc33b0
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-7666-7676-26-header
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:   notice: crm_add_logfile: 	Additional logging available in /var/log/ha-debug
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-7666-7676-26-header
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:   notice: main: 	Starting Pacemaker 1.1.11-0.302.b6d42ed.git.el6 (Build: b6d42ed):  generated-manpages agent-manpages ascii-docs ncurses libqb-logging libqb-ipc lha-fencing nagios  corosync-native snmp
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:     info: main: 	Maximum core file size is: 18446744073709551615
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-7666-7676-26-header
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:     info: qb_ipcs_us_publish: 	server name: pacemakerd
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [7676]
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: cluster_connect_cfg: 	Our nodeid: -1062705704
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [7676]
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [CPG   ] cpg.c:cpg_lib_init_fn:1459 lib_init_fn: conn=0x7fb148fcac40, cpd=0x7fb148fcb394
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [MAIN  ] main.c:member_object_joined:333 Member joined: r(0) ip(192.168.101.217) r(1) ip(192.168.102.217) 
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [MAIN  ] main.c:member_object_joined:333 Member joined: r(0) ip(192.168.101.218) r(1) ip(192.168.102.218) 
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [SYNC  ] sync.c:sync_barrier_handler:232 Committing synchronization for corosync configuration map access
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: get_local_nodeid: 	Local nodeid is 3232261592
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_sync_activate:400 My config version is 0 -> no action
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [CPG   ] cpg.c:downlist_log:776 comparing: sender r(0) ip(192.168.101.217) r(1) ip(192.168.102.217) ; members(old:1 left:0)
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [CPG   ] cpg.c:downlist_log:776 comparing: sender r(0) ip(192.168.101.216) r(1) ip(192.168.102.216) ; members(old:1 left:0)
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [CPG   ] cpg.c:downlist_log:776 comparing: sender r(0) ip(192.168.101.218) r(1) ip(192.168.102.218) ; members(old:1 left:0)
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [CPG   ] cpg.c:downlist_log:776 chosen downlist: sender r(0) ip(192.168.101.216) r(1) ip(192.168.102.216) ; members(old:1 left:0)
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [SYNC  ] sync.c:sync_barrier_handler:232 Committing synchronization for corosync cluster closed process group service v1.01
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:decode_flags:587 flags: quorate: No Leaving: No WFA Status: No First: No Qdevice: No QdeviceAlive: No QdeviceCastVote: No QdeviceMasterWins: No
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:message_handler_req_exec_votequorum_nodeinfo:1604 got nodeinfo message from cluster node 3232261593
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:message_handler_req_exec_votequorum_nodeinfo:1609 nodeinfo message[3232261593]: votes: 1, expected: 3 flags: 0
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:decode_flags:587 flags: quorate: No Leaving: No WFA Status: No First: No Qdevice: No QdeviceAlive: No QdeviceCastVote: No QdeviceMasterWins: No
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:recalculate_quorum:851 total_votes=2, expected_votes=3
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:calculate_quorum:670 node 3232261592 state=1, votes=1, expected=3
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:calculate_quorum:670 node 3232261593 state=1, votes=1, expected=3
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:get_lowest_node_id:527 lowest node id: -1062705704 us: -1062705704
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:are_we_quorate:777 quorum regained, resuming activity
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:message_handler_req_exec_votequorum_nodeinfo:1604 got nodeinfo message from cluster node 3232261593
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:message_handler_req_exec_votequorum_nodeinfo:1609 nodeinfo message[0]: votes: 0, expected: 0 flags: 0
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:     info: crm_get_peer: 	Created entry 12edc4e5-0d77-4352-9cad-832b5ed0646f/0x27050b0 for node (null)/3232261592 (1 total)
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:message_handler_req_exec_votequorum_nodeinfo:1604 got nodeinfo message from cluster node 3232261594
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:message_handler_req_exec_votequorum_nodeinfo:1609 nodeinfo message[3232261594]: votes: 1, expected: 3 flags: 0
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:decode_flags:587 flags: quorate: No Leaving: No WFA Status: No First: No Qdevice: No QdeviceAlive: No QdeviceCastVote: No QdeviceMasterWins: No
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:recalculate_quorum:851 total_votes=3, expected_votes=3
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:     info: crm_get_peer: 	Node 3232261592 has uuid 3232261592
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:calculate_quorum:670 node 3232261592 state=1, votes=1, expected=3
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:calculate_quorum:670 node 3232261593 state=1, votes=1, expected=3
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:     info: crm_update_peer_proc: 	cluster_connect_cpg: Node (null)[3232261592] - corosync-cpg is now online
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:calculate_quorum:670 node 3232261594 state=1, votes=1, expected=3
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:get_lowest_node_id:527 lowest node id: -1062705704 us: -1062705704
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: cluster_connect_quorum: 	Configuring Pacemaker to obtain quorum from Corosync
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:message_handler_req_exec_votequorum_nodeinfo:1604 got nodeinfo message from cluster node 3232261594
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:message_handler_req_exec_votequorum_nodeinfo:1609 nodeinfo message[0]: votes: 0, expected: 0 flags: 0
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:message_handler_req_exec_votequorum_nodeinfo:1604 got nodeinfo message from cluster node 3232261592
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:message_handler_req_exec_votequorum_nodeinfo:1609 nodeinfo message[3232261592]: votes: 1, expected: 3 flags: 0
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:decode_flags:587 flags: quorate: No Leaving: No WFA Status: No First: No Qdevice: No QdeviceAlive: No QdeviceCastVote: No QdeviceMasterWins: No
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:recalculate_quorum:851 total_votes=3, expected_votes=3
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:calculate_quorum:670 node 3232261592 state=1, votes=1, expected=3
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:calculate_quorum:670 node 3232261593 state=1, votes=1, expected=3
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:calculate_quorum:670 node 3232261594 state=1, votes=1, expected=3
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:get_lowest_node_id:527 lowest node id: -1062705704 us: -1062705704
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:message_handler_req_exec_votequorum_nodeinfo:1604 got nodeinfo message from cluster node 3232261592
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:message_handler_req_exec_votequorum_nodeinfo:1609 nodeinfo message[0]: votes: 0, expected: 0 flags: 0
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [SYNC  ] sync.c:sync_barrier_handler:232 Committing synchronization for corosync vote quorum service v1.0
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:recalculate_quorum:851 total_votes=3, expected_votes=3
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:calculate_quorum:670 node 3232261592 state=1, votes=1, expected=3
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:calculate_quorum:670 node 3232261593 state=1, votes=1, expected=3
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:calculate_quorum:670 node 3232261594 state=1, votes=1, expected=3
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:get_lowest_node_id:527 lowest node id: -1062705704 us: -1062705704
Oct 21 11:19:21 [7663] bl460g1n6 corosync notice  [QUORUM] vsf_quorum.c:quorum_api_set_quorum:148 This node is within the primary component and will provide service.
Oct 21 11:19:21 [7663] bl460g1n6 corosync notice  [QUORUM] vsf_quorum.c:log_view_list:132 Members[3]: -1062705704 -1062705703 -1062705702
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QUORUM] vsf_quorum.c:send_library_notification:359 sending quorum notification to (nil), length = 60
Oct 21 11:19:21 [7663] bl460g1n6 corosync notice  [MAIN  ] main.c:corosync_sync_completed:276 Completed service synchronization, ready to provide service.
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [CPG   ] cpg.c:message_handler_req_exec_cpg_procjoin:1260 got procjoin message from cluster node -1062705704 (r(0) ip(192.168.101.216) r(1) ip(192.168.102.216) ) for pid 7676
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [7676]
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QUORUM] vsf_quorum.c:quorum_lib_init_fn:316 lib_init_fn: conn=0x7fb148fcd7d0
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QUORUM] vsf_quorum.c:message_handler_req_lib_quorum_gettype:471 got quorum_type request on 0x7fb148fcd7d0
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QUORUM] vsf_quorum.c:message_handler_req_lib_quorum_getquorate:395 got quorate request on 0x7fb148fcd7d0
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:   notice: cluster_connect_quorum: 	Quorum acquired
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QUORUM] vsf_quorum.c:message_handler_req_lib_quorum_trackstart:412 got trackstart request on 0x7fb148fcd7d0
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QUORUM] vsf_quorum.c:message_handler_req_lib_quorum_trackstart:420 sending initial status to 0x7fb148fcd7d0
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QUORUM] vsf_quorum.c:send_library_notification:359 sending quorum notification to 0x7fb148fcd7d0, length = 60
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [7676]
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7fb148fcdf20
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-request-7666-7676-29-header
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (7666-7676-29)
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-response-7666-7676-29-header
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(7666-7676-29) state:2
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-event-7666-7676-29-header
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7fb148fcdf20
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-7666-7676-29-header
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-7666-7676-29-header
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-7666-7676-29-header
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [7676]
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7fb148fcdf20
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-request-7666-7676-29-header
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (7666-7676-29)
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-response-7666-7676-29-header
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(7666-7676-29) state:2
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-event-7666-7676-29-header
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7fb148fcdf20
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:   notice: corosync_node_name: 	Unable to get node name for nodeid 3232261592
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-7666-7676-29-header
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:   notice: get_node_name: 	Defaulting to uname -n for the local corosync node name
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:     info: crm_get_peer: 	Node 3232261592 is now known as bl460g1n6
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-7666-7676-29-header
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-7666-7676-29-header
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:     info: start_child: 	Using uid=189 and group=189 for process cib
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:     info: start_child: 	Forked child 7680 for process cib
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: update_node_processes: 	Node bl460g1n6 now has process list: 00000000000000000000000000000100 (was 00000000000000000000000004000000)
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:     info: start_child: 	Forked child 7681 for process stonith-ng
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: update_node_processes: 	Node bl460g1n6 now has process list: 00000000000000000000000000100100 (was 00000000000000000000000000000100)
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:     info: start_child: 	Forked child 7682 for process lrmd
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: update_node_processes: 	Node bl460g1n6 now has process list: 00000000000000000000000000100110 (was 00000000000000000000000000100100)
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:     info: start_child: 	Using uid=189 and group=189 for process attrd
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:     info: start_child: 	Forked child 7683 for process attrd
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: update_node_processes: 	Node bl460g1n6 now has process list: 00000000000000000000000000101110 (was 00000000000000000000000000100110)
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:     info: start_child: 	Using uid=189 and group=189 for process pengine
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:     info: start_child: 	Forked child 7684 for process pengine
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: update_node_processes: 	Node bl460g1n6 now has process list: 00000000000000000000000000111110 (was 00000000000000000000000000101110)
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:     info: start_child: 	Using uid=189 and group=189 for process crmd
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:     info: start_child: 	Forked child 7685 for process crmd
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: update_node_processes: 	Node bl460g1n6 now has process list: 00000000000000000000000000111310 (was 00000000000000000000000000111110)
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:     info: main: 	Starting mainloop
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:     info: pcmk_quorum_notification: 	Membership 16: quorum retained (3)
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: pcmk_quorum_notification: 	Member[0] 3232261592 
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:   notice: crm_update_peer_state: 	pcmk_quorum_notification: Node bl460g1n6[3232261592] - state is now member (was (null))
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: pcmk_quorum_notification: 	Member[1] 3232261593 
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:     info: crm_get_peer: 	Created entry 617a50e0-e67d-4d8b-8840-23db550a42c3/0x2806dd0 for node (null)/3232261593 (2 total)
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:     info: crm_get_peer: 	Node 3232261593 has uuid 3232261593
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:     info: pcmk_quorum_notification: 	Obtaining name for new node 3232261593
Oct 21 11:19:21 [7680] bl460g1n6        cib:     info: crm_log_init: 	Changed active directory to /var/lib/heartbeat/cores/hacluster
Oct 21 11:19:21 [7680] bl460g1n6        cib:   notice: main: 	Using new config location: /var/lib/pacemaker/cib
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [7676]
Oct 21 11:19:21 [7680] bl460g1n6        cib:     info: get_cluster_type: 	Verifying cluster type: 'corosync'
Oct 21 11:19:21 [7680] bl460g1n6        cib:     info: get_cluster_type: 	Assuming an active 'corosync' cluster
Oct 21 11:19:21 [7680] bl460g1n6        cib:     info: retrieveCib: 	Reading cluster configuration from: /var/lib/pacemaker/cib/cib.xml (digest: /var/lib/pacemaker/cib/cib.xml.sig)
Oct 21 11:19:21 [7680] bl460g1n6        cib:  warning: retrieveCib: 	Cluster configuration not found: /var/lib/pacemaker/cib/cib.xml
Oct 21 11:19:21 [7680] bl460g1n6        cib:  warning: readCibXmlFile: 	Primary configuration corrupt or unusable, trying backups in /var/lib/pacemaker/cib
Oct 21 11:19:21 [7680] bl460g1n6        cib:  warning: readCibXmlFile: 	Continuing with an empty configuration.
Oct 21 11:19:21 [7680] bl460g1n6        cib:     info: validate_with_relaxng: 	Creating RNG parser context
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7683] bl460g1n6      attrd:     info: crm_log_init: 	Changed active directory to /var/lib/heartbeat/cores/hacluster
Oct 21 11:19:21 [7683] bl460g1n6      attrd:     info: main: 	Starting up
Oct 21 11:19:21 [7683] bl460g1n6      attrd:     info: get_cluster_type: 	Verifying cluster type: 'corosync'
Oct 21 11:19:21 [7683] bl460g1n6      attrd:     info: get_cluster_type: 	Assuming an active 'corosync' cluster
Oct 21 11:19:21 [7683] bl460g1n6      attrd:   notice: crm_cluster_connect: 	Connecting to cluster infrastructure: corosync
Oct 21 11:19:21 [7682] bl460g1n6       lrmd:     info: crm_log_init: 	Changed active directory to /var/lib/heartbeat/cores/root
Oct 21 11:19:21 [7682] bl460g1n6       lrmd:     info: qb_ipcs_us_publish: 	server name: lrmd
Oct 21 11:19:21 [7682] bl460g1n6       lrmd:     info: main: 	Starting
Oct 21 11:19:21 [7684] bl460g1n6    pengine:     info: crm_log_init: 	Changed active directory to /var/lib/heartbeat/cores/hacluster
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7684] bl460g1n6    pengine:    debug: main: 	Init server comms
Oct 21 11:19:21 [7684] bl460g1n6    pengine:     info: qb_ipcs_us_publish: 	server name: pengine
Oct 21 11:19:21 [7681] bl460g1n6 stonith-ng:     info: crm_log_init: 	Changed active directory to /var/lib/heartbeat/cores/root
Oct 21 11:19:21 [7684] bl460g1n6    pengine:     info: main: 	Starting pengine
Oct 21 11:19:21 [7681] bl460g1n6 stonith-ng:     info: get_cluster_type: 	Verifying cluster type: 'corosync'
Oct 21 11:19:21 [7681] bl460g1n6 stonith-ng:     info: get_cluster_type: 	Assuming an active 'corosync' cluster
Oct 21 11:19:21 [7681] bl460g1n6 stonith-ng:   notice: crm_cluster_connect: 	Connecting to cluster infrastructure: corosync
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7fb148fcdd90
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [7683]
Oct 21 11:19:21 [7685] bl460g1n6       crmd:     info: crm_log_init: 	Changed active directory to /var/lib/heartbeat/cores/hacluster
Oct 21 11:19:21 [7685] bl460g1n6       crmd:   notice: main: 	CRM Git Version: b6d42ed
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7685] bl460g1n6       crmd:    debug: crmd_init: 	Starting crmd
Oct 21 11:19:21 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_STARTUP: [ state=S_STARTING cause=C_STARTUP origin=crmd_init ]
Oct 21 11:19:21 [7685] bl460g1n6       crmd:     info: do_log: 	FSA: Input I_STARTUP from crmd_init() received in state S_STARTING
Oct 21 11:19:21 [7685] bl460g1n6       crmd:    debug: do_startup: 	Registering Signal Handlers
Oct 21 11:19:21 [7685] bl460g1n6       crmd:    debug: do_startup: 	Creating CIB and LRM objects
Oct 21 11:19:21 [7685] bl460g1n6       crmd:     info: get_cluster_type: 	Verifying cluster type: 'corosync'
Oct 21 11:19:21 [7685] bl460g1n6       crmd:     info: get_cluster_type: 	Assuming an active 'corosync' cluster
Oct 21 11:19:21 [7685] bl460g1n6       crmd:     info: crm_ipc_connect: 	Could not establish cib_shm connection: Connection refused (111)
Oct 21 11:19:21 [7685] bl460g1n6       crmd:    debug: cib_native_signon_raw: 	Connection unsuccessful (0 (nil))
Oct 21 11:19:21 [7685] bl460g1n6       crmd:    debug: cib_native_signon_raw: 	Connection to CIB failed: Transport endpoint is not connected
Oct 21 11:19:21 [7685] bl460g1n6       crmd:    debug: cib_native_signoff: 	Signing out of the CIB Service
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7680] bl460g1n6        cib:    debug: activateCibXml: 	Triggering CIB write for start op
Oct 21 11:19:21 [7680] bl460g1n6        cib:     info: startCib: 	CIB Initialization completed successfully
Oct 21 11:19:21 [7680] bl460g1n6        cib:   notice: crm_cluster_connect: 	Connecting to cluster infrastructure: corosync
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [CPG   ] cpg.c:cpg_lib_init_fn:1459 lib_init_fn: conn=0x7fb1490d2590, cpd=0x7fb1490cfc44
Oct 21 11:19:21 [7683] bl460g1n6      attrd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-request-7666-7676-29-header
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-response-7666-7676-29-header
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-event-7666-7676-29-header
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [CPG   ] cpg.c:message_handler_req_exec_cpg_procjoin:1260 got procjoin message from cluster node -1062705702 (r(0) ip(192.168.101.218) r(1) ip(192.168.102.218) ) for pid 1591
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:   notice: corosync_node_name: 	Unable to get node name for nodeid 3232261593
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:   notice: crm_update_peer_state: 	pcmk_quorum_notification: Node (null)[3232261593] - state is now member (was (null))
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: pcmk_quorum_notification: 	Member[2] 3232261594 
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:     info: crm_get_peer: 	Created entry 07ba0450-e6db-4bac-b6c8-19517f3be316/0x2806b40 for node (null)/3232261594 (3 total)
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:     info: crm_get_peer: 	Node 3232261594 has uuid 3232261594
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:     info: pcmk_quorum_notification: 	Obtaining name for new node 3232261594
Oct 21 11:19:21 [7683] bl460g1n6      attrd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [7681]
Oct 21 11:19:21 [7683] bl460g1n6      attrd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [CPG   ] cpg.c:cpg_lib_init_fn:1459 lib_init_fn: conn=0x7fb1490cfe70, cpd=0x7fb1490d06c4
Oct 21 11:19:21 [7681] bl460g1n6 stonith-ng:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (7666-7676-29)
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(7666-7676-29) state:2
Oct 21 11:19:21 [7681] bl460g1n6 stonith-ng:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Oct 21 11:19:21 [7681] bl460g1n6 stonith-ng:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7fb148fcdd90
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-7666-7676-29-header
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-7666-7676-29-header
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-7666-7676-29-header
Oct 21 11:19:21 [7683] bl460g1n6      attrd:    debug: get_local_nodeid: 	Local nodeid is 3232261592
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [7680]
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [CPG   ] cpg.c:cpg_lib_init_fn:1459 lib_init_fn: conn=0x7fb148fcdd90, cpd=0x7fb148fcea04
Oct 21 11:19:21 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7681] bl460g1n6 stonith-ng:    debug: get_local_nodeid: 	Local nodeid is 3232261592
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [7676]
Oct 21 11:19:21 [7683] bl460g1n6      attrd:     info: crm_get_peer: 	Created entry 8c9db8f7-1eda-496b-b58d-681d03932c25/0x14dd120 for node (null)/3232261592 (1 total)
Oct 21 11:19:21 [7683] bl460g1n6      attrd:     info: crm_get_peer: 	Node 3232261592 has uuid 3232261592
Oct 21 11:19:21 [7683] bl460g1n6      attrd:     info: crm_update_peer_proc: 	cluster_connect_cpg: Node (null)[3232261592] - corosync-cpg is now online
Oct 21 11:19:21 [7683] bl460g1n6      attrd:   notice: crm_update_peer_state: 	attrd_peer_change_cb: Node (null)[3232261592] - state is now member (was (null))
Oct 21 11:19:21 [7683] bl460g1n6      attrd:     info: init_cs_connection_once: 	Connection to 'corosync': established
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7fb1490ced10
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [CPG   ] cpg.c:message_handler_req_exec_cpg_procjoin:1260 got procjoin message from cluster node -1062705704 (r(0) ip(192.168.101.216) r(1) ip(192.168.102.216) ) for pid 7683
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7680] bl460g1n6        cib:    debug: get_local_nodeid: 	Local nodeid is 3232261592
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [7683]
Oct 21 11:19:21 [7681] bl460g1n6 stonith-ng:     info: crm_get_peer: 	Created entry fc0668eb-0b02-442e-babc-17e8d7a5f995/0x1dd8660 for node (null)/3232261592 (1 total)
Oct 21 11:19:21 [7681] bl460g1n6 stonith-ng:     info: crm_get_peer: 	Node 3232261592 has uuid 3232261592
Oct 21 11:19:21 [7681] bl460g1n6 stonith-ng:     info: crm_update_peer_proc: 	cluster_connect_cpg: Node (null)[3232261592] - corosync-cpg is now online
Oct 21 11:19:21 [7681] bl460g1n6 stonith-ng:     info: init_cs_connection_once: 	Connection to 'corosync': established
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7fb1490d55d0
Oct 21 11:19:21 [7683] bl460g1n6      attrd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7683] bl460g1n6      attrd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7683] bl460g1n6      attrd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-request-7666-7676-32-header
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-response-7666-7676-32-header
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-event-7666-7676-32-header
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:   notice: corosync_node_name: 	Unable to get node name for nodeid 3232261594
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:   notice: crm_update_peer_state: 	pcmk_quorum_notification: Node (null)[3232261594] - state is now member (was (null))
Oct 21 11:19:21 [7680] bl460g1n6        cib:     info: crm_get_peer: 	Created entry 7f72d0e1-a812-47e8-a2d4-a4ae433f79a8/0xb4bfd0 for node (null)/3232261592 (1 total)
Oct 21 11:19:21 [7680] bl460g1n6        cib:     info: crm_get_peer: 	Node 3232261592 has uuid 3232261592
Oct 21 11:19:21 [7680] bl460g1n6        cib:     info: crm_update_peer_proc: 	cluster_connect_cpg: Node (null)[3232261592] - corosync-cpg is now online
Oct 21 11:19:21 [7680] bl460g1n6        cib:     info: init_cs_connection_once: 	Connection to 'corosync': established
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (7666-7676-32)
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(7666-7676-32) state:2
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7fb1490ced10
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-7666-7676-32-header
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-7666-7676-32-header
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-7666-7676-32-header
Oct 21 11:19:21 [7683] bl460g1n6      attrd:    debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Oct 21 11:19:21 [7683] bl460g1n6      attrd:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-request-7666-7683-33-header
Oct 21 11:19:21 [7683] bl460g1n6      attrd:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-response-7666-7683-33-header
Oct 21 11:19:21 [7683] bl460g1n6      attrd:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-event-7666-7683-33-header
Oct 21 11:19:21 [7683] bl460g1n6      attrd:   notice: corosync_node_name: 	Unable to get node name for nodeid 3232261592
Oct 21 11:19:21 [7683] bl460g1n6      attrd:   notice: get_node_name: 	Defaulting to uname -n for the local corosync node name
Oct 21 11:19:21 [7683] bl460g1n6      attrd:     info: crm_get_peer: 	Node 3232261592 is now known as bl460g1n6
Oct 21 11:19:21 [7683] bl460g1n6      attrd:     info: main: 	Cluster connection active
Oct 21 11:19:21 [7683] bl460g1n6      attrd:     info: qb_ipcs_us_publish: 	server name: attrd
Oct 21 11:19:21 [7683] bl460g1n6      attrd:     info: main: 	Accepting attribute updates
Oct 21 11:19:21 [7683] bl460g1n6      attrd:    debug: attrd_cib_connect: 	CIB signon attempt 1
Oct 21 11:19:21 [7683] bl460g1n6      attrd:     info: crm_ipc_connect: 	Could not establish cib_rw connection: Connection refused (111)
Oct 21 11:19:21 [7683] bl460g1n6      attrd:    debug: cib_native_signon_raw: 	Connection unsuccessful (0 (nil))
Oct 21 11:19:21 [7683] bl460g1n6      attrd:    debug: cib_native_signon_raw: 	Connection to CIB failed: Transport endpoint is not connected
Oct 21 11:19:21 [7683] bl460g1n6      attrd:    debug: cib_native_signoff: 	Signing out of the CIB Service
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [7681]
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7fb1490ced10
Oct 21 11:19:21 [7681] bl460g1n6 stonith-ng:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7681] bl460g1n6 stonith-ng:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7681] bl460g1n6 stonith-ng:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (7666-7683-33)
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(7666-7683-33) state:2
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:     info: crm_get_peer: 	Node 3232261594 is now known as bl460g1n8
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7fb1490d55d0
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: update_node_processes: 	Node bl460g1n8 now has process list: 00000000000000000000000000000100 (was 00000000000000000000000000000000)
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-7666-7683-33-header
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: update_node_processes: 	Node bl460g1n8 now has process list: 00000000000000000000000000100100 (was 00000000000000000000000000000100)
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: update_node_processes: 	Node bl460g1n8 now has process list: 00000000000000000000000000100110 (was 00000000000000000000000000100100)
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: update_node_processes: 	Node bl460g1n8 now has process list: 00000000000000000000000000101110 (was 00000000000000000000000000100110)
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: update_node_processes: 	Node bl460g1n8 now has process list: 00000000000000000000000000111110 (was 00000000000000000000000000101110)
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-7666-7683-33-header
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-7666-7683-33-header
Oct 21 11:19:21 [7681] bl460g1n6 stonith-ng:    debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Oct 21 11:19:21 [7681] bl460g1n6 stonith-ng:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-request-7666-7681-32-header
Oct 21 11:19:21 [7681] bl460g1n6 stonith-ng:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-response-7666-7681-32-header
Oct 21 11:19:21 [7681] bl460g1n6 stonith-ng:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-event-7666-7681-32-header
Oct 21 11:19:21 [7676] bl460g1n6 pacemakerd:    debug: update_node_processes: 	Node bl460g1n8 now has process list: 00000000000000000000000000111310 (was 00000000000000000000000000111110)
Oct 21 11:19:21 [7681] bl460g1n6 stonith-ng:   notice: corosync_node_name: 	Unable to get node name for nodeid 3232261592
Oct 21 11:19:21 [7681] bl460g1n6 stonith-ng:   notice: get_node_name: 	Defaulting to uname -n for the local corosync node name
Oct 21 11:19:21 [7681] bl460g1n6 stonith-ng:     info: crm_get_peer: 	Node 3232261592 is now known as bl460g1n6
Oct 21 11:19:21 [7681] bl460g1n6 stonith-ng:     info: crm_ipc_connect: 	Could not establish cib_rw connection: Connection refused (111)
Oct 21 11:19:21 [7681] bl460g1n6 stonith-ng:    debug: cib_native_signon_raw: 	Connection unsuccessful (0 (nil))
Oct 21 11:19:21 [7681] bl460g1n6 stonith-ng:    debug: cib_native_signon_raw: 	Connection to CIB failed: Transport endpoint is not connected
Oct 21 11:19:21 [7681] bl460g1n6 stonith-ng:    debug: cib_native_signoff: 	Signing out of the CIB Service
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [CPG   ] cpg.c:message_handler_req_exec_cpg_procjoin:1260 got procjoin message from cluster node -1062705704 (r(0) ip(192.168.101.216) r(1) ip(192.168.102.216) ) for pid 7681
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [CPG   ] cpg.c:message_handler_req_exec_cpg_procjoin:1260 got procjoin message from cluster node -1062705704 (r(0) ip(192.168.101.216) r(1) ip(192.168.102.216) ) for pid 7680
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [7680]
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7fb1490d9320
Oct 21 11:19:21 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (7666-7681-32)
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(7666-7681-32) state:2
Oct 21 11:19:21 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7fb1490ced10
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-7666-7681-32-header
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-7666-7681-32-header
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-7666-7681-32-header
Oct 21 11:19:21 [7680] bl460g1n6        cib:    debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Oct 21 11:19:21 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-request-7666-7680-33-header
Oct 21 11:19:21 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-response-7666-7680-33-header
Oct 21 11:19:21 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-event-7666-7680-33-header
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [CPG   ] cpg.c:message_handler_req_exec_cpg_procjoin:1260 got procjoin message from cluster node -1062705702 (r(0) ip(192.168.101.218) r(1) ip(192.168.102.218) ) for pid 1598
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [CPG   ] cpg.c:message_handler_req_exec_cpg_procjoin:1260 got procjoin message from cluster node -1062705702 (r(0) ip(192.168.101.218) r(1) ip(192.168.102.218) ) for pid 1596
Oct 21 11:19:21 [7680] bl460g1n6        cib:   notice: corosync_node_name: 	Unable to get node name for nodeid 3232261592
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (7666-7680-33)
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(7666-7680-33) state:2
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Oct 21 11:19:21 [7680] bl460g1n6        cib:   notice: get_node_name: 	Defaulting to uname -n for the local corosync node name
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7fb1490d9320
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-7666-7680-33-header
Oct 21 11:19:21 [7680] bl460g1n6        cib:     info: crm_get_peer: 	Node 3232261592 is now known as bl460g1n6
Oct 21 11:19:21 [7680] bl460g1n6        cib:     info: qb_ipcs_us_publish: 	server name: cib_ro
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-7666-7680-33-header
Oct 21 11:19:21 [7680] bl460g1n6        cib:     info: qb_ipcs_us_publish: 	server name: cib_rw
Oct 21 11:19:21 [7680] bl460g1n6        cib:     info: qb_ipcs_us_publish: 	server name: cib_shm
Oct 21 11:19:21 [7680] bl460g1n6        cib:     info: cib_init: 	Starting cib mainloop
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-7666-7680-33-header
Oct 21 11:19:21 [7680] bl460g1n6        cib:     info: pcmk_cpg_membership: 	Joined[0.0] cib.3232261592 
Oct 21 11:19:21 [7680] bl460g1n6        cib:     info: pcmk_cpg_membership: 	Member[0.0] cib.3232261592 
Oct 21 11:19:21 [7680] bl460g1n6        cib:    debug: get_last_sequence: 	Series file /var/lib/pacemaker/cib/cib.last does not exist
Oct 21 11:19:21 [7680] bl460g1n6        cib:    debug: write_cib_contents: 	Writing CIB to disk
Oct 21 11:19:21 [7663] bl460g1n6 corosync debug   [CPG   ] cpg.c:message_handler_req_exec_cpg_procjoin:1260 got procjoin message from cluster node -1062705702 (r(0) ip(192.168.101.218) r(1) ip(192.168.102.218) ) for pid 1595
Oct 21 11:19:21 [7680] bl460g1n6        cib:     info: pcmk_cpg_membership: 	Joined[1.0] cib.3232261594 
Oct 21 11:19:21 [7680] bl460g1n6        cib:     info: pcmk_cpg_membership: 	Member[1.0] cib.3232261592 
Oct 21 11:19:21 [7680] bl460g1n6        cib:     info: crm_get_peer: 	Created entry 46185373-3066-4128-887a-8c25d4103e1e/0xb4e8e0 for node (null)/3232261594 (2 total)
Oct 21 11:19:21 [7680] bl460g1n6        cib:     info: crm_get_peer: 	Node 3232261594 has uuid 3232261594
Oct 21 11:19:21 [7680] bl460g1n6        cib:     info: pcmk_cpg_membership: 	Member[1.1] cib.3232261594 
Oct 21 11:19:21 [7680] bl460g1n6        cib:     info: crm_update_peer_proc: 	pcmk_cpg_membership: Node (null)[3232261594] - corosync-cpg is now online
Oct 21 11:19:21 [7680] bl460g1n6        cib:     info: write_cib_contents: 	Wrote version 0.0.0 of the CIB to disk (digest: 38e2a365180d27f2831950bf8df46420)
Oct 21 11:19:21 [7680] bl460g1n6        cib:    debug: write_cib_contents: 	Wrote digest 38e2a365180d27f2831950bf8df46420 to disk
Oct 21 11:19:21 [7680] bl460g1n6        cib:     info: retrieveCib: 	Reading cluster configuration from: /var/lib/pacemaker/cib/cib.TfyIjn (digest: /var/lib/pacemaker/cib/cib.KYJvj6)
Oct 21 11:19:21 [7680] bl460g1n6        cib:    debug: write_cib_contents: 	Activating /var/lib/pacemaker/cib/cib.TfyIjn
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [CPG   ] cpg.c:message_handler_req_exec_cpg_procjoin:1260 got procjoin message from cluster node -1062705703 (r(0) ip(192.168.101.217) r(1) ip(192.168.102.217) ) for pid 12477
Oct 21 11:19:22 [7680] bl460g1n6        cib:     info: crm_client_new: 	Connecting 0xb4ef70 for uid=189 gid=189 pid=7685 id=11403f73-2fb1-433c-b6cb-5b581b7e190d
Oct 21 11:19:22 [7680] bl460g1n6        cib:    debug: handle_new_connection: 	IPC credentials authenticated (7680-7685-10)
Oct 21 11:19:22 [7680] bl460g1n6        cib:    debug: qb_ipcs_shm_connect: 	connecting to client [7685]
Oct 21 11:19:22 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:19:22 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:19:22 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: cib_native_signon_raw: 	Connection to CIB successful
Oct 21 11:19:22 [7680] bl460g1n6        cib:    debug: cib_common_callback_worker: 	Setting cib_refresh_notify callbacks for crmd (11403f73-2fb1-433c-b6cb-5b581b7e190d): on
Oct 21 11:19:22 [7680] bl460g1n6        cib:    debug: cib_common_callback_worker: 	Setting cib_diff_notify callbacks for crmd (11403f73-2fb1-433c-b6cb-5b581b7e190d): on
Oct 21 11:19:22 [7685] bl460g1n6       crmd:     info: do_cib_control: 	CIB connection established
Oct 21 11:19:22 [7685] bl460g1n6       crmd:   notice: crm_cluster_connect: 	Connecting to cluster infrastructure: corosync
Oct 21 11:19:22 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crmd/2, version=0.0.0)
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [7685]
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [CPG   ] cpg.c:cpg_lib_init_fn:1459 lib_init_fn: conn=0x7fb1490d5870, cpd=0x7fb1490d9af4
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: get_local_nodeid: 	Local nodeid is 3232261592
Oct 21 11:19:22 [7685] bl460g1n6       crmd:     info: crm_get_peer: 	Created entry 6e66452b-3f17-4424-abab-2a1a7c2eb7b0/0x1da9ec0 for node (null)/3232261592 (1 total)
Oct 21 11:19:22 [7685] bl460g1n6       crmd:     info: crm_get_peer: 	Node 3232261592 has uuid 3232261592
Oct 21 11:19:22 [7685] bl460g1n6       crmd:     info: crm_update_peer_proc: 	cluster_connect_cpg: Node (null)[3232261592] - corosync-cpg is now online
Oct 21 11:19:22 [7685] bl460g1n6       crmd:     info: init_cs_connection_once: 	Connection to 'corosync': established
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [7685]
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7fb1490d61e0
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-request-7666-7685-33-header
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (7666-7685-33)
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-response-7666-7685-33-header
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(7666-7685-33) state:2
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-event-7666-7685-33-header
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Oct 21 11:19:22 [7685] bl460g1n6       crmd:   notice: corosync_node_name: 	Unable to get node name for nodeid 3232261592
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7fb1490d61e0
Oct 21 11:19:22 [7685] bl460g1n6       crmd:   notice: get_node_name: 	Defaulting to uname -n for the local corosync node name
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-7666-7685-33-header
Oct 21 11:19:22 [7685] bl460g1n6       crmd:     info: crm_get_peer: 	Node 3232261592 is now known as bl460g1n6
Oct 21 11:19:22 [7685] bl460g1n6       crmd:     info: peer_update_callback: 	bl460g1n6 is now (null)
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: cluster_connect_quorum: 	Configuring Pacemaker to obtain quorum from Corosync
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-7666-7685-33-header
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-7666-7685-33-header
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [7685]
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QUORUM] vsf_quorum.c:quorum_lib_init_fn:316 lib_init_fn: conn=0x7fb1490d61e0
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QUORUM] vsf_quorum.c:message_handler_req_lib_quorum_gettype:471 got quorum_type request on 0x7fb1490d61e0
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QUORUM] vsf_quorum.c:message_handler_req_lib_quorum_getquorate:395 got quorate request on 0x7fb1490d61e0
Oct 21 11:19:22 [7685] bl460g1n6       crmd:   notice: cluster_connect_quorum: 	Quorum acquired
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QUORUM] vsf_quorum.c:message_handler_req_lib_quorum_trackstart:412 got trackstart request on 0x7fb1490d61e0
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QUORUM] vsf_quorum.c:message_handler_req_lib_quorum_trackstart:420 sending initial status to 0x7fb1490d61e0
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QUORUM] vsf_quorum.c:send_library_notification:359 sending quorum notification to 0x7fb1490d61e0, length = 60
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [7685]
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7fb1490da4d0
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-request-7666-7685-34-header
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-response-7666-7685-34-header
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (7666-7685-34)
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-event-7666-7685-34-header
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(7666-7685-34) state:2
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7fb1490da4d0
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-7666-7685-34-header
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-7666-7685-34-header
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-7666-7685-34-header
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [7685]
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7fb1490da4d0
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-request-7666-7685-34-header
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-response-7666-7685-34-header
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (7666-7685-34)
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-event-7666-7685-34-header
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(7666-7685-34) state:2
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7fb1490da4d0
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-7666-7685-34-header
Oct 21 11:19:22 [7685] bl460g1n6       crmd:     info: do_ha_control: 	Connected to the cluster
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: do_lrm_control: 	Connecting to the LRM
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-7666-7685-34-header
Oct 21 11:19:22 [7685] bl460g1n6       crmd:     info: lrmd_ipc_connect: 	Connecting to lrmd
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-7666-7685-34-header
Oct 21 11:19:22 [7682] bl460g1n6       lrmd:     info: crm_client_new: 	Connecting 0x1d8fd10 for uid=189 gid=189 pid=7685 id=666c66a8-da92-4298-beea-12fd671d2b0d
Oct 21 11:19:22 [7682] bl460g1n6       lrmd:    debug: handle_new_connection: 	IPC credentials authenticated (7682-7685-6)
Oct 21 11:19:22 [7682] bl460g1n6       lrmd:    debug: qb_ipcs_shm_connect: 	connecting to client [7685]
Oct 21 11:19:22 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section nodes: OK (rc=0, origin=local/crmd/3, version=0.0.0)
Oct 21 11:19:22 [7682] bl460g1n6       lrmd:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:19:22 [7682] bl460g1n6       lrmd:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:19:22 [7682] bl460g1n6       lrmd:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:19:22 [7682] bl460g1n6       lrmd:    debug: process_lrmd_message: 	Processed register operation from 666c66a8-da92-4298-beea-12fd671d2b0d: rc=0, reply=0, notify=0, exit=4201864
Oct 21 11:19:22 [7685] bl460g1n6       crmd:     info: do_lrm_control: 	LRM connection established
Oct 21 11:19:22 [7685] bl460g1n6       crmd:     info: do_started: 	Delaying start, no membership data (0000000000100000)
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: register_fsa_input_adv: 	Stalling the FSA pending further input: source=do_started cause=C_FSA_INTERNAL data=(nil) queue=0
Oct 21 11:19:22 [7683] bl460g1n6      attrd:    debug: attrd_cib_connect: 	CIB signon attempt 2
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Exiting the FSA: queue=0, fsa_actions=0x2, stalled=true
Oct 21 11:19:22 [7685] bl460g1n6       crmd:     info: pcmk_quorum_notification: 	Membership 16: quorum retained (3)
Oct 21 11:19:22 [7680] bl460g1n6        cib:     info: crm_client_new: 	Connecting 0x99dbe0 for uid=189 gid=189 pid=7683 id=c9763361-9f68-4c1a-a43e-90d67258e704
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: pcmk_quorum_notification: 	Member[0] 3232261592 
Oct 21 11:19:22 [7680] bl460g1n6        cib:    debug: handle_new_connection: 	IPC credentials authenticated (7680-7683-11)
Oct 21 11:19:22 [7680] bl460g1n6        cib:    debug: qb_ipcs_shm_connect: 	connecting to client [7683]
Oct 21 11:19:22 [7685] bl460g1n6       crmd:   notice: crm_update_peer_state: 	pcmk_quorum_notification: Node bl460g1n6[3232261592] - state is now member (was (null))
Oct 21 11:19:22 [7685] bl460g1n6       crmd:     info: peer_update_callback: 	bl460g1n6 is now member (was (null))
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: pcmk_quorum_notification: 	Member[1] 3232261593 
Oct 21 11:19:22 [7685] bl460g1n6       crmd:     info: crm_get_peer: 	Created entry da1f328d-b11b-40fc-bd69-75a8f1488e52/0x1eefaa0 for node (null)/3232261593 (2 total)
Oct 21 11:19:22 [7685] bl460g1n6       crmd:     info: crm_get_peer: 	Node 3232261593 has uuid 3232261593
Oct 21 11:19:22 [7685] bl460g1n6       crmd:     info: pcmk_quorum_notification: 	Obtaining name for new node 3232261593
Oct 21 11:19:22 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:19:22 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [7685]
Oct 21 11:19:22 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:19:22 [7683] bl460g1n6      attrd:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:19:22 [7683] bl460g1n6      attrd:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:19:22 [7683] bl460g1n6      attrd:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:19:22 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section crm_config: OK (rc=0, origin=local/crmd/4, version=0.0.0)
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7683] bl460g1n6      attrd:    debug: cib_native_signon_raw: 	Connection to CIB successful
Oct 21 11:19:22 [7683] bl460g1n6      attrd:     info: attrd_cib_connect: 	Connected to the CIB after 2 attempts
Oct 21 11:19:22 [7680] bl460g1n6        cib:    debug: cib_common_callback_worker: 	Setting cib_refresh_notify callbacks for attrd (c9763361-9f68-4c1a-a43e-90d67258e704): on
Oct 21 11:19:22 [7683] bl460g1n6      attrd:     info: main: 	CIB connection active
Oct 21 11:19:22 [7683] bl460g1n6      attrd:     info: pcmk_cpg_membership: 	Joined[0.0] attrd.3232261592 
Oct 21 11:19:22 [7683] bl460g1n6      attrd:     info: pcmk_cpg_membership: 	Member[0.0] attrd.3232261592 
Oct 21 11:19:22 [7683] bl460g1n6      attrd:     info: pcmk_cpg_membership: 	Joined[1.0] attrd.3232261594 
Oct 21 11:19:22 [7683] bl460g1n6      attrd:     info: pcmk_cpg_membership: 	Member[1.0] attrd.3232261592 
Oct 21 11:19:22 [7683] bl460g1n6      attrd:     info: crm_get_peer: 	Created entry 1fa5dc9b-db68-48a8-bab2-700e86b5cfd9/0x14e3070 for node (null)/3232261594 (2 total)
Oct 21 11:19:22 [7683] bl460g1n6      attrd:     info: crm_get_peer: 	Node 3232261594 has uuid 3232261594
Oct 21 11:19:22 [7683] bl460g1n6      attrd:     info: pcmk_cpg_membership: 	Member[1.1] attrd.3232261594 
Oct 21 11:19:22 [7683] bl460g1n6      attrd:     info: crm_update_peer_proc: 	pcmk_cpg_membership: Node (null)[3232261594] - corosync-cpg is now online
Oct 21 11:19:22 [7683] bl460g1n6      attrd:   notice: crm_update_peer_state: 	attrd_peer_change_cb: Node (null)[3232261594] - state is now member (was (null))
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7fb1490da4d0
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-request-7666-7685-34-header
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (7666-7685-34)
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-response-7666-7685-34-header
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-event-7666-7685-34-header
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(7666-7685-34) state:2
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7fb1490da4d0
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-7666-7685-34-header
Oct 21 11:19:22 [7685] bl460g1n6       crmd:   notice: corosync_node_name: 	Unable to get node name for nodeid 3232261593
Oct 21 11:19:22 [7685] bl460g1n6       crmd:   notice: crm_update_peer_state: 	pcmk_quorum_notification: Node (null)[3232261593] - state is now member (was (null))
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: pcmk_quorum_notification: 	Member[2] 3232261594 
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-7666-7685-34-header
Oct 21 11:19:22 [7685] bl460g1n6       crmd:     info: crm_get_peer: 	Created entry 1ca3e7b6-c791-45f6-94d4-788142ea7703/0x1eef990 for node (null)/3232261594 (3 total)
Oct 21 11:19:22 [7685] bl460g1n6       crmd:     info: crm_get_peer: 	Node 3232261594 has uuid 3232261594
Oct 21 11:19:22 [7685] bl460g1n6       crmd:     info: pcmk_quorum_notification: 	Obtaining name for new node 3232261594
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-7666-7685-34-header
Oct 21 11:19:22 [7680] bl460g1n6        cib:     info: crm_client_new: 	Connecting 0xb70000 for uid=0 gid=0 pid=7681 id=4adfbc39-bfd1-48d5-8128-03ba3eff38a3
Oct 21 11:19:22 [7680] bl460g1n6        cib:    debug: handle_new_connection: 	IPC credentials authenticated (7680-7681-12)
Oct 21 11:19:22 [7680] bl460g1n6        cib:    debug: qb_ipcs_shm_connect: 	connecting to client [7681]
Oct 21 11:19:22 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [7685]
Oct 21 11:19:22 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:19:22 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:19:22 [7681] bl460g1n6 stonith-ng:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:19:22 [7681] bl460g1n6 stonith-ng:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:19:22 [7681] bl460g1n6 stonith-ng:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7681] bl460g1n6 stonith-ng:    debug: cib_native_signon_raw: 	Connection to CIB successful
Oct 21 11:19:22 [7680] bl460g1n6        cib:    debug: cib_common_callback_worker: 	Setting cib_diff_notify callbacks for crmd (4adfbc39-bfd1-48d5-8128-03ba3eff38a3): on
Oct 21 11:19:22 [7681] bl460g1n6 stonith-ng:   notice: setup_cib: 	Watching for stonith topology changes
Oct 21 11:19:22 [7681] bl460g1n6 stonith-ng:     info: qb_ipcs_us_publish: 	server name: stonith-ng
Oct 21 11:19:22 [7681] bl460g1n6 stonith-ng:     info: main: 	Starting stonith-ng mainloop
Oct 21 11:19:22 [7681] bl460g1n6 stonith-ng:     info: pcmk_cpg_membership: 	Joined[0.0] stonith-ng.3232261592 
Oct 21 11:19:22 [7681] bl460g1n6 stonith-ng:     info: pcmk_cpg_membership: 	Member[0.0] stonith-ng.3232261592 
Oct 21 11:19:22 [7681] bl460g1n6 stonith-ng:     info: pcmk_cpg_membership: 	Joined[1.0] stonith-ng.3232261594 
Oct 21 11:19:22 [7681] bl460g1n6 stonith-ng:     info: pcmk_cpg_membership: 	Member[1.0] stonith-ng.3232261592 
Oct 21 11:19:22 [7681] bl460g1n6 stonith-ng:     info: crm_get_peer: 	Created entry 8483663e-93ed-4df0-b2a6-07e7aacb90ba/0x1dda8e0 for node (null)/3232261594 (2 total)
Oct 21 11:19:22 [7681] bl460g1n6 stonith-ng:     info: crm_get_peer: 	Node 3232261594 has uuid 3232261594
Oct 21 11:19:22 [7681] bl460g1n6 stonith-ng:     info: pcmk_cpg_membership: 	Member[1.1] stonith-ng.3232261594 
Oct 21 11:19:22 [7681] bl460g1n6 stonith-ng:     info: crm_update_peer_proc: 	pcmk_cpg_membership: Node (null)[3232261594] - corosync-cpg is now online
Oct 21 11:19:22 [7681] bl460g1n6 stonith-ng:    debug: st_peer_update_callback: 	Broadcasting our uname because of node 3232261594
Oct 21 11:19:22 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crmd/2, version=0.0.0)
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7fb1490da4d0
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [7681]
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7fb1497e12e0
Oct 21 11:19:22 [7681] bl460g1n6 stonith-ng:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-request-7666-7685-34-header
Oct 21 11:19:22 [7681] bl460g1n6 stonith-ng:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-response-7666-7685-34-header
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-event-7666-7685-34-header
Oct 21 11:19:22 [7681] bl460g1n6 stonith-ng:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7685] bl460g1n6       crmd:   notice: corosync_node_name: 	Unable to get node name for nodeid 3232261594
Oct 21 11:19:22 [7685] bl460g1n6       crmd:   notice: crm_update_peer_state: 	pcmk_quorum_notification: Node (null)[3232261594] - state is now member (was (null))
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: post_cache_update: 	Updated cache after membership event 16.
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: post_cache_update: 	post_cache_update added action A_ELECTION_CHECK to the FSA
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (7666-7685-34)
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(7666-7685-34) state:2
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7fb1490da4d0
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-7666-7685-34-header
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-7666-7685-34-header
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-7666-7685-34-header
Oct 21 11:19:22 [7681] bl460g1n6 stonith-ng:    debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Oct 21 11:19:22 [7681] bl460g1n6 stonith-ng:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-request-7666-7681-35-header
Oct 21 11:19:22 [7681] bl460g1n6 stonith-ng:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-response-7666-7681-35-header
Oct 21 11:19:22 [7681] bl460g1n6 stonith-ng:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-event-7666-7681-35-header
Oct 21 11:19:22 [7681] bl460g1n6 stonith-ng:   notice: corosync_node_name: 	Unable to get node name for nodeid 3232261592
Oct 21 11:19:22 [7681] bl460g1n6 stonith-ng:   notice: get_node_name: 	Defaulting to uname -n for the local corosync node name
Oct 21 11:19:22 [7681] bl460g1n6 stonith-ng:     info: init_cib_cache_cb: 	Updating device list from the cib: init
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [7685]
Oct 21 11:19:22 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	STONITH timeout: 60000
Oct 21 11:19:22 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	STONITH of failed nodes is enabled
Oct 21 11:19:22 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	Stop all active resources: false
Oct 21 11:19:22 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Oct 21 11:19:22 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	Default stickiness: 0
Oct 21 11:19:22 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	On loss of CCM Quorum: Stop ALL resources
Oct 21 11:19:22 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Oct 21 11:19:22 [7681] bl460g1n6 stonith-ng:     info: unpack_nodes: 	Creating a fake local node
Oct 21 11:19:22 [7681] bl460g1n6 stonith-ng:    debug: unpack_domains: 	Unpacking domains
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7fb1490dabd0
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (7666-7681-35)
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(7666-7681-35) state:2
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7fb1497e12e0
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-7666-7681-35-header
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-7666-7681-35-header
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-7666-7681-35-header
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-request-7666-7685-34-header
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (7666-7685-34)
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-response-7666-7685-34-header
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(7666-7685-34) state:2
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-event-7666-7685-34-header
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7fb1490dabd0
Oct 21 11:19:22 [7685] bl460g1n6       crmd:   notice: corosync_node_name: 	Unable to get node name for nodeid 3232261592
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-7666-7685-34-header
Oct 21 11:19:22 [7685] bl460g1n6       crmd:   notice: get_node_name: 	Defaulting to uname -n for the local corosync node name
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-7666-7685-34-header
Oct 21 11:19:22 [7685] bl460g1n6       crmd:     info: do_started: 	Delaying start, Config not read (0000000000000040)
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: register_fsa_input_adv: 	Stalling the FSA pending further input: source=do_started cause=C_FSA_INTERNAL data=(nil) queue=0
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Exiting the FSA: queue=0, fsa_actions=0x200000002, stalled=true
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-7666-7685-34-header
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: config_query_callback: 	Call 4 : Parsing CIB options
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: config_query_callback: 	Shutdown escalation occurs after: 1200000ms
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: config_query_callback: 	Checking for expired actions every 900000ms
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: do_started: 	Init server comms
Oct 21 11:19:22 [7685] bl460g1n6       crmd:     info: qb_ipcs_us_publish: 	server name: crmd
Oct 21 11:19:22 [7685] bl460g1n6       crmd:   notice: do_started: 	The local CRM is operational
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: do_election_check: 	Ignore election check: we not in an election
Oct 21 11:19:22 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_PENDING: [ state=S_STARTING cause=C_FSA_INTERNAL origin=do_started ]
Oct 21 11:19:22 [7685] bl460g1n6       crmd:     info: do_log: 	FSA: Input I_PENDING from do_started() received in state S_STARTING
Oct 21 11:19:22 [7685] bl460g1n6       crmd:   notice: do_state_transition: 	State transition S_STARTING -> S_PENDING [ input=I_PENDING cause=C_FSA_INTERNAL origin=do_started ]
Oct 21 11:19:22 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_slave operation for section 'all': OK (rc=0, origin=local/crmd/5, version=0.0.0)
Oct 21 11:19:22 [7676] bl460g1n6 pacemakerd:     info: crm_get_peer: 	Node 3232261593 is now known as bl460g1n7
Oct 21 11:19:22 [7676] bl460g1n6 pacemakerd:     info: crm_get_peer: 	Node 3232261593 is now known as bl460g1n7
Oct 21 11:19:22 [7676] bl460g1n6 pacemakerd:    debug: update_node_processes: 	Node bl460g1n7 now has process list: 00000000000000000000000000000100 (was 00000000000000000000000000000000)
Oct 21 11:19:22 [7676] bl460g1n6 pacemakerd:    debug: update_node_processes: 	Node bl460g1n7 now has process list: 00000000000000000000000000000100 (was 00000000000000000000000000000000)
Oct 21 11:19:22 [7676] bl460g1n6 pacemakerd:    debug: update_node_processes: 	Node bl460g1n7 now has process list: 00000000000000000000000000100100 (was 00000000000000000000000000000100)
Oct 21 11:19:22 [7676] bl460g1n6 pacemakerd:    debug: update_node_processes: 	Node bl460g1n7 now has process list: 00000000000000000000000000100100 (was 00000000000000000000000000000100)
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [CPG   ] cpg.c:message_handler_req_exec_cpg_procjoin:1260 got procjoin message from cluster node -1062705703 (r(0) ip(192.168.101.217) r(1) ip(192.168.102.217) ) for pid 12483
Oct 21 11:19:22 [7676] bl460g1n6 pacemakerd:    debug: update_node_processes: 	Node bl460g1n7 now has process list: 00000000000000000000000000100110 (was 00000000000000000000000000100100)
Oct 21 11:19:22 [7676] bl460g1n6 pacemakerd:    debug: update_node_processes: 	Node bl460g1n7 now has process list: 00000000000000000000000000100110 (was 00000000000000000000000000100100)
Oct 21 11:19:22 [7681] bl460g1n6 stonith-ng:     info: pcmk_cpg_membership: 	Joined[2.0] stonith-ng.3232261593 
Oct 21 11:19:22 [7676] bl460g1n6 pacemakerd:    debug: update_node_processes: 	Node bl460g1n7 now has process list: 00000000000000000000000000101110 (was 00000000000000000000000000100110)
Oct 21 11:19:22 [7676] bl460g1n6 pacemakerd:    debug: update_node_processes: 	Node bl460g1n7 now has process list: 00000000000000000000000000101110 (was 00000000000000000000000000100110)
Oct 21 11:19:22 [7681] bl460g1n6 stonith-ng:     info: pcmk_cpg_membership: 	Member[2.0] stonith-ng.3232261592 
Oct 21 11:19:22 [7676] bl460g1n6 pacemakerd:    debug: update_node_processes: 	Node bl460g1n7 now has process list: 00000000000000000000000000111110 (was 00000000000000000000000000101110)
Oct 21 11:19:22 [7676] bl460g1n6 pacemakerd:    debug: update_node_processes: 	Node bl460g1n7 now has process list: 00000000000000000000000000111110 (was 00000000000000000000000000101110)
Oct 21 11:19:22 [7681] bl460g1n6 stonith-ng:     info: crm_get_peer: 	Created entry 42dd69a2-315c-49f3-8195-04f4452be23f/0x1de3ea0 for node (null)/3232261593 (3 total)
Oct 21 11:19:22 [7676] bl460g1n6 pacemakerd:    debug: update_node_processes: 	Node bl460g1n7 now has process list: 00000000000000000000000000111310 (was 00000000000000000000000000111110)
Oct 21 11:19:22 [7676] bl460g1n6 pacemakerd:    debug: update_node_processes: 	Node bl460g1n7 now has process list: 00000000000000000000000000111310 (was 00000000000000000000000000111110)
Oct 21 11:19:22 [7681] bl460g1n6 stonith-ng:     info: crm_get_peer: 	Node 3232261593 has uuid 3232261593
Oct 21 11:19:22 [7681] bl460g1n6 stonith-ng:     info: pcmk_cpg_membership: 	Member[2.1] stonith-ng.3232261593 
Oct 21 11:19:22 [7681] bl460g1n6 stonith-ng:     info: crm_update_peer_proc: 	pcmk_cpg_membership: Node (null)[3232261593] - corosync-cpg is now online
Oct 21 11:19:22 [7681] bl460g1n6 stonith-ng:    debug: st_peer_update_callback: 	Broadcasting our uname because of node 3232261593
Oct 21 11:19:22 [7681] bl460g1n6 stonith-ng:     info: pcmk_cpg_membership: 	Member[2.2] stonith-ng.3232261594 
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [CPG   ] cpg.c:message_handler_req_exec_cpg_procjoin:1260 got procjoin message from cluster node -1062705703 (r(0) ip(192.168.101.217) r(1) ip(192.168.102.217) ) for pid 12485
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [CPG   ] cpg.c:message_handler_req_exec_cpg_procjoin:1260 got procjoin message from cluster node -1062705703 (r(0) ip(192.168.101.217) r(1) ip(192.168.102.217) ) for pid 12482
Oct 21 11:19:22 [7683] bl460g1n6      attrd:     info: pcmk_cpg_membership: 	Joined[2.0] attrd.3232261593 
Oct 21 11:19:22 [7683] bl460g1n6      attrd:     info: pcmk_cpg_membership: 	Member[2.0] attrd.3232261592 
Oct 21 11:19:22 [7680] bl460g1n6        cib:     info: pcmk_cpg_membership: 	Joined[2.0] cib.3232261593 
Oct 21 11:19:22 [7680] bl460g1n6        cib:     info: pcmk_cpg_membership: 	Member[2.0] cib.3232261592 
Oct 21 11:19:22 [7680] bl460g1n6        cib:     info: crm_get_peer: 	Created entry eea9a1a5-713e-4ea8-ba06-01a6f9829bef/0xbb6840 for node (null)/3232261593 (3 total)
Oct 21 11:19:22 [7680] bl460g1n6        cib:     info: crm_get_peer: 	Node 3232261593 has uuid 3232261593
Oct 21 11:19:22 [7680] bl460g1n6        cib:     info: pcmk_cpg_membership: 	Member[2.1] cib.3232261593 
Oct 21 11:19:22 [7680] bl460g1n6        cib:     info: crm_update_peer_proc: 	pcmk_cpg_membership: Node (null)[3232261593] - corosync-cpg is now online
Oct 21 11:19:22 [7680] bl460g1n6        cib:     info: pcmk_cpg_membership: 	Member[2.2] cib.3232261594 
Oct 21 11:19:22 [7683] bl460g1n6      attrd:     info: crm_get_peer: 	Created entry bcb66211-3e6b-4e14-9472-757e22ab0e6e/0x14e30e0 for node (null)/3232261593 (3 total)
Oct 21 11:19:22 [7683] bl460g1n6      attrd:     info: crm_get_peer: 	Node 3232261593 has uuid 3232261593
Oct 21 11:19:22 [7683] bl460g1n6      attrd:     info: pcmk_cpg_membership: 	Member[2.1] attrd.3232261593 
Oct 21 11:19:22 [7683] bl460g1n6      attrd:     info: crm_update_peer_proc: 	pcmk_cpg_membership: Node (null)[3232261593] - corosync-cpg is now online
Oct 21 11:19:22 [7683] bl460g1n6      attrd:   notice: crm_update_peer_state: 	attrd_peer_change_cb: Node (null)[3232261593] - state is now member (was (null))
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [CPG   ] cpg.c:message_handler_req_exec_cpg_procjoin:1260 got procjoin message from cluster node -1062705702 (r(0) ip(192.168.101.218) r(1) ip(192.168.102.218) ) for pid 1600
Oct 21 11:19:22 [7683] bl460g1n6      attrd:     info: pcmk_cpg_membership: 	Member[2.2] attrd.3232261594 
Oct 21 11:19:22 [7681] bl460g1n6 stonith-ng:     info: crm_get_peer: 	Node 3232261594 is now known as bl460g1n8
Oct 21 11:19:22 [7681] bl460g1n6 stonith-ng:    debug: st_peer_update_callback: 	Broadcasting our uname because of node 3232261594
Oct 21 11:19:22 [7663] bl460g1n6 corosync debug   [CPG   ] cpg.c:message_handler_req_exec_cpg_procjoin:1260 got procjoin message from cluster node -1062705704 (r(0) ip(192.168.101.216) r(1) ip(192.168.102.216) ) for pid 7685
Oct 21 11:19:22 [7680] bl460g1n6        cib:     info: crm_client_new: 	Connecting 0xb70560 for uid=0 gid=0 pid=30979 id=9976496d-abc5-4df0-b696-4cf03d0e8ce2
Oct 21 11:19:22 [7680] bl460g1n6        cib:    debug: handle_new_connection: 	IPC credentials authenticated (7680-30979-13)
Oct 21 11:19:22 [7680] bl460g1n6        cib:    debug: qb_ipcs_shm_connect: 	connecting to client [30979]
Oct 21 11:19:22 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:19:22 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:19:22 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:19:22 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crm_mon/5, version=0.0.0)
Oct 21 11:19:22 [7680] bl460g1n6        cib:    debug: cib_common_callback_worker: 	Setting cib_diff_notify callbacks for crm_mon (9976496d-abc5-4df0-b696-4cf03d0e8ce2): off
Oct 21 11:19:22 [7680] bl460g1n6        cib:    debug: cib_common_callback_worker: 	Setting cib_diff_notify callbacks for crm_mon (9976496d-abc5-4df0-b696-4cf03d0e8ce2): on
Oct 21 11:19:23 [7663] bl460g1n6 corosync debug   [CPG   ] cpg.c:message_handler_req_exec_cpg_procjoin:1260 got procjoin message from cluster node -1062705703 (r(0) ip(192.168.101.217) r(1) ip(192.168.102.217) ) for pid 12487
Oct 21 11:19:23 [7681] bl460g1n6 stonith-ng:     info: crm_get_peer: 	Node 3232261593 is now known as bl460g1n7
Oct 21 11:19:23 [7681] bl460g1n6 stonith-ng:    debug: st_peer_update_callback: 	Broadcasting our uname because of node 3232261593
Oct 21 11:19:23 [7685] bl460g1n6       crmd:    debug: do_cl_join_query: 	Querying for a DC
Oct 21 11:19:23 [7685] bl460g1n6       crmd:    debug: crm_timer_start: 	Started Election Trigger (I_DC_TIMEOUT:20000ms), src=16
Oct 21 11:19:23 [7685] bl460g1n6       crmd:     info: pcmk_cpg_membership: 	Joined[0.0] crmd.3232261592 
Oct 21 11:19:23 [7685] bl460g1n6       crmd:     info: pcmk_cpg_membership: 	Member[0.0] crmd.3232261592 
Oct 21 11:19:23 [7685] bl460g1n6       crmd:     info: pcmk_cpg_membership: 	Member[0.1] crmd.3232261594 
Oct 21 11:19:23 [7685] bl460g1n6       crmd:     info: crm_update_peer_proc: 	pcmk_cpg_membership: Node (null)[3232261594] - corosync-cpg is now online
Oct 21 11:19:23 [7685] bl460g1n6       crmd:     info: pcmk_cpg_membership: 	Joined[1.0] crmd.3232261593 
Oct 21 11:19:23 [7685] bl460g1n6       crmd:     info: pcmk_cpg_membership: 	Member[1.0] crmd.3232261592 
Oct 21 11:19:23 [7685] bl460g1n6       crmd:     info: pcmk_cpg_membership: 	Member[1.1] crmd.3232261593 
Oct 21 11:19:23 [7685] bl460g1n6       crmd:     info: crm_update_peer_proc: 	pcmk_cpg_membership: Node (null)[3232261593] - corosync-cpg is now online
Oct 21 11:19:23 [7685] bl460g1n6       crmd:     info: pcmk_cpg_membership: 	Member[1.2] crmd.3232261594 
Oct 21 11:19:23 [7685] bl460g1n6       crmd:     info: crm_get_peer: 	Node 3232261593 is now known as bl460g1n7
Oct 21 11:19:23 [7685] bl460g1n6       crmd:     info: peer_update_callback: 	bl460g1n7 is now member
Oct 21 11:19:23 [7685] bl460g1n6       crmd:    debug: te_connect_stonith: 	Attempting connection to fencing daemon...
Oct 21 11:19:24 [7681] bl460g1n6 stonith-ng:     info: crm_client_new: 	Connecting 0x1ddfee0 for uid=189 gid=189 pid=7685 id=84785fd7-02d0-4985-849e-ce75461ab298
Oct 21 11:19:24 [7681] bl460g1n6 stonith-ng:    debug: handle_new_connection: 	IPC credentials authenticated (7681-7685-9)
Oct 21 11:19:24 [7681] bl460g1n6 stonith-ng:    debug: qb_ipcs_shm_connect: 	connecting to client [7685]
Oct 21 11:19:24 [7681] bl460g1n6 stonith-ng:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:19:24 [7681] bl460g1n6 stonith-ng:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:19:24 [7681] bl460g1n6 stonith-ng:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:19:24 [7685] bl460g1n6       crmd:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:19:24 [7685] bl460g1n6       crmd:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:19:24 [7685] bl460g1n6       crmd:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:19:24 [7681] bl460g1n6 stonith-ng:    debug: stonith_command: 	Processing register 9 from crmd.7685 (               0)
Oct 21 11:19:24 [7681] bl460g1n6 stonith-ng:     info: stonith_command: 	Processed register from crmd.7685: OK (0)
Oct 21 11:19:24 [7685] bl460g1n6       crmd:    debug: stonith_api_signon: 	Connection to STONITH successful
Oct 21 11:19:24 [7681] bl460g1n6 stonith-ng:    debug: stonith_command: 	Processing st_notify 10 from crmd.7685 (               0)
Oct 21 11:19:24 [7681] bl460g1n6 stonith-ng:    debug: handle_request: 	Setting st_notify_disconnect callbacks for crmd.7685 (84785fd7-02d0-4985-849e-ce75461ab298): ON
Oct 21 11:19:24 [7681] bl460g1n6 stonith-ng:     info: stonith_command: 	Processed st_notify from crmd.7685: OK (0)
Oct 21 11:19:24 [7681] bl460g1n6 stonith-ng:    debug: stonith_command: 	Processing st_notify 11 from crmd.7685 (               0)
Oct 21 11:19:24 [7681] bl460g1n6 stonith-ng:    debug: handle_request: 	Setting st_notify_fence callbacks for crmd.7685 (84785fd7-02d0-4985-849e-ce75461ab298): ON
Oct 21 11:19:24 [7681] bl460g1n6 stonith-ng:     info: stonith_command: 	Processed st_notify from crmd.7685: OK (0)
Oct 21 11:19:24 [7685] bl460g1n6       crmd:     info: crm_get_peer: 	Node 3232261594 is now known as bl460g1n8
Oct 21 11:19:24 [7685] bl460g1n6       crmd:     info: peer_update_callback: 	bl460g1n8 is now member
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: crm_timer_popped: 	Election Trigger (I_DC_TIMEOUT) just popped (20000ms)
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_DC_TIMEOUT: [ state=S_PENDING cause=C_TIMER_POPPED origin=crm_timer_popped ]
Oct 21 11:19:43 [7685] bl460g1n6       crmd:  warning: do_log: 	FSA: Input I_DC_TIMEOUT from crm_timer_popped() received in state S_PENDING
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: do_state_transition: 	State transition S_PENDING -> S_ELECTION [ input=I_DC_TIMEOUT cause=C_TIMER_POPPED origin=crm_timer_popped ]
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: crm_uptime: 	Current CPU usage is: 0s, 15997us
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: election_vote: 	Started election 1
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: election_count_vote: 	Created voted hash
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: election_count_vote: 	Election 1 (current: 1, owner: 3232261592): Processed vote from bl460g1n6 (Recorded)
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: election_check: 	Still waiting on 2 non-votes (3 total)
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: election_count_vote: 	Election 1 (current: 1, owner: 3232261592): Processed no-vote from bl460g1n7 (Recorded)
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: election_check: 	Still waiting on 1 non-votes (3 total)
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: election_count_vote: 	Election 1 (current: 1, owner: 3232261592): Processed no-vote from bl460g1n8 (Recorded)
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: election_timer_cb: 	Election election-0 complete
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: election_timeout_popped: 	Election failed: Declaring ourselves the winner
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_ELECTION_DC: [ state=S_ELECTION cause=C_TIMER_POPPED origin=election_timeout_popped ]
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: do_log: 	FSA: Input I_ELECTION_DC from election_timeout_popped() received in state S_ELECTION
Oct 21 11:19:43 [7685] bl460g1n6       crmd:   notice: do_state_transition: 	State transition S_ELECTION -> S_INTEGRATION [ input=I_ELECTION_DC cause=C_TIMER_POPPED origin=election_timeout_popped ]
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: do_te_control: 	Registering TE UUID: 38db9a68-056c-4e65-8658-75f0c3cc91e5
Oct 21 11:19:43 [7680] bl460g1n6        cib:    debug: cib_common_callback_worker: 	Setting cib_diff_notify callbacks for crmd (11403f73-2fb1-433c-b6cb-5b581b7e190d): on
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: set_graph_functions: 	Setting custom graph functions
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: do_te_control: 	Transitioner is now active
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: unpack_graph: 	Unpacked transition -1: 0 actions in 0 synapses
Oct 21 11:19:43 [7684] bl460g1n6    pengine:     info: crm_client_new: 	Connecting 0x255a700 for uid=189 gid=189 pid=7685 id=58012edf-f8de-4855-9af0-0667eb45afd9
Oct 21 11:19:43 [7684] bl460g1n6    pengine:    debug: handle_new_connection: 	IPC credentials authenticated (7684-7685-6)
Oct 21 11:19:43 [7684] bl460g1n6    pengine:    debug: qb_ipcs_shm_connect: 	connecting to client [7685]
Oct 21 11:19:43 [7684] bl460g1n6    pengine:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:19:43 [7684] bl460g1n6    pengine:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:19:43 [7684] bl460g1n6    pengine:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: crm_timer_start: 	Started Integration Timer (I_INTEGRATED:180000ms), src=20
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: do_dc_takeover: 	Taking over DC status for this partition
Oct 21 11:19:43 [7680] bl460g1n6        cib:     info: cib_process_readwrite: 	We are now in R/W mode
Oct 21 11:19:43 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_master operation for section 'all': OK (rc=0, origin=local/crmd/6, version=0.0.0)
Oct 21 11:19:43 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section cib: OK (rc=0, origin=local/crmd/7, version=0.0.1)
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.0.0
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.0.1 4ebc77531279ad4ef9b647069d43ab1e
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="0"/>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++ <cib epoch="0" num_updates="1" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7"/>
Oct 21 11:19:43 [7663] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [7680]
Oct 21 11:19:43 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:43 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:43 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:43 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Oct 21 11:19:43 [7663] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7fb1490da9a0
Oct 21 11:19:43 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:43 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:43 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:43 [7680] bl460g1n6        cib:    debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Oct 21 11:19:43 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-request-7666-7680-34-header
Oct 21 11:19:43 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-response-7666-7680-34-header
Oct 21 11:19:43 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (7666-7680-34)
Oct 21 11:19:43 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-event-7666-7680-34-header
Oct 21 11:19:43 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(7666-7680-34) state:2
Oct 21 11:19:43 [7663] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Oct 21 11:19:43 [7680] bl460g1n6        cib:   notice: corosync_node_name: 	Unable to get node name for nodeid 3232261592
Oct 21 11:19:43 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Oct 21 11:19:43 [7680] bl460g1n6        cib:   notice: get_node_name: 	Defaulting to uname -n for the local corosync node name
Oct 21 11:19:43 [7663] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7fb1490da9a0
Oct 21 11:19:43 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Oct 21 11:19:43 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-7666-7680-34-header
Oct 21 11:19:43 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-7666-7680-34-header
Oct 21 11:19:43 [7680] bl460g1n6        cib:    debug: cib_process_xpath: 	cib_query: //cib/configuration/crm_config//cluster_property_set//nvpair[@name='dc-version'] does not exist
Oct 21 11:19:43 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section //cib/configuration/crm_config//cluster_property_set//nvpair[@name='dc-version']: No such device or address (rc=-6, origin=local/crmd/8, version=0.0.1)
Oct 21 11:19:43 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-7666-7680-34-header
Oct 21 11:19:43 [7680] bl460g1n6        cib:    debug: activateCibXml: 	Triggering CIB write for cib_modify op
Oct 21 11:19:43 [7680] bl460g1n6        cib:   notice: log_cib_diff: 	cib:diff: Local-only Change: 0.1.1
Oct 21 11:19:43 [7680] bl460g1n6        cib:   notice: cib:diff: 	-- <cib admin_epoch="0" epoch="0" num_updates="1"/>
Oct 21 11:19:43 [7680] bl460g1n6        cib:   notice: cib:diff: 	++       <cluster_property_set id="cib-bootstrap-options">
Oct 21 11:19:43 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         <nvpair id="cib-bootstrap-options-dc-version" name="dc-version" value="1.1.11-0.302.b6d42ed.git.el6-b6d42ed"/>
Oct 21 11:19:43 [7680] bl460g1n6        cib:   notice: cib:diff: 	++       </cluster_property_set>
Oct 21 11:19:43 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section crm_config: OK (rc=0, origin=local/crmd/9, version=0.1.1)
Oct 21 11:19:43 [7680] bl460g1n6        cib:    debug: cib_process_xpath: 	cib_query: //cib/configuration/crm_config//cluster_property_set//nvpair[@name='cluster-infrastructure'] does not exist
Oct 21 11:19:43 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section //cib/configuration/crm_config//cluster_property_set//nvpair[@name='cluster-infrastructure']: No such device or address (rc=-6, origin=local/crmd/10, version=0.1.1)
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: log_cib_diff: 	Config update: Local-only Change: 0.1.1
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib admin_epoch="0" epoch="0" num_updates="1"/>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="1" num_updates="1" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:19:43 2013" update-origin="bl460g1n6" update-client="crmd">
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <configuration>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <crm_config>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       <cluster_property_set id="cib-bootstrap-options">
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <nvpair id="cib-bootstrap-options-dc-version" name="dc-version" value="1.1.11-0.302.b6d42ed.git.el6-b6d42ed"/>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       </cluster_property_set>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </crm_config>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </configuration>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: initialize_join: 	join-1: Initializing join data (flag=true)
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: join_make_offer: 	Making join offers based on membership 16
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: join_make_offer: 	join-1: Sending offer to bl460g1n7
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	join_make_offer: Node bl460g1n7[3232261593] - join-1 phase 0 -> 1
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: join_make_offer: 	join-1: Sending offer to bl460g1n8
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	join_make_offer: Node bl460g1n8[3232261594] - join-1 phase 0 -> 1
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: join_make_offer: 	join-1: Sending offer to bl460g1n6
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	join_make_offer: Node bl460g1n6[3232261592] - join-1 phase 0 -> 1
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: do_dc_join_offer_all: 	join-1: Waiting on 3 outstanding join acks
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_ELECTION_DC: [ state=S_INTEGRATION cause=C_FSA_INTERNAL origin=do_election_check ]
Oct 21 11:19:43 [7685] bl460g1n6       crmd:  warning: do_log: 	FSA: Input I_ELECTION_DC from do_election_check() received in state S_INTEGRATION
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: election_vote: 	Started election 2
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: initialize_join: 	join-2: Initializing join data (flag=true)
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	initialize_join: Node bl460g1n7[3232261593] - join-2 phase 1 -> 0
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	initialize_join: Node bl460g1n8[3232261594] - join-2 phase 1 -> 0
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	initialize_join: Node bl460g1n6[3232261592] - join-2 phase 1 -> 0
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: join_make_offer: 	join-2: Sending offer to bl460g1n7
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	join_make_offer: Node bl460g1n7[3232261593] - join-2 phase 0 -> 1
Oct 21 11:19:43 [7680] bl460g1n6        cib:    debug: activateCibXml: 	Triggering CIB write for cib_modify op
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: join_make_offer: 	join-2: Sending offer to bl460g1n8
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	join_make_offer: Node bl460g1n8[3232261594] - join-2 phase 0 -> 1
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: join_make_offer: 	join-2: Sending offer to bl460g1n6
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	join_make_offer: Node bl460g1n6[3232261592] - join-2 phase 0 -> 1
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: do_dc_join_offer_all: 	join-2: Waiting on 3 outstanding join acks
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: log_cib_diff: 	Config update: Local-only Change: 0.2.1
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib admin_epoch="0" epoch="1" num_updates="1"/>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="2" num_updates="1" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:19:43 2013" update-origin="bl460g1n6" update-client="crmd">
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <configuration>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <crm_config>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <cluster_property_set id="cib-bootstrap-options">
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <nvpair id="cib-bootstrap-options-cluster-infrastructure" name="cluster-infrastructure" value="corosync"/>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </cluster_property_set>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </crm_config>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </configuration>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:19:43 [7680] bl460g1n6        cib:    debug: get_last_sequence: 	Series file /var/lib/pacemaker/cib/cib.last does not exist
Oct 21 11:19:43 [7680] bl460g1n6        cib:   notice: log_cib_diff: 	cib:diff: Local-only Change: 0.2.1
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: handle_request: 	Raising I_JOIN_OFFER: join-1
Oct 21 11:19:43 [7680] bl460g1n6        cib:   notice: cib:diff: 	-- <cib admin_epoch="0" epoch="1" num_updates="1"/>
Oct 21 11:19:43 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         <nvpair id="cib-bootstrap-options-cluster-infrastructure" name="cluster-infrastructure" value="corosync"/>
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: handle_request: 	Raising I_JOIN_OFFER: join-2
Oct 21 11:19:43 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section crm_config: OK (rc=0, origin=local/crmd/11, version=0.2.1)
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_JOIN_OFFER: [ state=S_INTEGRATION cause=C_HA_MESSAGE origin=route_message ]
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: update_dc: 	Set DC to bl460g1n6 (3.0.7)
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: do_cl_join_offer_respond: 	do_cl_join_offer_respond added action A_DC_TIMER_STOP to the FSA
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: election_count_vote: 	Created voted hash
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: election_count_vote: 	Election 2 (current: 2, owner: 3232261592): Processed vote from bl460g1n6 (Recorded)
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: do_election_check: 	Ignore election check: we not in an election
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_JOIN_OFFER: [ state=S_INTEGRATION cause=C_HA_MESSAGE origin=route_message ]
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: do_cl_join_offer_respond: 	do_cl_join_offer_respond added action A_DC_TIMER_STOP to the FSA
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: election_count_vote: 	Election 2 (current: 2, owner: 3232261592): Processed no-vote from bl460g1n8 (Recorded)
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: do_election_check: 	Ignore election check: we not in an election
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_JOIN_REQUEST: [ state=S_INTEGRATION cause=C_HA_MESSAGE origin=route_message ]
Oct 21 11:19:43 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section crm_config: OK (rc=0, origin=local/crmd/12, version=0.2.1)
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	Processing req from bl460g1n7
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	Invalid response from bl460g1n7: join-1 vs. join-2
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: check_join_state: 	Invoked by do_dc_join_filter_offer in state: S_INTEGRATION
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: election_count_vote: 	Election 2 (current: 2, owner: 3232261592): Processed no-vote from bl460g1n7 (Recorded)
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: do_election_check: 	Ignore election check: we not in an election
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: config_query_callback: 	Call 12 : Parsing CIB options
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: config_query_callback: 	Shutdown escalation occurs after: 1200000ms
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: config_query_callback: 	Checking for expired actions every 900000ms
Oct 21 11:19:43 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section crm_config: OK (rc=0, origin=local/crmd/13, version=0.2.1)
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: config_query_callback: 	Call 13 : Parsing CIB options
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: config_query_callback: 	Shutdown escalation occurs after: 1200000ms
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: config_query_callback: 	Checking for expired actions every 900000ms
Oct 21 11:19:43 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crmd/14, version=0.2.1)
Oct 21 11:19:43 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crmd/15, version=0.2.1)
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: join_query_callback: 	Respond to join offer join-2
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: join_query_callback: 	Acknowledging bl460g1n6 as our DC
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_JOIN_REQUEST: [ state=S_INTEGRATION cause=C_HA_MESSAGE origin=route_message ]
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	Processing req from bl460g1n6
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	join-2: Welcoming node bl460g1n6 (ref join_request-crmd-1382321983-10)
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	do_dc_join_filter_offer: Node bl460g1n6[3232261592] - join-2 phase 1 -> 2
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: crm_update_peer_expected: 	do_dc_join_filter_offer: Node bl460g1n6[3232261592] - expected state is now member
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	1 nodes have been integrated into join-2
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: check_join_state: 	Invoked by do_dc_join_filter_offer in state: S_INTEGRATION
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	join-2: Still waiting on 2 outstanding offers
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_JOIN_REQUEST: [ state=S_INTEGRATION cause=C_HA_MESSAGE origin=route_message ]
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	Processing req from bl460g1n7
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	join-2: Welcoming node bl460g1n7 (ref join_request-crmd-1382321983-5)
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	do_dc_join_filter_offer: Node bl460g1n7[3232261593] - join-2 phase 1 -> 2
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: crm_update_peer_expected: 	do_dc_join_filter_offer: Node bl460g1n7[3232261593] - expected state is now member
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	2 nodes have been integrated into join-2
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: check_join_state: 	Invoked by do_dc_join_filter_offer in state: S_INTEGRATION
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	join-2: Still waiting on 1 outstanding offers
Oct 21 11:19:43 [7680] bl460g1n6        cib:     info: write_cib_contents: 	Archived previous version as /var/lib/pacemaker/cib/cib-0.raw
Oct 21 11:19:43 [7680] bl460g1n6        cib:    debug: write_cib_contents: 	Writing CIB to disk
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_JOIN_REQUEST: [ state=S_INTEGRATION cause=C_HA_MESSAGE origin=route_message ]
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	Processing req from bl460g1n8
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	join-2: Welcoming node bl460g1n8 (ref join_request-crmd-1382321983-4)
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	do_dc_join_filter_offer: Node bl460g1n8[3232261594] - join-2 phase 1 -> 2
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: crm_update_peer_expected: 	do_dc_join_filter_offer: Node bl460g1n8[3232261594] - expected state is now member
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	3 nodes have been integrated into join-2
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: check_join_state: 	Invoked by do_dc_join_filter_offer in state: S_INTEGRATION
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: check_join_state: 	join-2: Integration of 3 peers complete: do_dc_join_filter_offer
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_INTEGRATED: [ state=S_INTEGRATION cause=C_FSA_INTERNAL origin=check_join_state ]
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: do_state_transition: 	State transition S_INTEGRATION -> S_FINALIZE_JOIN [ input=I_INTEGRATED cause=C_FSA_INTERNAL origin=check_join_state ]
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: do_state_transition: 	All 3 cluster nodes responded to the join offer.
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: crm_timer_start: 	Started Finalization Timer (I_ELECTION:1800000ms), src=27
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: do_dc_join_finalize: 	Finializing join-2 for 3 clients
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: crmd_join_phase_log: 	join-2: bl460g1n7=integrated
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: crmd_join_phase_log: 	join-2: bl460g1n8=integrated
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: crmd_join_phase_log: 	join-2: bl460g1n6=integrated
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: do_dc_join_finalize: 	join-2: Syncing our CIB to the rest of the cluster
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: do_dc_join_finalize: 	Requested version   <generation_tuple epoch="2" num_updates="1" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:19:43 2013" update-origin="bl460g1n6" update-client="crmd"/>
Oct 21 11:19:43 [7680] bl460g1n6        cib:    debug: sync_our_cib: 	Syncing CIB to all peers
Oct 21 11:19:43 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_sync operation for section 'all': OK (rc=0, origin=local/crmd/16, version=0.2.1)
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: check_join_state: 	Invoked by finalize_sync_callback in state: S_FINALIZE_JOIN
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: check_join_state: 	join-2: Still waiting on 3 integrated nodes
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: crmd_join_phase_log: 	join-2: bl460g1n7=integrated
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: crmd_join_phase_log: 	join-2: bl460g1n8=integrated
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: crmd_join_phase_log: 	join-2: bl460g1n6=integrated
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: finalize_sync_callback: 	Notifying 3 clients of join-2 results
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: finalize_join_for: 	join-2: ACK'ing join request from bl460g1n7
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	finalize_join_for: Node bl460g1n7[3232261593] - join-2 phase 2 -> 3
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: finalize_join_for: 	join-2: ACK'ing join request from bl460g1n8
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	finalize_join_for: Node bl460g1n8[3232261594] - join-2 phase 2 -> 3
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: finalize_join_for: 	join-2: ACK'ing join request from bl460g1n6
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	finalize_join_for: Node bl460g1n6[3232261592] - join-2 phase 2 -> 3
Oct 21 11:19:43 [7680] bl460g1n6        cib:    debug: activateCibXml: 	Triggering CIB write for cib_modify op
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: log_cib_diff: 	Config update: Local-only Change: 0.3.1
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib admin_epoch="0" epoch="2" num_updates="1"/>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="3" num_updates="1" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:19:43 2013" update-origin="bl460g1n6" update-client="crmd">
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <configuration>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <nodes>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       <node id="3232261593" uname="bl460g1n7"/>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </nodes>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </configuration>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:19:43 [7680] bl460g1n6        cib:   notice: log_cib_diff: 	cib:diff: Local-only Change: 0.3.1
Oct 21 11:19:43 [7680] bl460g1n6        cib:   notice: cib:diff: 	-- <cib admin_epoch="0" epoch="2" num_updates="1"/>
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: handle_request: 	Raising I_JOIN_RESULT: join-2
Oct 21 11:19:43 [7680] bl460g1n6        cib:   notice: cib:diff: 	++       <node id="3232261593" uname="bl460g1n7"/>
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_JOIN_RESULT: [ state=S_FINALIZE_JOIN cause=C_HA_MESSAGE origin=route_message ]
Oct 21 11:19:43 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section nodes: OK (rc=0, origin=local/crmd/17, version=0.3.1)
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: do_cl_join_finalize_respond: 	Confirming join join-2: join_ack_nack
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: do_cl_join_finalize_respond: 	join-2: Join complete.  Sending local LRM status to bl460g1n6
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: erase_status_tag: 	Deleting xpath: //node_state[@uname='bl460g1n6']/transient_attributes
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: update_attrd_helper: 	Connecting to attrd... 5 retries remaining
Oct 21 11:19:43 [7680] bl460g1n6        cib:    debug: activateCibXml: 	Triggering CIB write for cib_modify op
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: log_cib_diff: 	Config update: Local-only Change: 0.4.1
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib admin_epoch="0" epoch="3" num_updates="1"/>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="4" num_updates="1" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:19:43 2013" update-origin="bl460g1n6" update-client="crmd">
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <configuration>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <nodes>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       <node id="3232261594" uname="bl460g1n8"/>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </nodes>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </configuration>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:19:43 [7683] bl460g1n6      attrd:     info: crm_client_new: 	Connecting 0x14e0460 for uid=189 gid=189 pid=7685 id=8867b5f3-c85c-4336-be21-852b967b540d
Oct 21 11:19:43 [7683] bl460g1n6      attrd:    debug: handle_new_connection: 	IPC credentials authenticated (7683-7685-9)
Oct 21 11:19:43 [7683] bl460g1n6      attrd:    debug: qb_ipcs_shm_connect: 	connecting to client [7685]
Oct 21 11:19:43 [7683] bl460g1n6      attrd:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:19:43 [7683] bl460g1n6      attrd:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:19:43 [7680] bl460g1n6        cib:   notice: log_cib_diff: 	cib:diff: Local-only Change: 0.4.1
Oct 21 11:19:43 [7683] bl460g1n6      attrd:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:19:43 [7680] bl460g1n6        cib:   notice: cib:diff: 	-- <cib admin_epoch="0" epoch="3" num_updates="1"/>
Oct 21 11:19:43 [7680] bl460g1n6        cib:   notice: cib:diff: 	++       <node id="3232261594" uname="bl460g1n8"/>
Oct 21 11:19:43 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section nodes: OK (rc=0, origin=local/crmd/18, version=0.4.1)
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:19:43 [7680] bl460g1n6        cib:    debug: activateCibXml: 	Triggering CIB write for cib_modify op
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: attrd_update_delegate: 	Sent update: terminate=(null) for bl460g1n6
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: attrd_update_delegate: 	Sent update: shutdown=(null) for bl460g1n6
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: log_cib_diff: 	Config update: Local-only Change: 0.5.1
Oct 21 11:19:43 [7683] bl460g1n6      attrd:     info: attrd_client_message: 	Starting an election to determine the writer
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib admin_epoch="0" epoch="4" num_updates="1"/>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="5" num_updates="1" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:19:43 2013" update-origin="bl460g1n6" update-client="crmd">
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <configuration>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <nodes>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       <node id="3232261592" uname="bl460g1n6"/>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </nodes>
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: do_dc_join_ack: 	Ignoring op=join_ack_nack message from bl460g1n6
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </configuration>
Oct 21 11:19:43 [7683] bl460g1n6      attrd:    debug: crm_uptime: 	Current CPU usage is: 0s, 10998us
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:19:43 [7663] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [7683]
Oct 21 11:19:43 [7680] bl460g1n6        cib:   notice: log_cib_diff: 	cib:diff: Local-only Change: 0.5.1
Oct 21 11:19:43 [7680] bl460g1n6        cib:   notice: cib:diff: 	-- <cib admin_epoch="0" epoch="4" num_updates="1"/>
Oct 21 11:19:43 [7680] bl460g1n6        cib:   notice: cib:diff: 	++       <node id="3232261592" uname="bl460g1n6"/>
Oct 21 11:19:43 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section nodes: OK (rc=0, origin=local/crmd/19, version=0.5.1)
Oct 21 11:19:43 [7680] bl460g1n6        cib:     info: crm_get_peer: 	Node 3232261594 is now known as bl460g1n8
Oct 21 11:19:43 [7680] bl460g1n6        cib:    debug: cib_process_xpath: 	//node_state[@uname='bl460g1n8']/transient_attributes was already removed
Oct 21 11:19:43 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_delete operation for section //node_state[@uname='bl460g1n8']/transient_attributes: OK (rc=0, origin=bl460g1n8/crmd/10, version=0.5.1)
Oct 21 11:19:43 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:43 [7680] bl460g1n6        cib:    debug: cib_process_xpath: 	//node_state[@uname='bl460g1n6']/transient_attributes was already removed
Oct 21 11:19:43 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_delete operation for section //node_state[@uname='bl460g1n6']/transient_attributes: OK (rc=0, origin=local/crmd/20, version=0.5.1)
Oct 21 11:19:43 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: erase_xpath_callback: 	Deletion of "//node_state[@uname='bl460g1n6']/transient_attributes": OK (rc=0)
Oct 21 11:19:43 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:43 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Oct 21 11:19:43 [7663] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7fb1490da8f0
Oct 21 11:19:43 [7683] bl460g1n6      attrd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:43 [7683] bl460g1n6      attrd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:43 [7683] bl460g1n6      attrd:    debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_JOIN_RESULT: [ state=S_FINALIZE_JOIN cause=C_HA_MESSAGE origin=route_message ]
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	do_dc_join_ack: Node bl460g1n8[3232261594] - join-2 phase 3 -> 4
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: do_dc_join_ack: 	join-2: Updating node state to member for bl460g1n8
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: erase_status_tag: 	Deleting xpath: //node_state[@uname='bl460g1n8']/lrm
Oct 21 11:19:43 [7680] bl460g1n6        cib:    debug: cib_process_xpath: 	//node_state[@uname='bl460g1n8']/lrm was already removed
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: do_dc_join_ack: 	join-2: Registered callback for LRM update 22
Oct 21 11:19:43 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_delete operation for section //node_state[@uname='bl460g1n8']/lrm: OK (rc=0, origin=local/crmd/21, version=0.5.1)
Oct 21 11:19:43 [7683] bl460g1n6      attrd:    debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Oct 21 11:19:43 [7683] bl460g1n6      attrd:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-request-7666-7683-34-header
Oct 21 11:19:43 [7683] bl460g1n6      attrd:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-response-7666-7683-34-header
Oct 21 11:19:43 [7683] bl460g1n6      attrd:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-event-7666-7683-34-header
Oct 21 11:19:43 [7683] bl460g1n6      attrd:   notice: corosync_node_name: 	Unable to get node name for nodeid 3232261592
Oct 21 11:19:43 [7683] bl460g1n6      attrd:   notice: get_node_name: 	Defaulting to uname -n for the local corosync node name
Oct 21 11:19:43 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (7666-7683-34)
Oct 21 11:19:43 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(7666-7683-34) state:2
Oct 21 11:19:43 [7683] bl460g1n6      attrd:    debug: election_vote: 	Started election 1
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: erase_xpath_callback: 	Deletion of "//node_state[@uname='bl460g1n8']/lrm": OK (rc=0)
Oct 21 11:19:43 [7683] bl460g1n6      attrd:     info: attrd_client_message: 	Broadcasting terminate[bl460g1n6] = (null)
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.5.1
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.5.2 bd5cc5ae8671e585f4b461c49309b67a
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="1"/>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="5" num_updates="2" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:19:43 2013" update-origin="bl460g1n6" update-client="crmd">
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++     <node_state id="3232261594" uname="bl460g1n8" in_ccm="true" crmd="online" crm-debug-origin="do_lrm_query_internal" join="member" expected="member">
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       <lrm id="3232261594">
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <lrm_resources/>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       </lrm>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++     </node_state>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:19:43 [7683] bl460g1n6      attrd:     info: attrd_client_message: 	Broadcasting shutdown[bl460g1n6] = (null)
Oct 21 11:19:43 [7663] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Oct 21 11:19:43 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Oct 21 11:19:43 [7663] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7fb1490da8f0
Oct 21 11:19:43 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Oct 21 11:19:43 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-7666-7683-34-header
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_JOIN_RESULT: [ state=S_FINALIZE_JOIN cause=C_HA_MESSAGE origin=route_message ]
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	do_dc_join_ack: Node bl460g1n6[3232261592] - join-2 phase 3 -> 4
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: do_dc_join_ack: 	join-2: Updating node state to member for bl460g1n6
Oct 21 11:19:43 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-7666-7683-34-header
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: erase_status_tag: 	Deleting xpath: //node_state[@uname='bl460g1n6']/lrm
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: do_dc_join_ack: 	join-2: Registered callback for LRM update 24
Oct 21 11:19:43 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/crmd/22, version=0.5.2)
Oct 21 11:19:43 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-7666-7683-34-header
Oct 21 11:19:43 [7680] bl460g1n6        cib:    debug: cib_process_xpath: 	//node_state[@uname='bl460g1n6']/lrm was already removed
Oct 21 11:19:43 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_delete operation for section //node_state[@uname='bl460g1n6']/lrm: OK (rc=0, origin=local/crmd/23, version=0.5.2)
Oct 21 11:19:43 [7683] bl460g1n6      attrd:     info: crm_get_peer: 	Node 3232261594 is now known as bl460g1n8
Oct 21 11:19:43 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/crmd/24, version=0.5.3)
Oct 21 11:19:43 [7683] bl460g1n6      attrd:    debug: election_count_vote: 	Created voted hash
Oct 21 11:19:43 [7683] bl460g1n6      attrd:    debug: crm_compare_age: 	Win: 0.10998 vs 0.6998 (usec)
Oct 21 11:19:43 [7683] bl460g1n6      attrd:     info: election_count_vote: 	Election 1 (owner: 3232261594) pass: vote from bl460g1n8 (Uptime)
Oct 21 11:19:43 [7683] bl460g1n6      attrd:    debug: election_vote: 	Started election 2
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.5.2
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.5.3 82a3f0e4ec16092ad71b6e85ce3cc493
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="2"/>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="5" num_updates="3" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:19:43 2013" update-origin="bl460g1n6" update-client="crmd">
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++     <node_state id="3232261592" uname="bl460g1n6" in_ccm="true" crmd="online" crm-debug-origin="do_lrm_query_internal" join="member" expected="member">
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       <lrm id="3232261592">
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <lrm_resources/>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       </lrm>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++     </node_state>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: erase_xpath_callback: 	Deletion of "//node_state[@uname='bl460g1n6']/lrm": OK (rc=0)
Oct 21 11:19:43 [7683] bl460g1n6      attrd:    debug: election_count_vote: 	Created voted hash
Oct 21 11:19:43 [7683] bl460g1n6      attrd:    debug: election_check: 	Still waiting on 3 non-votes (3 total)
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: join_update_complete_callback: 	Join update 22 complete
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: check_join_state: 	Invoked by join_update_complete_callback in state: S_FINALIZE_JOIN
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: check_join_state: 	join-2: Still waiting on 1 finalized nodes
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: crmd_join_phase_log: 	join-2: bl460g1n7=finalized
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: crmd_join_phase_log: 	join-2: bl460g1n8=confirmed
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: crmd_join_phase_log: 	join-2: bl460g1n6=confirmed
Oct 21 11:19:43 [7680] bl460g1n6        cib:     info: write_cib_contents: 	Wrote version 0.1.0 of the CIB to disk (digest: 2b5d37bb9a92bb6b3485bb82e926f2e5)
Oct 21 11:19:43 [7680] bl460g1n6        cib:     info: crm_get_peer: 	Node 3232261593 is now known as bl460g1n7
Oct 21 11:19:43 [7680] bl460g1n6        cib:    debug: cib_process_xpath: 	//node_state[@uname='bl460g1n7']/transient_attributes was already removed
Oct 21 11:19:43 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_delete operation for section //node_state[@uname='bl460g1n7']/transient_attributes: OK (rc=0, origin=bl460g1n7/crmd/10, version=0.5.3)
Oct 21 11:19:43 [7683] bl460g1n6      attrd:     info: crm_get_peer: 	Node 3232261593 is now known as bl460g1n7
Oct 21 11:19:43 [7683] bl460g1n6      attrd:    debug: election_check: 	Still waiting on 3 non-votes (3 total)
Oct 21 11:19:43 [7683] bl460g1n6      attrd:    debug: election_count_vote: 	Election 2 (current: 2, owner: 3232261592): Processed vote from bl460g1n6 (Recorded)
Oct 21 11:19:43 [7683] bl460g1n6      attrd:    debug: election_check: 	Still waiting on 2 non-votes (3 total)
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: join_update_complete_callback: 	Join update 24 complete
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: check_join_state: 	Invoked by join_update_complete_callback in state: S_FINALIZE_JOIN
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: check_join_state: 	join-2: Still waiting on 1 finalized nodes
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: crmd_join_phase_log: 	join-2: bl460g1n7=finalized
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: crmd_join_phase_log: 	join-2: bl460g1n8=confirmed
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: crmd_join_phase_log: 	join-2: bl460g1n6=confirmed
Oct 21 11:19:43 [7683] bl460g1n6      attrd:    debug: election_check: 	Still waiting on 2 non-votes (3 total)
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_JOIN_RESULT: [ state=S_FINALIZE_JOIN cause=C_HA_MESSAGE origin=route_message ]
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	do_dc_join_ack: Node bl460g1n7[3232261593] - join-2 phase 3 -> 4
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: do_dc_join_ack: 	join-2: Updating node state to member for bl460g1n7
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: erase_status_tag: 	Deleting xpath: //node_state[@uname='bl460g1n7']/lrm
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: do_dc_join_ack: 	join-2: Registered callback for LRM update 26
Oct 21 11:19:43 [7680] bl460g1n6        cib:    debug: cib_process_xpath: 	//node_state[@uname='bl460g1n7']/lrm was already removed
Oct 21 11:19:43 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_delete operation for section //node_state[@uname='bl460g1n7']/lrm: OK (rc=0, origin=local/crmd/25, version=0.5.3)
Oct 21 11:19:43 [7683] bl460g1n6      attrd:    debug: election_count_vote: 	Election 2 (current: 2, owner: 3232261592): Processed no-vote from bl460g1n8 (Recorded)
Oct 21 11:19:43 [7683] bl460g1n6      attrd:    debug: election_check: 	Still waiting on 1 non-votes (3 total)
Oct 21 11:19:43 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/crmd/26, version=0.5.4)
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.5.3
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.5.4 647faaef5fcc42db3b3407219a33156b
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="3"/>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="5" num_updates="4" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:19:43 2013" update-origin="bl460g1n6" update-client="crmd">
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++     <node_state id="3232261593" uname="bl460g1n7" in_ccm="true" crmd="online" crm-debug-origin="do_lrm_query_internal" join="member" expected="member">
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       <lrm id="3232261593">
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <lrm_resources/>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       </lrm>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++     </node_state>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:19:43 [7683] bl460g1n6      attrd:    debug: election_count_vote: 	Election 2 (current: 2, owner: 3232261592): Processed no-vote from bl460g1n7 (Recorded)
Oct 21 11:19:43 [7683] bl460g1n6      attrd:     info: election_timer_cb: 	Election election-attrd complete
Oct 21 11:19:43 [7683] bl460g1n6      attrd:    debug: attrd_peer_sync: 	Syncing shutdown[bl460g1n6] = (null) to everyone
Oct 21 11:19:43 [7683] bl460g1n6      attrd:    debug: attrd_peer_sync: 	Syncing shutdown[bl460g1n7] = (null) to everyone
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: erase_xpath_callback: 	Deletion of "//node_state[@uname='bl460g1n7']/lrm": OK (rc=0)
Oct 21 11:19:43 [7683] bl460g1n6      attrd:    debug: attrd_peer_sync: 	Syncing shutdown[bl460g1n8] = (null) to everyone
Oct 21 11:19:43 [7683] bl460g1n6      attrd:    debug: attrd_peer_sync: 	Syncing terminate[bl460g1n6] = (null) to everyone
Oct 21 11:19:43 [7683] bl460g1n6      attrd:    debug: attrd_peer_sync: 	Syncing terminate[bl460g1n7] = (null) to everyone
Oct 21 11:19:43 [7683] bl460g1n6      attrd:    debug: attrd_peer_sync: 	Syncing terminate[bl460g1n8] = (null) to everyone
Oct 21 11:19:43 [7683] bl460g1n6      attrd:    debug: attrd_peer_sync: 	Syncing values to everyone
Oct 21 11:19:43 [7683] bl460g1n6      attrd:    debug: write_attribute: 	Update: bl460g1n6[shutdown]=(null) (3232261592 3232261592 3232261592 bl460g1n6)
Oct 21 11:19:43 [7683] bl460g1n6      attrd:    debug: write_attribute: 	Update: bl460g1n7[shutdown]=(null) (3232261593 3232261593 3232261593 bl460g1n7)
Oct 21 11:19:43 [7683] bl460g1n6      attrd:    debug: write_attribute: 	Update: bl460g1n8[shutdown]=(null) (3232261594 3232261594 3232261594 bl460g1n8)
Oct 21 11:19:43 [7683] bl460g1n6      attrd:   notice: write_attribute: 	Sent update 2 with 3 changes for shutdown, id=<n/a>, set=(null)
Oct 21 11:19:43 [7683] bl460g1n6      attrd:    debug: write_attribute: 	Update: bl460g1n6[terminate]=(null) (3232261592 3232261592 3232261592 bl460g1n6)
Oct 21 11:19:43 [7683] bl460g1n6      attrd:    debug: write_attribute: 	Update: bl460g1n7[terminate]=(null) (3232261593 3232261593 3232261593 bl460g1n7)
Oct 21 11:19:43 [7683] bl460g1n6      attrd:    debug: write_attribute: 	Update: bl460g1n8[terminate]=(null) (3232261594 3232261594 3232261594 bl460g1n8)
Oct 21 11:19:43 [7680] bl460g1n6        cib:    debug: cib_process_modify: 	Destroying /cib/status/node_state[1]/transient_attributes/instance_attributes/nvpair
Oct 21 11:19:43 [7680] bl460g1n6        cib:    debug: cib_process_modify: 	Destroying /cib/status/node_state[2]/transient_attributes/instance_attributes/nvpair
Oct 21 11:19:43 [7680] bl460g1n6        cib:    debug: cib_process_modify: 	Destroying /cib/status/node_state[3]/transient_attributes/instance_attributes/nvpair
Oct 21 11:19:43 [7683] bl460g1n6      attrd:   notice: write_attribute: 	Sent update 3 with 3 changes for terminate, id=<n/a>, set=(null)
Oct 21 11:19:43 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/attrd/2, version=0.5.5)
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.5.4
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.5.5 aab1e94ac19c7df11e409b28b002d377
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="4"/>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="5" num_updates="5" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:19:43 2013" update-origin="bl460g1n6" update-client="crmd">
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261594" uname="bl460g1n8" in_ccm="true" crmd="online" crm-debug-origin="do_lrm_query_internal" join="member" expected="member">
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       <transient_attributes id="3232261594">
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <instance_attributes id="status-3232261594"/>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       </transient_attributes>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261592" uname="bl460g1n6" in_ccm="true" crmd="online" crm-debug-origin="do_lrm_query_internal" join="member" expected="member">
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       <transient_attributes id="3232261592">
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <instance_attributes id="status-3232261592"/>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       </transient_attributes>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261593" uname="bl460g1n7" in_ccm="true" crmd="online" crm-debug-origin="do_lrm_query_internal" join="member" expected="member">
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       <transient_attributes id="3232261593">
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <instance_attributes id="status-3232261593"/>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       </transient_attributes>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: join_update_complete_callback: 	Join update 26 complete
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: check_join_state: 	Invoked by join_update_complete_callback in state: S_FINALIZE_JOIN
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: check_join_state: 	join-2 complete: join_update_complete_callback
Oct 21 11:19:43 [7680] bl460g1n6        cib:    debug: cib_process_modify: 	Destroying /cib/status/node_state[1]/transient_attributes/instance_attributes/nvpair
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_FINALIZED: [ state=S_FINALIZE_JOIN cause=C_FSA_INTERNAL origin=check_join_state ]
Oct 21 11:19:43 [7680] bl460g1n6        cib:    debug: cib_process_modify: 	Destroying /cib/status/node_state[2]/transient_attributes/instance_attributes/nvpair
Oct 21 11:19:43 [7680] bl460g1n6        cib:    debug: cib_process_modify: 	Destroying /cib/status/node_state[3]/transient_attributes/instance_attributes/nvpair
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: do_state_transition: 	State transition S_FINALIZE_JOIN -> S_POLICY_ENGINE [ input=I_FINALIZED cause=C_FSA_INTERNAL origin=check_join_state ]
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: do_state_transition: 	All 3 cluster nodes are eligible to run resources.
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: do_dc_join_final: 	Ensuring DC, quorum and node attributes are up-to-date
Oct 21 11:19:43 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/attrd/3, version=0.5.5)
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: attrd_update_delegate: 	Sent update: (null)=(null) for localhost
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: crm_update_quorum: 	Updating quorum status to true (call=29)
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: do_te_invoke: 	Cancelling the transition: inactive
Oct 21 11:19:43 [7683] bl460g1n6      attrd:     info: attrd_cib_callback: 	Update 3 for terminate: OK (0)
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: abort_transition_graph: 	do_te_invoke:151 - Triggered transition abort (complete=1) : Peer Cancelled
Oct 21 11:19:43 [7683] bl460g1n6      attrd:   notice: attrd_cib_callback: 	Update 3 for terminate[bl460g1n6]=(null): OK (0)
Oct 21 11:19:43 [7683] bl460g1n6      attrd:   notice: attrd_cib_callback: 	Update 3 for terminate[bl460g1n7]=(null): OK (0)
Oct 21 11:19:43 [7683] bl460g1n6      attrd:   notice: attrd_cib_callback: 	Update 3 for terminate[bl460g1n8]=(null): OK (0)
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_PE_CALC: [ state=S_POLICY_ENGINE cause=C_FSA_INTERNAL origin=abort_transition_graph ]
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: do_pe_invoke: 	Query 30: Requesting the current CIB: S_POLICY_ENGINE
Oct 21 11:19:43 [7683] bl460g1n6      attrd:     info: attrd_cib_callback: 	Update 2 for shutdown: OK (0)
Oct 21 11:19:43 [7683] bl460g1n6      attrd:   notice: attrd_cib_callback: 	Update 2 for shutdown[bl460g1n6]=(null): OK (0)
Oct 21 11:19:43 [7683] bl460g1n6      attrd:   notice: attrd_cib_callback: 	Update 2 for shutdown[bl460g1n7]=(null): OK (0)
Oct 21 11:19:43 [7683] bl460g1n6      attrd:   notice: attrd_cib_callback: 	Update 2 for shutdown[bl460g1n8]=(null): OK (0)
Oct 21 11:19:43 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section nodes: OK (rc=0, origin=local/crmd/27, version=0.5.5)
Oct 21 11:19:43 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/crmd/28, version=0.5.5)
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.5.5 -> 0.5.6 (S_POLICY_ENGINE)
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.5.5
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.5.6 c1411157b8d7e441aa92162aee6b3c12
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="5"/>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++ <cib epoch="5" num_updates="6" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:19:43 2013" update-origin="bl460g1n6" update-client="crmd" have-quorum="1" dc-uuid="3232261592"/>
Oct 21 11:19:43 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section cib: OK (rc=0, origin=local/crmd/29, version=0.5.6)
Oct 21 11:19:43 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crmd/30, version=0.5.6)
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: do_pe_invoke_callback: 	Invoking the PE: query=30, ref=pe_calc-dc-1382321983-15, seq=16, quorate=1
Oct 21 11:19:43 [7684] bl460g1n6    pengine:    debug: unpack_config: 	STONITH timeout: 60000
Oct 21 11:19:43 [7684] bl460g1n6    pengine:    debug: unpack_config: 	STONITH of failed nodes is enabled
Oct 21 11:19:43 [7684] bl460g1n6    pengine:    debug: unpack_config: 	Stop all active resources: false
Oct 21 11:19:43 [7684] bl460g1n6    pengine:    debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Oct 21 11:19:43 [7684] bl460g1n6    pengine:    debug: unpack_config: 	Default stickiness: 0
Oct 21 11:19:43 [7684] bl460g1n6    pengine:    debug: unpack_config: 	On loss of CCM Quorum: Stop ALL resources
Oct 21 11:19:43 [7684] bl460g1n6    pengine:    debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Oct 21 11:19:43 [7684] bl460g1n6    pengine:    debug: unpack_domains: 	Unpacking domains
Oct 21 11:19:43 [7684] bl460g1n6    pengine:    error: unpack_resources: 	Resource start-up disabled since no STONITH resources have been defined
Oct 21 11:19:43 [7684] bl460g1n6    pengine:    error: unpack_resources: 	Either configure some or disable STONITH with the stonith-enabled option
Oct 21 11:19:43 [7684] bl460g1n6    pengine:    error: unpack_resources: 	NOTE: Clusters with shared data need STONITH to ensure data integrity
Oct 21 11:19:43 [7684] bl460g1n6    pengine:     info: determine_online_status_fencing: 	Node bl460g1n8 is active
Oct 21 11:19:43 [7684] bl460g1n6    pengine:     info: determine_online_status: 	Node bl460g1n8 is online
Oct 21 11:19:43 [7684] bl460g1n6    pengine:     info: determine_online_status_fencing: 	Node bl460g1n6 is active
Oct 21 11:19:43 [7684] bl460g1n6    pengine:     info: determine_online_status: 	Node bl460g1n6 is online
Oct 21 11:19:43 [7684] bl460g1n6    pengine:     info: determine_online_status_fencing: 	Node bl460g1n7 is active
Oct 21 11:19:43 [7684] bl460g1n6    pengine:     info: determine_online_status: 	Node bl460g1n7 is online
Oct 21 11:19:43 [7684] bl460g1n6    pengine:   notice: stage6: 	Delaying fencing operations until there are resources to manage
Oct 21 11:19:43 [7684] bl460g1n6    pengine:    debug: get_last_sequence: 	Series file /var/lib/pacemaker/pengine/pe-input.last does not exist
Oct 21 11:19:43 [7684] bl460g1n6    pengine:   notice: process_pe_message: 	Calculated Transition 0: /var/lib/pacemaker/pengine/pe-input-0.bz2
Oct 21 11:19:43 [7684] bl460g1n6    pengine:   notice: process_pe_message: 	Configuration ERRORs found during PE processing.  Please run "crm_verify -L" to identify issues.
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_PE_SUCCESS: [ state=S_POLICY_ENGINE cause=C_IPC_MESSAGE origin=handle_response ]
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: do_state_transition: 	State transition S_POLICY_ENGINE -> S_TRANSITION_ENGINE [ input=I_PE_SUCCESS cause=C_IPC_MESSAGE origin=handle_response ]
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: unpack_graph: 	Unpacked transition 0: 3 actions in 3 synapses
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: do_te_invoke: 	Processing graph 0 (ref=pe_calc-dc-1382321983-15) derived from /var/lib/pacemaker/pengine/pe-input-0.bz2
Oct 21 11:19:43 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 4: probe_complete probe_complete on bl460g1n8 - no waiting
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: te_rsc_command: 	Action 4 confirmed - no wait
Oct 21 11:19:43 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 3: probe_complete probe_complete on bl460g1n7 - no waiting
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: te_rsc_command: 	Action 3 confirmed - no wait
Oct 21 11:19:43 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 2: probe_complete probe_complete on bl460g1n6 (local) - no waiting
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: attrd_update_delegate: 	Sent update: probe_complete=true for bl460g1n6
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: te_rsc_command: 	Action 2 confirmed - no wait
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 0 (Complete=0, Pending=0, Fired=3, Skipped=0, Incomplete=0, Source=/var/lib/pacemaker/pengine/pe-input-0.bz2): In-progress
Oct 21 11:19:43 [7683] bl460g1n6      attrd:     info: attrd_client_message: 	Broadcasting probe_complete[bl460g1n6] = true (writer)
Oct 21 11:19:43 [7685] bl460g1n6       crmd:   notice: run_graph: 	Transition 0 (Complete=3, Pending=0, Fired=0, Skipped=0, Incomplete=0, Source=/var/lib/pacemaker/pengine/pe-input-0.bz2): Complete
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: te_graph_trigger: 	Transition 0 is now complete
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: notify_crmd: 	Processing transition completion in state S_TRANSITION_ENGINE
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: notify_crmd: 	Transition 0 status: done - <null>
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_TE_SUCCESS: [ state=S_TRANSITION_ENGINE cause=C_FSA_INTERNAL origin=notify_crmd ]
Oct 21 11:19:43 [7685] bl460g1n6       crmd:     info: do_log: 	FSA: Input I_TE_SUCCESS from notify_crmd() received in state S_TRANSITION_ENGINE
Oct 21 11:19:43 [7685] bl460g1n6       crmd:   notice: do_state_transition: 	State transition S_TRANSITION_ENGINE -> S_IDLE [ input=I_TE_SUCCESS cause=C_FSA_INTERNAL origin=notify_crmd ]
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: do_state_transition: 	Starting PEngine Recheck Timer
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: crm_timer_start: 	Started PEngine Recheck Timer (I_PE_CALC:900000ms), src=40
Oct 21 11:19:43 [7683] bl460g1n6      attrd:    debug: write_attribute: 	Update: bl460g1n6[probe_complete]=true (3232261592 3232261592 3232261592 bl460g1n6)
Oct 21 11:19:43 [7683] bl460g1n6      attrd:   notice: write_attribute: 	Sent update 4 with 1 changes for probe_complete, id=<n/a>, set=(null)
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.5.6
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.5.7 56352c7523fc38e6f5fcc7bb0cb03e75
Oct 21 11:19:43 [7683] bl460g1n6      attrd:     info: write_attribute: 	Write out of probe_complete delayed: update 4 in progress
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="6"/>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="5" num_updates="7" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:19:43 2013" update-origin="bl460g1n6" update-client="crmd" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261592" uname="bl460g1n6" in_ccm="true" crmd="online" crm-debug-origin="do_state_transition" join="member" expected="member">
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <transient_attributes id="3232261592">
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <instance_attributes id="status-3232261592">
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.5.6 -> 0.5.7 (S_IDLE)
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <nvpair id="status-3232261592-probe_complete" name="probe_complete" value="true"/>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </instance_attributes>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </transient_attributes>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:19:43 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/attrd/4, version=0.5.7)
Oct 21 11:19:43 [7683] bl460g1n6      attrd:     info: attrd_cib_callback: 	Update 4 for probe_complete: OK (0)
Oct 21 11:19:43 [7683] bl460g1n6      attrd:   notice: attrd_cib_callback: 	Update 4 for probe_complete[bl460g1n6]=true: OK (0)
Oct 21 11:19:43 [7683] bl460g1n6      attrd:   notice: attrd_cib_callback: 	Update 4 for probe_complete[bl460g1n7]=(null): OK (0)
Oct 21 11:19:43 [7683] bl460g1n6      attrd:    debug: write_attribute: 	Update: bl460g1n6[probe_complete]=true (3232261592 3232261592 3232261592 bl460g1n6)
Oct 21 11:19:43 [7683] bl460g1n6      attrd:    debug: write_attribute: 	Update: bl460g1n7[probe_complete]=true (3232261593 3232261593 3232261593 bl460g1n7)
Oct 21 11:19:43 [7683] bl460g1n6      attrd:   notice: write_attribute: 	Sent update 5 with 2 changes for probe_complete, id=<n/a>, set=(null)
Oct 21 11:19:43 [7680] bl460g1n6        cib:    debug: write_cib_contents: 	Wrote digest 2b5d37bb9a92bb6b3485bb82e926f2e5 to disk
Oct 21 11:19:43 [7680] bl460g1n6        cib:     info: retrieveCib: 	Reading cluster configuration from: /var/lib/pacemaker/cib/cib.EtbOxm (digest: /var/lib/pacemaker/cib/cib.GRANI3)
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.5.7 -> 0.5.8 (S_IDLE)
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.5.7
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.5.8 2e8e6a4845b8234206443e8e55d350ca
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="7"/>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="5" num_updates="8" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:19:43 2013" update-origin="bl460g1n6" update-client="crmd" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:19:43 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/attrd/5, version=0.5.8)
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261593" uname="bl460g1n7" in_ccm="true" crmd="online" crm-debug-origin="do_state_transition" join="member" expected="member">
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <transient_attributes id="3232261593">
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <instance_attributes id="status-3232261593">
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <nvpair id="status-3232261593-probe_complete" name="probe_complete" value="true"/>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </instance_attributes>
Oct 21 11:19:43 [7683] bl460g1n6      attrd:     info: write_attribute: 	Write out of probe_complete delayed: update 5 in progress
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </transient_attributes>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:19:43 [7683] bl460g1n6      attrd:     info: attrd_cib_callback: 	Update 5 for probe_complete: OK (0)
Oct 21 11:19:43 [7683] bl460g1n6      attrd:   notice: attrd_cib_callback: 	Update 5 for probe_complete[bl460g1n6]=true: OK (0)
Oct 21 11:19:43 [7683] bl460g1n6      attrd:   notice: attrd_cib_callback: 	Update 5 for probe_complete[bl460g1n7]=true: OK (0)
Oct 21 11:19:43 [7683] bl460g1n6      attrd:   notice: attrd_cib_callback: 	Update 5 for probe_complete[bl460g1n8]=(null): OK (0)
Oct 21 11:19:43 [7683] bl460g1n6      attrd:    debug: write_attribute: 	Update: bl460g1n6[probe_complete]=true (3232261592 3232261592 3232261592 bl460g1n6)
Oct 21 11:19:43 [7683] bl460g1n6      attrd:    debug: write_attribute: 	Update: bl460g1n7[probe_complete]=true (3232261593 3232261593 3232261593 bl460g1n7)
Oct 21 11:19:43 [7683] bl460g1n6      attrd:    debug: write_attribute: 	Update: bl460g1n8[probe_complete]=true (3232261594 3232261594 3232261594 bl460g1n8)
Oct 21 11:19:43 [7683] bl460g1n6      attrd:   notice: write_attribute: 	Sent update 6 with 3 changes for probe_complete, id=<n/a>, set=(null)
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.5.8
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.5.9 3a5c983766230929c1d81f4f94841d9e
Oct 21 11:19:43 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.5.8 -> 0.5.9 (S_IDLE)
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="8"/>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="5" num_updates="9" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:19:43 2013" update-origin="bl460g1n6" update-client="crmd" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261594" uname="bl460g1n8" in_ccm="true" crmd="online" crm-debug-origin="do_state_transition" join="member" expected="member">
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <transient_attributes id="3232261594">
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <instance_attributes id="status-3232261594">
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <nvpair id="status-3232261594-probe_complete" name="probe_complete" value="true"/>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </instance_attributes>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </transient_attributes>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:19:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:19:43 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/attrd/6, version=0.5.9)
Oct 21 11:19:43 [7683] bl460g1n6      attrd:     info: attrd_cib_callback: 	Update 6 for probe_complete: OK (0)
Oct 21 11:19:43 [7683] bl460g1n6      attrd:   notice: attrd_cib_callback: 	Update 6 for probe_complete[bl460g1n6]=true: OK (0)
Oct 21 11:19:43 [7683] bl460g1n6      attrd:   notice: attrd_cib_callback: 	Update 6 for probe_complete[bl460g1n7]=true: OK (0)
Oct 21 11:19:43 [7683] bl460g1n6      attrd:   notice: attrd_cib_callback: 	Update 6 for probe_complete[bl460g1n8]=true: OK (0)
Oct 21 11:19:43 [7680] bl460g1n6        cib:    debug: write_cib_contents: 	Activating /var/lib/pacemaker/cib/cib.EtbOxm
Oct 21 11:19:43 [7680] bl460g1n6        cib:     info: write_cib_contents: 	Archived previous version as /var/lib/pacemaker/cib/cib-1.raw
Oct 21 11:19:43 [7680] bl460g1n6        cib:    debug: write_cib_contents: 	Writing CIB to disk
Oct 21 11:19:43 [7680] bl460g1n6        cib:     info: write_cib_contents: 	Wrote version 0.5.0 of the CIB to disk (digest: fd1c17f0935512546ae5c2e678df50ae)
Oct 21 11:19:43 [7680] bl460g1n6        cib:    debug: write_cib_contents: 	Wrote digest fd1c17f0935512546ae5c2e678df50ae to disk
Oct 21 11:19:43 [7680] bl460g1n6        cib:     info: retrieveCib: 	Reading cluster configuration from: /var/lib/pacemaker/cib/cib.FGLj8y (digest: /var/lib/pacemaker/cib/cib.Ud9hGg)
Oct 21 11:19:43 [7680] bl460g1n6        cib:    debug: write_cib_contents: 	Activating /var/lib/pacemaker/cib/cib.FGLj8y
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: crm_client_new: 	Connecting 0xbe7fc0 for uid=0 gid=0 pid=7733 id=1704fd54-88ef-40a9-a06d-66bcdaa4dcbb
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: handle_new_connection: 	IPC credentials authenticated (7680-7733-14)
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: qb_ipcs_shm_connect: 	connecting to client [7733]
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/cibadmin/2, version=0.5.9)
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: qb_ipcs_dispatch_connection_request: 	HUP conn (7680-7733-14)
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(7680-7733-14) state:2
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: crm_client_destroy: 	Destroying 0 events
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-7680-7733-14-header
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-7680-7733-14-header
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-7680-7733-14-header
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: crm_client_new: 	Connecting 0xbe7fc0 for uid=0 gid=0 pid=7734 id=a76e8b6f-82be-442c-84bc-1e258f429409
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: handle_new_connection: 	IPC credentials authenticated (7680-7734-14)
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: qb_ipcs_shm_connect: 	connecting to client [7734]
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/cibadmin/2, version=0.5.9)
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: qb_ipcs_dispatch_connection_request: 	HUP conn (7680-7734-14)
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(7680-7734-14) state:2
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: crm_client_destroy: 	Destroying 0 events
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-7680-7734-14-header
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-7680-7734-14-header
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-7680-7734-14-header
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: crm_client_new: 	Connecting 0xbe7fc0 for uid=0 gid=0 pid=7782 id=81d456dc-8fc6-4d67-a104-7752b929a454
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: handle_new_connection: 	IPC credentials authenticated (7680-7782-14)
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: qb_ipcs_shm_connect: 	connecting to client [7782]
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: activateCibXml: 	Triggering CIB write for cib_replace op
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.5.9
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.6.1 65395a06b7ce78fecc54a58606f6d0f6
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_replace): 0.5.9 -> 0.6.1 (S_IDLE)
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-  <cib admin_epoch="0" epoch="5" num_updates="9">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-    <configuration>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-      <crm_config>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-        <cluster_property_set id="cib-bootstrap-options">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--         <nvpair id="cib-bootstrap-options-dc-version" name="dc-version" value="1.1.11-0.302.b6d42ed.git.el6-b6d42ed"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--         <nvpair id="cib-bootstrap-options-cluster-infrastructure" name="cluster-infrastructure" value="corosync"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-        </cluster_property_set>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-      </crm_config>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-    </configuration>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-  </cib>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: abort_transition_graph: 	te_update_diff:126 - Triggered transition abort (complete=1, node=, tag=diff, id=(null), magic=NA, cib=0.6.1) : Non-status change
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="6" num_updates="1" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:18 2013" update-origin="bl460g1n6" update-client="cibadmin" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <configuration>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <crm_config>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <cluster_property_set id="cib-bootstrap-options">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause   <diff crm_feature_set="3.0.7" digest="65395a06b7ce78fecc54a58606f6d0f6">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <nvpair name="no-quorum-policy" value="freeze" id="cib-bootstrap-options-no-quorum-policy"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause     <diff-removed admin_epoch="0" epoch="5" num_updates="9">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <nvpair name="stonith-enabled" value="true" id="cib-bootstrap-options-stonith-enabled"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause       <cib admin_epoch="0" epoch="5" num_updates="9">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <nvpair name="startup-fencing" value="false" id="cib-bootstrap-options-startup-fencing"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause         <configuration>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <nvpair name="crmd-transition-delay" value="2s" id="cib-bootstrap-options-crmd-transition-delay"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause           <crm_config>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </cluster_property_set>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             <cluster_property_set id="cib-bootstrap-options">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </crm_config>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               <nvpair id="cib-bootstrap-options-dc-version" name="dc-version" value="1.1.11-0.302.b6d42ed.git.el6-b6d42ed" __crm_diff_marker__="removed:top"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <resources>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               <nvpair id="cib-bootstrap-options-cluster-infrastructure" name="cluster-infrastructure" value="corosync" __crm_diff_marker__="removed:top"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       <primitive id="prmVM1" class="ocf" provider="heartbeat" type="VirtualDomain">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             </cluster_property_set>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <instance_attributes id="prmVM1-instance_attributes">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause           </crm_config>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <nvpair name="config" value="/etc/libvirt/qemu/vm1.xml" id="prmVM1-instance_attributes-config"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause         </configuration>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <nvpair name="hypervisor" value="qemu:///system" id="prmVM1-instance_attributes-hypervisor"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause       </cib>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <nvpair name="migration_transport" value="ssh" id="prmVM1-instance_attributes-migration_transport"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause     </diff-removed>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         </instance_attributes>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause     <diff-added>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <meta_attributes id="prmVM1-meta_attributes">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause       <cib epoch="6" num_updates="1" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:18 2013" update-origin="bl460g1n6" update-client="cibadmin" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <nvpair name="allow-migrate" value="true" id="prmVM1-meta_attributes-allow-migrate"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause         <configuration>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         </meta_attributes>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause           <crm_config>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <operations>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             <cluster_property_set id="cib-bootstrap-options">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <op name="start" interval="0s" timeout="120s" on-fail="restart" id="prmVM1-start-0s"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               <nvpair name="no-quorum-policy" value="freeze" id="cib-bootstrap-options-no-quorum-policy" __crm_diff_marker__="added:top"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <op name="monitor" interval="10s" timeout="30s" on-fail="restart" id="prmVM1-monitor-10s"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               <nvpair name="stonith-enabled" value="true" id="cib-bootstrap-options-stonith-enabled" __crm_diff_marker__="added:top"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <op name="stop" interval="0s" timeout="120s" on-fail="fence" id="prmVM1-stop-0s"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               <nvpair name="startup-fencing" value="false" id="cib-bootstrap-options-startup-fencing" __crm_diff_marker__="added:top"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <op name="migrate_to" interval="0s" timeout="120s" on-fail="restart" id="prmVM1-migrate_to-0s"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               <nvpair name="crmd-transition-delay" value="2s" id="cib-bootstrap-options-crmd-transition-delay" __crm_diff_marker__="added:top"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <op name="migrate_from" interval="0s" timeout="120s" on-fail="restart" id="prmVM1-migrate_from-0s"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             </cluster_property_set>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         </operations>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause           </crm_config>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       </primitive>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause           <resources>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       <primitive id="prmVM2" class="ocf" provider="heartbeat" type="VirtualDomain">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             <primitive id="prmVM1" class="ocf" provider="heartbeat" type="VirtualDomain" __crm_diff_marker__="added:top">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <instance_attributes id="prmVM2-instance_attributes">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               <instance_attributes id="prmVM1-instance_attributes">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <nvpair name="config" value="/etc/libvirt/qemu/vm2.xml" id="prmVM2-instance_attributes-config"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <nvpair name="config" value="/etc/libvirt/qemu/vm1.xml" id="prmVM1-instance_attributes-config"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <nvpair name="hypervisor" value="qemu:///system" id="prmVM2-instance_attributes-hypervisor"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <nvpair name="hypervisor" value="qemu:///system" id="prmVM1-instance_attributes-hypervisor"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <nvpair name="migration_transport" value="ssh" id="prmVM2-instance_attributes-migration_transport"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <nvpair name="migration_transport" value="ssh" id="prmVM1-instance_attributes-migration_transport"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         </instance_attributes>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               </instance_attributes>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <meta_attributes id="prmVM2-meta_attributes">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               <meta_attributes id="prmVM1-meta_attributes">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <nvpair name="allow-migrate" value="true" id="prmVM2-meta_attributes-allow-migrate"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <nvpair name="allow-migrate" value="true" id="prmVM1-meta_attributes-allow-migrate"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         </meta_attributes>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               </meta_attributes>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <operations>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               <operations>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <op name="start" interval="0s" timeout="120s" on-fail="restart" id="prmVM2-start-0s"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <op name="start" interval="0s" timeout="120s" on-fail="restart" id="prmVM1-start-0s"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <op name="monitor" interval="10s" timeout="30s" on-fail="restart" id="prmVM2-monitor-10s"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <op name="monitor" interval="10s" timeout="30s" on-fail="restart" id="prmVM1-monitor-10s"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <op name="stop" interval="0s" timeout="120s" on-fail="fence" id="prmVM2-stop-0s"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <op name="stop" interval="0s" timeout="120s" on-fail="fence" id="prmVM1-stop-0s"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <op name="migrate_to" interval="0s" timeout="120s" on-fail="restart" id="prmVM2-migrate_to-0s"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <op name="migrate_to" interval="0s" timeout="120s" on-fail="restart" id="prmVM1-migrate_to-0s"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <op name="migrate_from" interval="0s" timeout="120s" on-fail="restart" id="prmVM2-migrate_from-0s"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <op name="migrate_from" interval="0s" timeout="120s" on-fail="restart" id="prmVM1-migrate_from-0s"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         </operations>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               </operations>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       </primitive>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             </primitive>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       <primitive id="prmVM3" class="ocf" provider="heartbeat" type="VirtualDomain">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             <primitive id="prmVM2" class="ocf" provider="heartbeat" type="VirtualDomain" __crm_diff_marker__="added:top">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <instance_attributes id="prmVM3-instance_attributes">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               <instance_attributes id="prmVM2-instance_attributes">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <nvpair name="config" value="/etc/libvirt/qemu/vm3.xml" id="prmVM3-instance_attributes-config"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <nvpair name="config" value="/etc/libvirt/qemu/vm2.xml" id="prmVM2-instance_attributes-config"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <nvpair name="hypervisor" value="qemu:///system" id="prmVM3-instance_attributes-hypervisor"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <nvpair name="hypervisor" value="qemu:///system" id="prmVM2-instance_attributes-hypervisor"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <nvpair name="migration_transport" value="ssh" id="prmVM3-instance_attributes-migration_transport"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <nvpair name="migration_transport" value="ssh" id="prmVM2-instance_attributes-migration_transport"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         </instance_attributes>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               </instance_attributes>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <meta_attributes id="prmVM3-meta_attributes">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               <meta_attributes id="prmVM2-meta_attributes">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <nvpair name="allow-migrate" value="true" id="prmVM3-meta_attributes-allow-migrate"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <nvpair name="allow-migrate" value="true" id="prmVM2-meta_attributes-allow-migrate"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         </meta_attributes>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               </meta_attributes>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <operations>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               <operations>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <op name="start" interval="0s" timeout="120s" on-fail="restart" id="prmVM3-start-0s"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <op name="start" interval="0s" timeout="120s" on-fail="restart" id="prmVM2-start-0s"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <op name="monitor" interval="10s" timeout="30s" on-fail="restart" id="prmVM3-monitor-10s"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <op name="monitor" interval="10s" timeout="30s" on-fail="restart" id="prmVM2-monitor-10s"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <op name="stop" interval="0s" timeout="120s" on-fail="fence" id="prmVM3-stop-0s"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <op name="stop" interval="0s" timeout="120s" on-fail="fence" id="prmVM2-stop-0s"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <op name="migrate_to" interval="0s" timeout="120s" on-fail="restart" id="prmVM3-migrate_to-0s"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <op name="migrate_to" interval="0s" timeout="120s" on-fail="restart" id="prmVM2-migrate_to-0s"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <op name="migrate_from" interval="0s" timeout="120s" on-fail="restart" id="prmVM3-migrate_from-0s"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <op name="migrate_from" interval="0s" timeout="120s" on-fail="restart" id="prmVM2-migrate_from-0s"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         </operations>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               </operations>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       </primitive>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             </primitive>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       <group id="grpStonith6">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             <primitive id="prmVM3" class="ocf" provider="heartbeat" type="VirtualDomain" __crm_diff_marker__="added:top">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <primitive id="prmStonith6-1" class="stonith" type="external/ipmi">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               <instance_attributes id="prmVM3-instance_attributes">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <instance_attributes id="prmStonith6-1-instance_attributes">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <nvpair name="config" value="/etc/libvirt/qemu/vm3.xml" id="prmVM3-instance_attributes-config"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <nvpair name="pcmk_reboot_retries" value="1" id="prmStonith6-1-instance_attributes-pcmk_reboot_retries"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <nvpair name="hypervisor" value="qemu:///system" id="prmVM3-instance_attributes-hypervisor"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <nvpair name="pcmk_reboot_timeout" value="60s" id="prmStonith6-1-instance_attributes-pcmk_reboot_timeout"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <nvpair name="migration_transport" value="ssh" id="prmVM3-instance_attributes-migration_transport"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <nvpair name="hostname" value="bl460g1n6" id="prmStonith6-1-instance_attributes-hostname"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               </instance_attributes>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <nvpair name="ipaddr" value="192.168.133.236" id="prmStonith6-1-instance_attributes-ipaddr"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               <meta_attributes id="prmVM3-meta_attributes">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <nvpair name="userid" value="USERID" id="prmStonith6-1-instance_attributes-userid"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <nvpair name="allow-migrate" value="true" id="prmVM3-meta_attributes-allow-migrate"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <nvpair name="passwd" value="PASSW0RD" id="prmStonith6-1-instance_attributes-passwd"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               </meta_attributes>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <nvpair name="interface" value="lanplus" id="prmStonith6-1-instance_attributes-interface"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               <operations>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </instance_attributes>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <op name="start" interval="0s" timeout="120s" on-fail="restart" id="prmVM3-start-0s"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <operations>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <op name="monitor" interval="10s" timeout="30s" on-fail="restart" id="prmVM3-monitor-10s"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <op name="start" interval="0s" timeout="60s" id="prmStonith6-1-start-0s"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <op name="stop" interval="0s" timeout="120s" on-fail="fence" id="prmVM3-stop-0s"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <op name="monitor" interval="360s" timeout="60s" id="prmStonith6-1-monitor-360s"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <op name="migrate_to" interval="0s" timeout="120s" on-fail="restart" id="prmVM3-migrate_to-0s"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <op name="stop" interval="0s" timeout="60s" id="prmStonith6-1-stop-0s"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <op name="migrate_from" interval="0s" timeout="120s" on-fail="restart" id="prmVM3-migrate_from-0s"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </operations>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               </operations>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         </primitive>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             </primitive>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <primitive id="prmStonith6-2" class="stonith" type="external/ssh">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             <group id="grpStonith6" __crm_diff_marker__="added:top">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <instance_attributes id="prmStonith6-2-instance_attributes">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               <primitive id="prmStonith6-1" class="stonith" type="external/ipmi">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <nvpair name="pcmk_reboot_retries" value="1" id="prmStonith6-2-instance_attributes-pcmk_reboot_retries"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <instance_attributes id="prmStonith6-1-instance_attributes">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <nvpair name="pcmk_reboot_timeout" value="40s" id="prmStonith6-2-instance_attributes-pcmk_reboot_timeout"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <nvpair name="pcmk_reboot_retries" value="1" id="prmStonith6-1-instance_attributes-pcmk_reboot_retries"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <nvpair name="hostlist" value="bl460g1n6" id="prmStonith6-2-instance_attributes-hostlist"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <nvpair name="pcmk_reboot_timeout" value="60s" id="prmStonith6-1-instance_attributes-pcmk_reboot_timeout"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </instance_attributes>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <nvpair name="hostname" value="bl460g1n6" id="prmStonith6-1-instance_attributes-hostname"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <operations>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <nvpair name="ipaddr" value="192.168.133.236" id="prmStonith6-1-instance_attributes-ipaddr"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <op name="start" interval="0s" timeout="60s" id="prmStonith6-2-start-0s"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <nvpair name="userid" value="USERID" id="prmStonith6-1-instance_attributes-userid"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <op name="monitor" interval="10s" timeout="60s" id="prmStonith6-2-monitor-10s"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <nvpair name="passwd" value="PASSW0RD" id="prmStonith6-1-instance_attributes-passwd"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <op name="stop" interval="0s" timeout="60s" id="prmStonith6-2-stop-0s"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <nvpair name="interface" value="lanplus" id="prmStonith6-1-instance_attributes-interface"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </operations>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 </instance_attributes>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         </primitive>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <operations>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       </group>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <op name="start" interval="0s" timeout="60s" id="prmStonith6-1-start-0s"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       <group id="grpStonith7">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <op name="monitor" interval="360s" timeout="60s" id="prmStonith6-1-monitor-360s"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <primitive id="prmStonith7-1" class="stonith" type="external/ipmi">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <op name="stop" interval="0s" timeout="60s" id="prmStonith6-1-stop-0s"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <instance_attributes id="prmStonith7-1-instance_attributes">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 </operations>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <nvpair name="pcmk_reboot_retries" value="1" id="prmStonith7-1-instance_attributes-pcmk_reboot_retries"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               </primitive>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <nvpair name="pcmk_reboot_timeout" value="60s" id="prmStonith7-1-instance_attributes-pcmk_reboot_timeout"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               <primitive id="prmStonith6-2" class="stonith" type="external/ssh">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <nvpair name="hostname" value="bl460g1n7" id="prmStonith7-1-instance_attributes-hostname"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <instance_attributes id="prmStonith6-2-instance_attributes">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <nvpair name="ipaddr" value="192.168.133.237" id="prmStonith7-1-instance_attributes-ipaddr"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <nvpair name="pcmk_reboot_retries" value="1" id="prmStonith6-2-instance_attributes-pcmk_reboot_retries"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <nvpair name="userid" value="USERID" id="prmStonith7-1-instance_attributes-userid"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <nvpair name="pcmk_reboot_timeout" value="40s" id="prmStonith6-2-instance_attributes-pcmk_reboot_timeout"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <nvpair name="passwd" value="PASSW0RD" id="prmStonith7-1-instance_attributes-passwd"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <nvpair name="hostlist" value="bl460g1n6" id="prmStonith6-2-instance_attributes-hostlist"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <nvpair name="interface" value="lanplus" id="prmStonith7-1-instance_attributes-interface"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 </instance_attributes>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </instance_attributes>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <operations>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <operations>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <op name="start" interval="0s" timeout="60s" id="prmStonith6-2-start-0s"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <op name="start" interval="0s" timeout="60s" id="prmStonith7-1-start-0s"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <op name="monitor" interval="10s" timeout="60s" id="prmStonith6-2-monitor-10s"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <op name="monitor" interval="360s" timeout="60s" id="prmStonith7-1-monitor-360s"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <op name="stop" interval="0s" timeout="60s" id="prmStonith6-2-stop-0s"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <op name="stop" interval="0s" timeout="60s" id="prmStonith7-1-stop-0s"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 </operations>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </operations>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               </primitive>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         </primitive>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             </group>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <primitive id="prmStonith7-2" class="stonith" type="external/ssh">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             <group id="grpStonith7" __crm_diff_marker__="added:top">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <instance_attributes id="prmStonith7-2-instance_attributes">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               <primitive id="prmStonith7-1" class="stonith" type="external/ipmi">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <nvpair name="pcmk_reboot_retries" value="1" id="prmStonith7-2-instance_attributes-pcmk_reboot_retries"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <instance_attributes id="prmStonith7-1-instance_attributes">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <nvpair name="pcmk_reboot_timeout" value="40s" id="prmStonith7-2-instance_attributes-pcmk_reboot_timeout"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <nvpair name="pcmk_reboot_retries" value="1" id="prmStonith7-1-instance_attributes-pcmk_reboot_retries"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <nvpair name="hostlist" value="bl460g1n7" id="prmStonith7-2-instance_attributes-hostlist"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <nvpair name="pcmk_reboot_timeout" value="60s" id="prmStonith7-1-instance_attributes-pcmk_reboot_timeout"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </instance_attributes>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <nvpair name="hostname" value="bl460g1n7" id="prmStonith7-1-instance_attributes-hostname"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <operations>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <nvpair name="ipaddr" value="192.168.133.237" id="prmStonith7-1-instance_attributes-ipaddr"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <op name="start" interval="0s" timeout="60s" id="prmStonith7-2-start-0s"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <nvpair name="userid" value="USERID" id="prmStonith7-1-instance_attributes-userid"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <op name="monitor" interval="10s" timeout="60s" id="prmStonith7-2-monitor-10s"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <nvpair name="passwd" value="PASSW0RD" id="prmStonith7-1-instance_attributes-passwd"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <op name="stop" interval="0s" timeout="60s" id="prmStonith7-2-stop-0s"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <nvpair name="interface" value="lanplus" id="prmStonith7-1-instance_attributes-interface"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </operations>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 </instance_attributes>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         </primitive>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <operations>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       </group>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <op name="start" interval="0s" timeout="60s" id="prmStonith7-1-start-0s"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       <group id="grpStonith8">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <op name="monitor" interval="360s" timeout="60s" id="prmStonith7-1-monitor-360s"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <primitive id="prmStonith8-1" class="stonith" type="external/ipmi">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <op name="stop" interval="0s" timeout="60s" id="prmStonith7-1-stop-0s"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <instance_attributes id="prmStonith8-1-instance_attributes">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 </operations>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <nvpair name="pcmk_reboot_retries" value="1" id="prmStonith8-1-instance_attributes-pcmk_reboot_retries"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               </primitive>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <nvpair name="pcmk_reboot_timeout" value="60s" id="prmStonith8-1-instance_attributes-pcmk_reboot_timeout"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <nvpair name="hostname" value="bl460g1n8" id="prmStonith8-1-instance_attributes-hostname"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               <primitive id="prmStonith7-2" class="stonith" type="external/ssh">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <nvpair name="ipaddr" value="192.168.133.238" id="prmStonith8-1-instance_attributes-ipaddr"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <instance_attributes id="prmStonith7-2-instance_attributes">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <nvpair name="userid" value="USERID" id="prmStonith8-1-instance_attributes-userid"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <nvpair name="pcmk_reboot_retries" value="1" id="prmStonith7-2-instance_attributes-pcmk_reboot_retries"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <nvpair name="passwd" value="PASSW0RD" id="prmStonith8-1-instance_attributes-passwd"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <nvpair name="pcmk_reboot_timeout" value="40s" id="prmStonith7-2-instance_attributes-pcmk_reboot_timeout"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <nvpair name="interface" value="lanplus" id="prmStonith8-1-instance_attributes-interface"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <nvpair name="hostlist" value="bl460g1n7" id="prmStonith7-2-instance_attributes-hostlist"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </instance_attributes>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 </instance_attributes>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <operations>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <operations>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <op name="start" interval="0s" timeout="60s" id="prmStonith8-1-start-0s"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <op name="start" interval="0s" timeout="60s" id="prmStonith7-2-start-0s"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <op name="monitor" interval="360s" timeout="60s" id="prmStonith8-1-monitor-360s"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <op name="monitor" interval="10s" timeout="60s" id="prmStonith7-2-monitor-10s"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <op name="stop" interval="0s" timeout="60s" id="prmStonith8-1-stop-0s"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <op name="stop" interval="0s" timeout="60s" id="prmStonith7-2-stop-0s"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </operations>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 </operations>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         </primitive>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               </primitive>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <primitive id="prmStonith8-2" class="stonith" type="external/ssh">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             </group>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <instance_attributes id="prmStonith8-2-instance_attributes">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             <group id="grpStonith8" __crm_diff_marker__="added:top">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <nvpair name="pcmk_reboot_retries" value="1" id="prmStonith8-2-instance_attributes-pcmk_reboot_retries"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               <primitive id="prmStonith8-1" class="stonith" type="external/ipmi">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <nvpair name="pcmk_reboot_timeout" value="40s" id="prmStonith8-2-instance_attributes-pcmk_reboot_timeout"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <instance_attributes id="prmStonith8-1-instance_attributes">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <nvpair name="hostlist" value="bl460g1n8" id="prmStonith8-2-instance_attributes-hostlist"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <nvpair name="pcmk_reboot_retries" value="1" id="prmStonith8-1-instance_attributes-pcmk_reboot_retries"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </instance_attributes>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <nvpair name="pcmk_reboot_timeout" value="60s" id="prmStonith8-1-instance_attributes-pcmk_reboot_timeout"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <operations>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <nvpair name="hostname" value="bl460g1n8" id="prmStonith8-1-instance_attributes-hostname"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <op name="start" interval="0s" timeout="60s" id="prmStonith8-2-start-0s"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <nvpair name="ipaddr" value="192.168.133.238" id="prmStonith8-1-instance_attributes-ipaddr"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <op name="monitor" interval="10s" timeout="60s" id="prmStonith8-2-monitor-10s"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <nvpair name="userid" value="USERID" id="prmStonith8-1-instance_attributes-userid"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <op name="stop" interval="0s" timeout="60s" id="prmStonith8-2-stop-0s"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <nvpair name="passwd" value="PASSW0RD" id="prmStonith8-1-instance_attributes-passwd"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </operations>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <nvpair name="interface" value="lanplus" id="prmStonith8-1-instance_attributes-interface"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         </primitive>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 </instance_attributes>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       </group>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <operations>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       <clone id="clnPing">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <op name="start" interval="0s" timeout="60s" id="prmStonith8-1-start-0s"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <primitive id="prmPing" class="ocf" provider="pacemaker" type="ping">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <op name="monitor" interval="360s" timeout="60s" id="prmStonith8-1-monitor-360s"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <instance_attributes id="prmPing-instance_attributes">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <op name="stop" interval="0s" timeout="60s" id="prmStonith8-1-stop-0s"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <nvpair name="name" value="default_ping_set" id="prmPing-instance_attributes-name"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 </operations>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <nvpair name="host_list" value="192.168.201.254" id="prmPing-instance_attributes-host_list"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               </primitive>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <nvpair name="multiplier" value="100" id="prmPing-instance_attributes-multiplier"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               <primitive id="prmStonith8-2" class="stonith" type="external/ssh">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <nvpair name="attempts" value="2" id="prmPing-instance_attributes-attempts"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <instance_attributes id="prmStonith8-2-instance_attributes">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <nvpair name="timeout" value="2" id="prmPing-instance_attributes-timeout"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <nvpair name="pcmk_reboot_retries" value="1" id="prmStonith8-2-instance_attributes-pcmk_reboot_retries"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </instance_attributes>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <nvpair name="pcmk_reboot_timeout" value="40s" id="prmStonith8-2-instance_attributes-pcmk_reboot_timeout"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <operations>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <nvpair name="hostlist" value="bl460g1n8" id="prmStonith8-2-instance_attributes-hostlist"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <op name="start" interval="0s" timeout="60s" on-fail="restart" id="prmPing-start-0s"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 </instance_attributes>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <op name="monitor" interval="10s" timeout="60s" on-fail="restart" id="prmPing-monitor-10s"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <operations>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <op name="stop" interval="0s" timeout="60s" on-fail="ignore" id="prmPing-stop-0s"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <op name="start" interval="0s" timeout="60s" id="prmStonith8-2-start-0s"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </operations>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         </primitive>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <op name="monitor" interval="10s" timeout="60s" id="prmStonith8-2-monitor-10s"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       </clone>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <op name="stop" interval="0s" timeout="60s" id="prmStonith8-2-stop-0s"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </resources>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 </operations>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <constraints>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               </primitive>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       <rsc_location id="l1" rsc="prmVM1">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             </group>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <!--#  rule -inf: not_defined default_ping_set or default_ping_set lt 100-->
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             <clone id="clnPing" __crm_diff_marker__="added:top">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <rule score="300" id="l1-rule">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               <primitive id="prmPing" class="ocf" provider="pacemaker" type="ping">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <expression attribute="#uname" operation="eq" value="bl460g1n6" id="l1-expression"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <instance_attributes id="prmPing-instance_attributes">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         </rule>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <nvpair name="name" value="default_ping_set" id="prmPing-instance_attributes-name"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <rule score="200" id="l1-rule-0">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <nvpair name="host_list" value="192.168.201.254" id="prmPing-instance_attributes-host_list"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <expression attribute="#uname" operation="eq" value="bl460g1n7" id="l1-expression-0"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <nvpair name="multiplier" value="100" id="prmPing-instance_attributes-multiplier"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         </rule>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <nvpair name="attempts" value="2" id="prmPing-instance_attributes-attempts"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <rule score="100" id="l1-rule-1">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <nvpair name="timeout" value="2" id="prmPing-instance_attributes-timeout"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <expression attribute="#uname" operation="eq" value="bl460g1n8" id="l1-expression-1"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 </instance_attributes>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         </rule>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <operations>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <rule score="-INFINITY" id="l1-rule-2">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <op name="start" interval="0s" timeout="60s" on-fail="restart" id="prmPing-start-0s"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <expression attribute="default_ping_set" operation="lt" value="100" id="l1-expression-2"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         </rule>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <op name="monitor" interval="10s" timeout="60s" on-fail="restart" id="prmPing-monitor-10s"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       </rsc_location>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                   <op name="stop" interval="0s" timeout="60s" on-fail="ignore" id="prmPing-stop-0s"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 </operations>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       <rsc_location id="l2" rsc="prmVM2">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               </primitive>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <rule score="300" id="l2-rule">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             </clone>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <expression attribute="#uname" operation="eq" value="bl460g1n6" id="l2-expression"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause           </resources>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         </rule>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause           <constraints>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <rule score="200" id="l2-rule-0">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             <rsc_location id="l1" rsc="prmVM1" __crm_diff_marker__="added:top">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <expression attribute="#uname" operation="eq" value="bl460g1n7" id="l2-expression-0"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               <!--#  rule -inf: not_defined default_ping_set or default_ping_set lt 100-->
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         </rule>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               <rule score="300" id="l1-rule">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <rule score="100" id="l2-rule-1">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <expression attribute="#uname" operation="eq" value="bl460g1n6" id="l1-expression"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <expression attribute="#uname" operation="eq" value="bl460g1n8" id="l2-expression-1"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               </rule>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         </rule>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               <rule score="200" id="l1-rule-0">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <rule score="-INFINITY" id="l2-rule-2">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <expression attribute="#uname" operation="eq" value="bl460g1n7" id="l1-expression-0"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <expression attribute="default_ping_set" operation="lt" value="100" id="l2-expression-2"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               </rule>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         </rule>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               <rule score="100" id="l1-rule-1">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       </rsc_location>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <expression attribute="#uname" operation="eq" value="bl460g1n8" id="l1-expression-1"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       <rsc_location id="l3" rsc="prmVM3">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               </rule>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <rule score="300" id="l3-rule">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               <rule score="-INFINITY" id="l1-rule-2">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <expression attribute="#uname" operation="eq" value="bl460g1n6" id="l3-expression"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <expression attribute="default_ping_set" operation="lt" value="100" id="l1-expression-2"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         </rule>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               </rule>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <rule score="200" id="l3-rule-0">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <expression attribute="#uname" operation="eq" value="bl460g1n7" id="l3-expression-0"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             </rsc_location>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         </rule>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             <rsc_location id="l2" rsc="prmVM2" __crm_diff_marker__="added:top">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <rule score="100" id="l3-rule-1">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               <rule score="300" id="l2-rule">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <expression attribute="#uname" operation="eq" value="bl460g1n8" id="l3-expression-1"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <expression attribute="#uname" operation="eq" value="bl460g1n6" id="l2-expression"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         </rule>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               </rule>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <rule score="-INFINITY" id="l3-rule-2">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               <rule score="200" id="l2-rule-0">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <expression attribute="default_ping_set" operation="lt" value="100" id="l3-expression-2"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <expression attribute="#uname" operation="eq" value="bl460g1n7" id="l2-expression-0"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         </rule>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               </rule>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       </rsc_location>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               <rule score="100" id="l2-rule-1">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       <rsc_location id="lo6" rsc="grpStonith6">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <expression attribute="#uname" operation="eq" value="bl460g1n8" id="l2-expression-1"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <rule score="-INFINITY" id="lo6-rule">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               </rule>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <expression attribute="#uname" operation="eq" value="bl460g1n6" id="lo6-expression"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               <rule score="-INFINITY" id="l2-rule-2">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         </rule>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <expression attribute="default_ping_set" operation="lt" value="100" id="l2-expression-2"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       </rsc_location>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               </rule>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       <rsc_location id="lo7" rsc="grpStonith7">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             </rsc_location>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <rule score="-INFINITY" id="lo7-rule">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             <rsc_location id="l3" rsc="prmVM3" __crm_diff_marker__="added:top">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <expression attribute="#uname" operation="eq" value="bl460g1n7" id="lo7-expression"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               <rule score="300" id="l3-rule">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         </rule>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <expression attribute="#uname" operation="eq" value="bl460g1n6" id="l3-expression"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       </rsc_location>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               </rule>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       <rsc_location id="lo8" rsc="grpStonith8">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               <rule score="200" id="l3-rule-0">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <rule score="-INFINITY" id="lo8-rule">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <expression attribute="#uname" operation="eq" value="bl460g1n7" id="l3-expression-0"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <expression attribute="#uname" operation="eq" value="bl460g1n8" id="lo8-expression"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               </rule>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         </rule>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               <rule score="100" id="l3-rule-1">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       </rsc_location>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <expression attribute="#uname" operation="eq" value="bl460g1n8" id="l3-expression-1"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       <rsc_colocation id="c1" score="INFINITY" rsc="prmVM1" with-rsc="clnPing"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               </rule>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       <rsc_colocation id="c4" score="INFINITY" rsc="prmVM2" with-rsc="clnPing"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               <rule score="-INFINITY" id="l3-rule-2">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       <rsc_colocation id="c7" score="INFINITY" rsc="prmVM3" with-rsc="clnPing"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <expression attribute="default_ping_set" operation="lt" value="100" id="l3-expression-2"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       <rsc_order id="o1" score="0" first="clnPing" then="prmVM1" symmetrical="false"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               </rule>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       <rsc_order id="o4" score="0" first="clnPing" then="prmVM2" symmetrical="false"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             </rsc_location>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       <rsc_order id="o7" score="0" first="clnPing" then="prmVM3" symmetrical="false"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             <rsc_location id="lo6" rsc="grpStonith6" __crm_diff_marker__="added:top">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </constraints>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               <rule score="-INFINITY" id="lo6-rule">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++     <fencing-topology>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <expression attribute="#uname" operation="eq" value="bl460g1n6" id="lo6-expression"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       <fencing-level target="bl460g1n6" devices="prmStonith6-1" index="1" id="fencing"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               </rule>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       <fencing-level target="bl460g1n6" devices="prmStonith6-2" index="2" id="fencing-0"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             </rsc_location>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       <fencing-level target="bl460g1n7" devices="prmStonith7-1" index="1" id="fencing-1"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             <rsc_location id="lo7" rsc="grpStonith7" __crm_diff_marker__="added:top">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       <fencing-level target="bl460g1n7" devices="prmStonith7-2" index="2" id="fencing-2"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               <rule score="-INFINITY" id="lo7-rule">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       <fencing-level target="bl460g1n8" devices="prmStonith8-1" index="1" id="fencing-3"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <expression attribute="#uname" operation="eq" value="bl460g1n7" id="lo7-expression"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       <fencing-level target="bl460g1n8" devices="prmStonith8-2" index="2" id="fencing-4"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               </rule>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++     </fencing-topology>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             </rsc_location>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++     <rsc_defaults>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             <rsc_location id="lo8" rsc="grpStonith8" __crm_diff_marker__="added:top">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       <meta_attributes id="rsc-options">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               <rule score="-INFINITY" id="lo8-rule">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <nvpair name="resource-stickiness" value="INFINITY" id="rsc-options-resource-stickiness"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <expression attribute="#uname" operation="eq" value="bl460g1n8" id="lo8-expression"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <nvpair name="migration-threshold" value="1" id="rsc-options-migration-threshold"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               </rule>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       </meta_attributes>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             </rsc_location>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++     </rsc_defaults>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </configuration>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             <rsc_colocation id="c1" score="INFINITY" rsc="prmVM1" with-rsc="clnPing" __crm_diff_marker__="added:top"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             <rsc_colocation id="c4" score="INFINITY" rsc="prmVM2" with-rsc="clnPing" __crm_diff_marker__="added:top"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             <rsc_colocation id="c7" score="INFINITY" rsc="prmVM3" with-rsc="clnPing" __crm_diff_marker__="added:top"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             <rsc_order id="o1" score="0" first="clnPing" then="prmVM1" symmetrical="false" __crm_diff_marker__="added:top"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             <rsc_order id="o4" score="0" first="clnPing" then="prmVM2" symmetrical="false" __crm_diff_marker__="added:top"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             <rsc_order id="o7" score="0" first="clnPing" then="prmVM3" symmetrical="false" __crm_diff_marker__="added:top"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause           </constraints>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause           <fencing-topology __crm_diff_marker__="added:top">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             <fencing-level target="bl460g1n6" devices="prmStonith6-1" index="1" id="fencing"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             <fencing-level target="bl460g1n6" devices="prmStonith6-2" index="2" id="fencing-0"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             <fencing-level target="bl460g1n7" devices="prmStonith7-1" index="1" id="fencing-1"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             <fencing-level target="bl460g1n7" devices="prmStonith7-2" index="2" id="fencing-2"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             <fencing-level target="bl460g1n8" devices="prmStonith8-1" index="1" id="fencing-3"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             <fencing-level target="bl460g1n8" devices="prmStonith8-2" index="2" id="fencing-4"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause           </fencing-topology>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause           <rsc_defaults __crm_diff_marker__="added:top">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             <meta_attributes id="rsc-options">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               <nvpair name="resource-stickiness" value="INFINITY" id="rsc-options-resource-stickiness"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               <nvpair name="migration-threshold" value="1" id="rsc-options-migration-threshold"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             </meta_attributes>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause           </rsc_defaults>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause         </configuration>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause       </cib>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause     </diff-added>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause   </diff>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_PE_CALC: [ state=S_IDLE cause=C_FSA_INTERNAL origin=abort_transition_graph ]
Oct 21 11:20:18 [7685] bl460g1n6       crmd:   notice: do_state_transition: 	State transition S_IDLE -> S_POLICY_ENGINE [ input=I_PE_CALC cause=C_FSA_INTERNAL origin=abort_transition_graph ]
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: do_state_transition: 	All 3 cluster nodes are eligible to run resources.
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: do_pe_invoke: 	Query 31: Requesting the current CIB: S_POLICY_ENGINE
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:     info: stonith_level_remove: 	Node bl460g1n6 not found (0 active entries)
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:     info: stonith_level_register: 	Node bl460g1n6 has 1 active fencing levels
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:     info: stonith_level_register: 	Node bl460g1n6 has 2 active fencing levels
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:     info: stonith_level_remove: 	Node bl460g1n7 not found (1 active entries)
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:     info: stonith_level_register: 	Node bl460g1n7 has 1 active fencing levels
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:     info: stonith_level_register: 	Node bl460g1n7 has 2 active fencing levels
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:     info: stonith_level_remove: 	Node bl460g1n8 not found (2 active entries)
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:     info: stonith_level_register: 	Node bl460g1n8 has 1 active fencing levels
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:     info: stonith_level_register: 	Node bl460g1n8 has 2 active fencing levels
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:     info: update_cib_stonith_devices: 	Updating device list from the cib: new resource
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	STONITH timeout: 60000
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: cib_replace_notify: 	Replaced: 0.5.9 -> 0.6.1 from bl460g1n6
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	STONITH of failed nodes is enabled
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	Stop all active resources: false
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	Default stickiness: 0
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	On loss of CCM Quorum: Freeze resources
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:  warning: handle_startup_fencing: 	Blind faith: not fencing unseen nodes
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	Diff: --- 0.5.9
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	Diff: +++ 0.6.1 65395a06b7ce78fecc54a58606f6d0f6
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: unpack_domains: 	Unpacking domains
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	--         <nvpair id="cib-bootstrap-options-dc-version" name="dc-version" value="1.1.11-0.302.b6d42ed.git.el6-b6d42ed"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	--         <nvpair id="cib-bootstrap-options-cluster-infrastructure" name="cluster-infrastructure" value="corosync"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         <nvpair name="no-quorum-policy" value="freeze" id="cib-bootstrap-options-no-quorum-policy"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         <nvpair name="stonith-enabled" value="true" id="cib-bootstrap-options-stonith-enabled"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         <nvpair name="startup-fencing" value="false" id="cib-bootstrap-options-startup-fencing"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         <nvpair name="crmd-transition-delay" value="2s" id="cib-bootstrap-options-crmd-transition-delay"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++       <primitive id="prmVM1" class="ocf" provider="heartbeat" type="VirtualDomain">
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         <instance_attributes id="prmVM1-instance_attributes">
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <nvpair name="config" value="/etc/libvirt/qemu/vm1.xml" id="prmVM1-instance_attributes-config"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <nvpair name="hypervisor" value="qemu:///system" id="prmVM1-instance_attributes-hypervisor"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <nvpair name="migration_transport" value="ssh" id="prmVM1-instance_attributes-migration_transport"/>
Oct 21 11:20:18 [7683] bl460g1n6      attrd:   notice: attrd_cib_replaced_cb: 	Updating all attributes after cib_refresh_notify event
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         </instance_attributes>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         <meta_attributes id="prmVM1-meta_attributes">
Oct 21 11:20:18 [7683] bl460g1n6      attrd:    debug: write_attribute: 	Update: bl460g1n6[shutdown]=(null) (3232261592 3232261592 3232261592 bl460g1n6)
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <nvpair name="allow-migrate" value="true" id="prmVM1-meta_attributes-allow-migrate"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         </meta_attributes>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         <operations>
Oct 21 11:20:18 [7683] bl460g1n6      attrd:    debug: write_attribute: 	Update: bl460g1n7[shutdown]=(null) (3232261593 3232261593 3232261593 bl460g1n7)
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <op name="start" interval="0s" timeout="120s" on-fail="restart" id="prmVM1-start-0s"/>
Oct 21 11:20:18 [7683] bl460g1n6      attrd:    debug: write_attribute: 	Update: bl460g1n8[shutdown]=(null) (3232261594 3232261594 3232261594 bl460g1n8)
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <op name="monitor" interval="10s" timeout="30s" on-fail="restart" id="prmVM1-monitor-10s"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <op name="stop" interval="0s" timeout="120s" on-fail="fence" id="prmVM1-stop-0s"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <op name="migrate_to" interval="0s" timeout="120s" on-fail="restart" id="prmVM1-migrate_to-0s"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <op name="migrate_from" interval="0s" timeout="120s" on-fail="restart" id="prmVM1-migrate_from-0s"/>
Oct 21 11:20:18 [7683] bl460g1n6      attrd:   notice: write_attribute: 	Sent update 7 with 3 changes for shutdown, id=<n/a>, set=(null)
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         </operations>
Oct 21 11:20:18 [7683] bl460g1n6      attrd:    debug: write_attribute: 	Update: bl460g1n6[terminate]=(null) (3232261592 3232261592 3232261592 bl460g1n6)
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++       </primitive>
Oct 21 11:20:18 [7683] bl460g1n6      attrd:    debug: write_attribute: 	Update: bl460g1n7[terminate]=(null) (3232261593 3232261593 3232261593 bl460g1n7)
Oct 21 11:20:18 [7683] bl460g1n6      attrd:    debug: write_attribute: 	Update: bl460g1n8[terminate]=(null) (3232261594 3232261594 3232261594 bl460g1n8)
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++       <primitive id="prmVM2" class="ocf" provider="heartbeat" type="VirtualDomain">
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         <instance_attributes id="prmVM2-instance_attributes">
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <nvpair name="config" value="/etc/libvirt/qemu/vm2.xml" id="prmVM2-instance_attributes-config"/>
Oct 21 11:20:18 [7683] bl460g1n6      attrd:   notice: write_attribute: 	Sent update 8 with 3 changes for terminate, id=<n/a>, set=(null)
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <nvpair name="hypervisor" value="qemu:///system" id="prmVM2-instance_attributes-hypervisor"/>
Oct 21 11:20:18 [7683] bl460g1n6      attrd:    debug: write_attribute: 	Update: bl460g1n6[probe_complete]=true (3232261592 3232261592 3232261592 bl460g1n6)
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: group_rsc_location: 	Processing rsc_location lo8-rule for grpStonith8
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <nvpair name="migration_transport" value="ssh" id="prmVM2-instance_attributes-migration_transport"/>
Oct 21 11:20:18 [7683] bl460g1n6      attrd:    debug: write_attribute: 	Update: bl460g1n7[probe_complete]=true (3232261593 3232261593 3232261593 bl460g1n7)
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: group_rsc_location: 	Processing rsc_location lo7-rule for grpStonith7
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         </instance_attributes>
Oct 21 11:20:18 [7683] bl460g1n6      attrd:    debug: write_attribute: 	Update: bl460g1n8[probe_complete]=true (3232261594 3232261594 3232261594 bl460g1n8)
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: group_rsc_location: 	Processing rsc_location lo6-rule for grpStonith6
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         <meta_attributes id="prmVM2-meta_attributes">
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <nvpair name="allow-migrate" value="true" id="prmVM2-meta_attributes-allow-migrate"/>
Oct 21 11:20:18 [7683] bl460g1n6      attrd:   notice: write_attribute: 	Sent update 9 with 3 changes for probe_complete, id=<n/a>, set=(null)
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:     info: cib_device_update: 	Device prmStonith6-1 has been disabled on bl460g1n6: score=-INFINITY
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         </meta_attributes>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:     info: cib_device_update: 	Device prmStonith6-2 is allowed on bl460g1n6: score=0
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         <operations>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <op name="start" interval="0s" timeout="120s" on-fail="restart" id="prmVM2-start-0s"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <op name="monitor" interval="10s" timeout="30s" on-fail="restart" id="prmVM2-monitor-10s"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <op name="stop" interval="0s" timeout="120s" on-fail="fence" id="prmVM2-stop-0s"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: do_cib_replaced: 	Updating the CIB after a replace: DC=true
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <op name="migrate_to" interval="0s" timeout="120s" on-fail="restart" id="prmVM2-migrate_to-0s"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <op name="migrate_from" interval="0s" timeout="120s" on-fail="restart" id="prmVM2-migrate_from-0s"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         </operations>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++       </primitive>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++       <primitive id="prmVM3" class="ocf" provider="heartbeat" type="VirtualDomain">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_ELECTION: [ state=S_POLICY_ENGINE cause=C_FSA_INTERNAL origin=do_cib_replaced ]
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         <instance_attributes id="prmVM3-instance_attributes">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: do_state_transition: 	State transition S_POLICY_ENGINE -> S_ELECTION [ input=I_ELECTION cause=C_FSA_INTERNAL origin=do_cib_replaced ]
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: update_dc: 	Unset DC. Was bl460g1n6
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <nvpair name="config" value="/etc/libvirt/qemu/vm3.xml" id="prmVM3-instance_attributes-config"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: crm_uptime: 	Current CPU usage is: 0s, 36994us
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <nvpair name="hypervisor" value="qemu:///system" id="prmVM3-instance_attributes-hypervisor"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: election_vote: 	Started election 3
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <nvpair name="migration_transport" value="ssh" id="prmVM3-instance_attributes-migration_transport"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         </instance_attributes>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         <meta_attributes id="prmVM3-meta_attributes">
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <nvpair name="allow-migrate" value="true" id="prmVM3-meta_attributes-allow-migrate"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         </meta_attributes>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         <operations>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:     info: stonith_action_create: 	Initiating action metadata for agent fence_legacy (target=(null))
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <op name="start" interval="0s" timeout="120s" on-fail="restart" id="prmVM3-start-0s"/>
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	forking
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <op name="monitor" interval="10s" timeout="30s" on-fail="restart" id="prmVM3-monitor-10s"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <op name="stop" interval="0s" timeout="120s" on-fail="fence" id="prmVM3-stop-0s"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <op name="migrate_to" interval="0s" timeout="120s" on-fail="restart" id="prmVM3-migrate_to-0s"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <op name="migrate_from" interval="0s" timeout="120s" on-fail="restart" id="prmVM3-migrate_from-0s"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         </operations>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++       </primitive>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++       <group id="grpStonith6">
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         <primitive id="prmStonith6-1" class="stonith" type="external/ipmi">
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <instance_attributes id="prmStonith6-1-instance_attributes">
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <nvpair name="pcmk_reboot_retries" value="1" id="prmStonith6-1-instance_attributes-pcmk_reboot_retries"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <nvpair name="pcmk_reboot_timeout" value="60s" id="prmStonith6-1-instance_attributes-pcmk_reboot_timeout"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <nvpair name="hostname" value="bl460g1n6" id="prmStonith6-1-instance_attributes-hostname"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <nvpair name="ipaddr" value="192.168.133.236" id="prmStonith6-1-instance_attributes-ipaddr"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <nvpair name="userid" value="USERID" id="prmStonith6-1-instance_attributes-userid"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <nvpair name="passwd" value="PASSW0RD" id="prmStonith6-1-instance_attributes-passwd"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <nvpair name="interface" value="lanplus" id="prmStonith6-1-instance_attributes-interface"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           </instance_attributes>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <operations>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: election_count_vote: 	Created voted hash
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <op name="start" interval="0s" timeout="60s" id="prmStonith6-1-start-0s"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: election_count_vote: 	Election 3 (current: 3, owner: 3232261592): Processed vote from bl460g1n6 (Recorded)
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <op name="monitor" interval="360s" timeout="60s" id="prmStonith6-1-monitor-360s"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: election_check: 	Still waiting on 2 non-votes (3 total)
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <op name="stop" interval="0s" timeout="60s" id="prmStonith6-1-stop-0s"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           </operations>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         </primitive>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         <primitive id="prmStonith6-2" class="stonith" type="external/ssh">
Oct 21 11:20:18 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	sending args
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <instance_attributes id="prmStonith6-2-instance_attributes">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: election_count_vote: 	Election 3 (current: 3, owner: 3232261592): Processed no-vote from bl460g1n8 (Recorded)
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <nvpair name="pcmk_reboot_retries" value="1" id="prmStonith6-2-instance_attributes-pcmk_reboot_retries"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: election_check: 	Still waiting on 1 non-votes (3 total)
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <nvpair name="pcmk_reboot_timeout" value="40s" id="prmStonith6-2-instance_attributes-pcmk_reboot_timeout"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <nvpair name="hostlist" value="bl460g1n6" id="prmStonith6-2-instance_attributes-hostlist"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           </instance_attributes>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <operations>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <op name="start" interval="0s" timeout="60s" id="prmStonith6-2-start-0s"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <op name="monitor" interval="10s" timeout="60s" id="prmStonith6-2-monitor-10s"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <op name="stop" interval="0s" timeout="60s" id="prmStonith6-2-stop-0s"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           </operations>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         </primitive>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++       </group>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++       <group id="grpStonith7">
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         <primitive id="prmStonith7-1" class="stonith" type="external/ipmi">
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <instance_attributes id="prmStonith7-1-instance_attributes">
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <nvpair name="pcmk_reboot_retries" value="1" id="prmStonith7-1-instance_attributes-pcmk_reboot_retries"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <nvpair name="pcmk_reboot_timeout" value="60s" id="prmStonith7-1-instance_attributes-pcmk_reboot_timeout"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <nvpair name="hostname" value="bl460g1n7" id="prmStonith7-1-instance_attributes-hostname"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <nvpair name="ipaddr" value="192.168.133.237" id="prmStonith7-1-instance_attributes-ipaddr"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <nvpair name="userid" value="USERID" id="prmStonith7-1-instance_attributes-userid"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <nvpair name="passwd" value="PASSW0RD" id="prmStonith7-1-instance_attributes-passwd"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <nvpair name="interface" value="lanplus" id="prmStonith7-1-instance_attributes-interface"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           </instance_attributes>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <operations>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <op name="start" interval="0s" timeout="60s" id="prmStonith7-1-start-0s"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <op name="monitor" interval="360s" timeout="60s" id="prmStonith7-1-monitor-360s"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <op name="stop" interval="0s" timeout="60s" id="prmStonith7-1-stop-0s"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           </operations>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         </primitive>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         <primitive id="prmStonith7-2" class="stonith" type="external/ssh">
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <instance_attributes id="prmStonith7-2-instance_attributes">
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <nvpair name="pcmk_reboot_retries" value="1" id="prmStonith7-2-instance_attributes-pcmk_reboot_retries"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <nvpair name="pcmk_reboot_timeout" value="40s" id="prmStonith7-2-instance_attributes-pcmk_reboot_timeout"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <nvpair name="hostlist" value="bl460g1n7" id="prmStonith7-2-instance_attributes-hostlist"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           </instance_attributes>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <operations>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <op name="start" interval="0s" timeout="60s" id="prmStonith7-2-start-0s"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <op name="monitor" interval="10s" timeout="60s" id="prmStonith7-2-monitor-10s"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <op name="stop" interval="0s" timeout="60s" id="prmStonith7-2-stop-0s"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           </operations>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: election_count_vote: 	Election 3 (current: 3, owner: 3232261592): Processed no-vote from bl460g1n7 (Recorded)
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         </primitive>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++       </group>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++       <group id="grpStonith8">
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         <primitive id="prmStonith8-1" class="stonith" type="external/ipmi">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: election_timer_cb: 	Election election-0 complete
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <instance_attributes id="prmStonith8-1-instance_attributes">
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: election_timeout_popped: 	Election failed: Declaring ourselves the winner
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <nvpair name="pcmk_reboot_retries" value="1" id="prmStonith8-1-instance_attributes-pcmk_reboot_retries"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_ELECTION_DC: [ state=S_ELECTION cause=C_TIMER_POPPED origin=election_timeout_popped ]
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <nvpair name="pcmk_reboot_timeout" value="60s" id="prmStonith8-1-instance_attributes-pcmk_reboot_timeout"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: do_log: 	FSA: Input I_ELECTION_DC from election_timeout_popped() received in state S_ELECTION
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <nvpair name="hostname" value="bl460g1n8" id="prmStonith8-1-instance_attributes-hostname"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:   notice: do_state_transition: 	State transition S_ELECTION -> S_INTEGRATION [ input=I_ELECTION_DC cause=C_TIMER_POPPED origin=election_timeout_popped ]
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <nvpair name="ipaddr" value="192.168.133.238" id="prmStonith8-1-instance_attributes-ipaddr"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: do_te_control: 	The transitioner is already active
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <nvpair name="userid" value="USERID" id="prmStonith8-1-instance_attributes-userid"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: crm_timer_start: 	Started Integration Timer (I_INTEGRATED:180000ms), src=46
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <nvpair name="passwd" value="PASSW0RD" id="prmStonith8-1-instance_attributes-passwd"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: do_dc_takeover: 	Taking over DC status for this partition
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <nvpair name="interface" value="lanplus" id="prmStonith8-1-instance_attributes-interface"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           </instance_attributes>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <operations>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <op name="start" interval="0s" timeout="60s" id="prmStonith8-1-start-0s"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <op name="monitor" interval="360s" timeout="60s" id="prmStonith8-1-monitor-360s"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <op name="stop" interval="0s" timeout="60s" id="prmStonith8-1-stop-0s"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           </operations>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         </primitive>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         <primitive id="prmStonith8-2" class="stonith" type="external/ssh">
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <instance_attributes id="prmStonith8-2-instance_attributes">
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <nvpair name="pcmk_reboot_retries" value="1" id="prmStonith8-2-instance_attributes-pcmk_reboot_retries"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <nvpair name="pcmk_reboot_timeout" value="40s" id="prmStonith8-2-instance_attributes-pcmk_reboot_timeout"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <nvpair name="hostlist" value="bl460g1n8" id="prmStonith8-2-instance_attributes-hostlist"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           </instance_attributes>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <operations>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <op name="start" interval="0s" timeout="60s" id="prmStonith8-2-start-0s"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <op name="monitor" interval="10s" timeout="60s" id="prmStonith8-2-monitor-10s"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <op name="stop" interval="0s" timeout="60s" id="prmStonith8-2-stop-0s"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           </operations>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         </primitive>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++       </group>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++       <clone id="clnPing">
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         <primitive id="prmPing" class="ocf" provider="pacemaker" type="ping">
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <instance_attributes id="prmPing-instance_attributes">
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <nvpair name="name" value="default_ping_set" id="prmPing-instance_attributes-name"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <nvpair name="host_list" value="192.168.201.254" id="prmPing-instance_attributes-host_list"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <nvpair name="multiplier" value="100" id="prmPing-instance_attributes-multiplier"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <nvpair name="attempts" value="2" id="prmPing-instance_attributes-attempts"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <nvpair name="timeout" value="2" id="prmPing-instance_attributes-timeout"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           </instance_attributes>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <operations>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <op name="start" interval="0s" timeout="60s" on-fail="restart" id="prmPing-start-0s"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <op name="monitor" interval="10s" timeout="60s" on-fail="restart" id="prmPing-monitor-10s"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++             <op name="stop" interval="0s" timeout="60s" on-fail="ignore" id="prmPing-stop-0s"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           </operations>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         </primitive>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++       </clone>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++       <rsc_location id="l1" rsc="prmVM1">
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         <!--#  rule -inf: not_defined default_ping_set or default_ping_set lt 100-->
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         <rule score="300" id="l1-rule">
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <expression attribute="#uname" operation="eq" value="bl460g1n6" id="l1-expression"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         </rule>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         <rule score="200" id="l1-rule-0">
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <expression attribute="#uname" operation="eq" value="bl460g1n7" id="l1-expression-0"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         </rule>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         <rule score="100" id="l1-rule-1">
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <expression attribute="#uname" operation="eq" value="bl460g1n8" id="l1-expression-1"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         </rule>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         <rule score="-INFINITY" id="l1-rule-2">
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <expression attribute="default_ping_set" operation="lt" value="100" id="l1-expression-2"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         </rule>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++       </rsc_location>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++       <rsc_location id="l2" rsc="prmVM2">
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         <rule score="300" id="l2-rule">
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <expression attribute="#uname" operation="eq" value="bl460g1n6" id="l2-expression"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         </rule>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         <rule score="200" id="l2-rule-0">
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <expression attribute="#uname" operation="eq" value="bl460g1n7" id="l2-expression-0"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         </rule>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         <rule score="100" id="l2-rule-1">
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <expression attribute="#uname" operation="eq" value="bl460g1n8" id="l2-expression-1"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         </rule>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         <rule score="-INFINITY" id="l2-rule-2">
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <expression attribute="default_ping_set" operation="lt" value="100" id="l2-expression-2"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         </rule>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++       </rsc_location>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++       <rsc_location id="l3" rsc="prmVM3">
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         <rule score="300" id="l3-rule">
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <expression attribute="#uname" operation="eq" value="bl460g1n6" id="l3-expression"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         </rule>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         <rule score="200" id="l3-rule-0">
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <expression attribute="#uname" operation="eq" value="bl460g1n7" id="l3-expression-0"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         </rule>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         <rule score="100" id="l3-rule-1">
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <expression attribute="#uname" operation="eq" value="bl460g1n8" id="l3-expression-1"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         </rule>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         <rule score="-INFINITY" id="l3-rule-2">
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <expression attribute="default_ping_set" operation="lt" value="100" id="l3-expression-2"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         </rule>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++       </rsc_location>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++       <rsc_location id="lo6" rsc="grpStonith6">
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         <rule score="-INFINITY" id="lo6-rule">
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <expression attribute="#uname" operation="eq" value="bl460g1n6" id="lo6-expression"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         </rule>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++       </rsc_location>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++       <rsc_location id="lo7" rsc="grpStonith7">
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         <rule score="-INFINITY" id="lo7-rule">
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <expression attribute="#uname" operation="eq" value="bl460g1n7" id="lo7-expression"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         </rule>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++       </rsc_location>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++       <rsc_location id="lo8" rsc="grpStonith8">
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         <rule score="-INFINITY" id="lo8-rule">
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <expression attribute="#uname" operation="eq" value="bl460g1n8" id="lo8-expression"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         </rule>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++       </rsc_location>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++       <rsc_colocation id="c1" score="INFINITY" rsc="prmVM1" with-rsc="clnPing"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++       <rsc_colocation id="c4" score="INFINITY" rsc="prmVM2" with-rsc="clnPing"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++       <rsc_colocation id="c7" score="INFINITY" rsc="prmVM3" with-rsc="clnPing"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++       <rsc_order id="o1" score="0" first="clnPing" then="prmVM1" symmetrical="false"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++       <rsc_order id="o4" score="0" first="clnPing" then="prmVM2" symmetrical="false"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++       <rsc_order id="o7" score="0" first="clnPing" then="prmVM3" symmetrical="false"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++     <fencing-topology>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++       <fencing-level target="bl460g1n6" devices="prmStonith6-1" index="1" id="fencing"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++       <fencing-level target="bl460g1n6" devices="prmStonith6-2" index="2" id="fencing-0"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++       <fencing-level target="bl460g1n7" devices="prmStonith7-1" index="1" id="fencing-1"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++       <fencing-level target="bl460g1n7" devices="prmStonith7-2" index="2" id="fencing-2"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++       <fencing-level target="bl460g1n8" devices="prmStonith8-1" index="1" id="fencing-3"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++       <fencing-level target="bl460g1n8" devices="prmStonith8-2" index="2" id="fencing-4"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++     </fencing-topology>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++     <rsc_defaults>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++       <meta_attributes id="rsc-options">
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         <nvpair name="resource-stickiness" value="INFINITY" id="rsc-options-resource-stickiness"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         <nvpair name="migration-threshold" value="1" id="rsc-options-migration-threshold"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++       </meta_attributes>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++     </rsc_defaults>
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_replace operation for section 'all': OK (rc=0, origin=local/cibadmin/2, version=0.6.1)
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crmd/31, version=0.6.1)
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section crm_config: OK (rc=0, origin=local/crmd/32, version=0.6.1)
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section nodes: OK (rc=0, origin=local/crmd/33, version=0.6.1)
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/crmd/34, version=0.6.1)
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: cib_process_readwrite: 	We are still in R/W mode
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_master operation for section 'all': OK (rc=0, origin=local/crmd/35, version=0.6.1)
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: cib_process_modify: 	Destroying /cib/status/node_state[1]/transient_attributes/instance_attributes/nvpair[2]
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: cib_process_modify: 	Destroying /cib/status/node_state[2]/transient_attributes/instance_attributes/nvpair[2]
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: cib_process_modify: 	Destroying /cib/status/node_state[3]/transient_attributes/instance_attributes/nvpair[2]
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/attrd/7, version=0.6.1)
Oct 21 11:20:18 [7683] bl460g1n6      attrd:     info: attrd_cib_callback: 	Update 7 for shutdown: OK (0)
Oct 21 11:20:18 [7683] bl460g1n6      attrd:   notice: attrd_cib_callback: 	Update 7 for shutdown[bl460g1n6]=(null): OK (0)
Oct 21 11:20:18 [7683] bl460g1n6      attrd:   notice: attrd_cib_callback: 	Update 7 for shutdown[bl460g1n7]=(null): OK (0)
Oct 21 11:20:18 [7683] bl460g1n6      attrd:   notice: attrd_cib_callback: 	Update 7 for shutdown[bl460g1n8]=(null): OK (0)
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: cib_process_modify: 	Destroying /cib/status/node_state[1]/transient_attributes/instance_attributes/nvpair[2]
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: cib_process_modify: 	Destroying /cib/status/node_state[2]/transient_attributes/instance_attributes/nvpair[2]
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: cib_process_modify: 	Destroying /cib/status/node_state[3]/transient_attributes/instance_attributes/nvpair[2]
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/attrd/8, version=0.6.1)
Oct 21 11:20:18 [7683] bl460g1n6      attrd:     info: attrd_cib_callback: 	Update 8 for terminate: OK (0)
Oct 21 11:20:18 [7683] bl460g1n6      attrd:   notice: attrd_cib_callback: 	Update 8 for terminate[bl460g1n6]=(null): OK (0)
Oct 21 11:20:18 [7683] bl460g1n6      attrd:   notice: attrd_cib_callback: 	Update 8 for terminate[bl460g1n7]=(null): OK (0)
Oct 21 11:20:18 [7683] bl460g1n6      attrd:   notice: attrd_cib_callback: 	Update 8 for terminate[bl460g1n8]=(null): OK (0)
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/attrd/9, version=0.6.1)
Oct 21 11:20:18 [7683] bl460g1n6      attrd:     info: attrd_cib_callback: 	Update 9 for probe_complete: OK (0)
Oct 21 11:20:18 [7683] bl460g1n6      attrd:   notice: attrd_cib_callback: 	Update 9 for probe_complete[bl460g1n6]=true: OK (0)
Oct 21 11:20:18 [7683] bl460g1n6      attrd:   notice: attrd_cib_callback: 	Update 9 for probe_complete[bl460g1n7]=true: OK (0)
Oct 21 11:20:18 [7683] bl460g1n6      attrd:   notice: attrd_cib_callback: 	Update 9 for probe_complete[bl460g1n8]=true: OK (0)
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section cib: OK (rc=0, origin=local/crmd/36, version=0.6.1)
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: cib_process_xpath: 	cib_query: //cib/configuration/crm_config//cluster_property_set//nvpair[@name='dc-version'] does not exist
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section //cib/configuration/crm_config//cluster_property_set//nvpair[@name='dc-version']: No such device or address (rc=-6, origin=local/crmd/37, version=0.6.1)
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: qb_ipcs_dispatch_connection_request: 	HUP conn (7680-7782-14)
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(7680-7782-14) state:2
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: crm_client_destroy: 	Destroying 0 events
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-7680-7782-14-header
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-7680-7782-14-header
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-7680-7782-14-header
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: activateCibXml: 	Triggering CIB write for cib_modify op
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: log_cib_diff: 	cib:diff: Local-only Change: 0.7.1
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	-- <cib admin_epoch="0" epoch="6" num_updates="1"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         <nvpair id="cib-bootstrap-options-dc-version" name="dc-version" value="1.1.11-0.302.b6d42ed.git.el6-b6d42ed"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section crm_config: OK (rc=0, origin=local/crmd/38, version=0.7.1)
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: cib_process_xpath: 	cib_query: //cib/configuration/crm_config//cluster_property_set//nvpair[@name='cluster-infrastructure'] does not exist
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section //cib/configuration/crm_config//cluster_property_set//nvpair[@name='cluster-infrastructure']: No such device or address (rc=-6, origin=local/crmd/39, version=0.7.1)
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: initialize_join: 	join-3: Initializing join data (flag=true)
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	initialize_join: Node bl460g1n7[3232261593] - join-3 phase 4 -> 0
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	initialize_join: Node bl460g1n8[3232261594] - join-3 phase 4 -> 0
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	initialize_join: Node bl460g1n6[3232261592] - join-3 phase 4 -> 0
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: join_make_offer: 	join-3: Sending offer to bl460g1n7
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	join_make_offer: Node bl460g1n7[3232261593] - join-3 phase 0 -> 1
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: join_make_offer: 	join-3: Sending offer to bl460g1n8
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	join_make_offer: Node bl460g1n8[3232261594] - join-3 phase 0 -> 1
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: join_make_offer: 	join-3: Sending offer to bl460g1n6
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	join_make_offer: Node bl460g1n6[3232261592] - join-3 phase 0 -> 1
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: do_dc_join_offer_all: 	join-3: Waiting on 3 outstanding join acks
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_ELECTION_DC: [ state=S_INTEGRATION cause=C_FSA_INTERNAL origin=do_election_check ]
Oct 21 11:20:18 [7685] bl460g1n6       crmd:  warning: do_log: 	FSA: Input I_ELECTION_DC from do_election_check() received in state S_INTEGRATION
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: election_vote: 	Started election 4
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: initialize_join: 	join-4: Initializing join data (flag=true)
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	initialize_join: Node bl460g1n7[3232261593] - join-4 phase 1 -> 0
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	initialize_join: Node bl460g1n8[3232261594] - join-4 phase 1 -> 0
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	initialize_join: Node bl460g1n6[3232261592] - join-4 phase 1 -> 0
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: join_make_offer: 	join-4: Sending offer to bl460g1n7
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	join_make_offer: Node bl460g1n7[3232261593] - join-4 phase 0 -> 1
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: join_make_offer: 	join-4: Sending offer to bl460g1n8
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	join_make_offer: Node bl460g1n8[3232261594] - join-4 phase 0 -> 1
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: join_make_offer: 	join-4: Sending offer to bl460g1n6
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	join_make_offer: Node bl460g1n6[3232261592] - join-4 phase 0 -> 1
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: do_dc_join_offer_all: 	join-4: Waiting on 3 outstanding join acks
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: do_pe_invoke_callback: 	Discarding PE request in state: S_INTEGRATION
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: config_query_callback: 	Call 32 : Parsing CIB options
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: config_query_callback: 	Shutdown escalation occurs after: 1200000ms
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: config_query_callback: 	Checking for expired actions every 900000ms
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: handle_request: 	Raising I_JOIN_OFFER: join-3
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: handle_request: 	Raising I_JOIN_OFFER: join-4
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_JOIN_OFFER: [ state=S_INTEGRATION cause=C_HA_MESSAGE origin=route_message ]
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: update_dc: 	Set DC to bl460g1n6 (3.0.7)
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: do_cl_join_offer_respond: 	do_cl_join_offer_respond added action A_DC_TIMER_STOP to the FSA
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: election_count_vote: 	Created voted hash
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: election_count_vote: 	Election 4 (current: 4, owner: 3232261592): Processed vote from bl460g1n6 (Recorded)
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: do_election_check: 	Ignore election check: we not in an election
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: election_count_vote: 	Election 4 (current: 4, owner: 3232261592): Processed no-vote from bl460g1n8 (Recorded)
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: do_election_check: 	Ignore election check: we not in an election
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_JOIN_OFFER: [ state=S_INTEGRATION cause=C_HA_MESSAGE origin=route_message ]
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: do_cl_join_offer_respond: 	do_cl_join_offer_respond added action A_DC_TIMER_STOP to the FSA
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_JOIN_REQUEST: [ state=S_INTEGRATION cause=C_HA_MESSAGE origin=route_message ]
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	Processing req from bl460g1n8
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	Invalid response from bl460g1n8: join-3 vs. join-4
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: check_join_state: 	Invoked by do_dc_join_filter_offer in state: S_INTEGRATION
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_JOIN_REQUEST: [ state=S_INTEGRATION cause=C_HA_MESSAGE origin=route_message ]
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	Processing req from bl460g1n8
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	join-4: Welcoming node bl460g1n8 (ref join_request-crmd-1382322018-9)
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	do_dc_join_filter_offer: Node bl460g1n8[3232261594] - join-4 phase 1 -> 2
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	1 nodes have been integrated into join-4
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: check_join_state: 	Invoked by do_dc_join_filter_offer in state: S_INTEGRATION
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	join-4: Still waiting on 2 outstanding offers
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: activateCibXml: 	Triggering CIB write for cib_modify op
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: log_cib_diff: 	cib:diff: Local-only Change: 0.8.1
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	-- <cib admin_epoch="0" epoch="7" num_updates="1"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         <nvpair id="cib-bootstrap-options-cluster-infrastructure" name="cluster-infrastructure" value="corosync"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section crm_config: OK (rc=0, origin=local/crmd/40, version=0.8.1)
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section crm_config: OK (rc=0, origin=local/crmd/41, version=0.8.1)
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section crm_config: OK (rc=0, origin=local/crmd/42, version=0.8.1)
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: config_query_callback: 	Call 41 : Parsing CIB options
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: config_query_callback: 	Shutdown escalation occurs after: 1200000ms
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crmd/43, version=0.8.1)
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: config_query_callback: 	Checking for expired actions every 900000ms
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: config_query_callback: 	Call 42 : Parsing CIB options
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crmd/44, version=0.8.1)
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: config_query_callback: 	Shutdown escalation occurs after: 1200000ms
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: config_query_callback: 	Checking for expired actions every 900000ms
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: join_query_callback: 	Respond to join offer join-4
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: join_query_callback: 	Acknowledging bl460g1n6 as our DC
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section crm_config: OK (rc=0, origin=local/crmd/45, version=0.8.1)
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: config_query_callback: 	Call 45 : Parsing CIB options
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: config_query_callback: 	Shutdown escalation occurs after: 1200000ms
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: config_query_callback: 	Checking for expired actions every 900000ms
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_JOIN_REQUEST: [ state=S_INTEGRATION cause=C_HA_MESSAGE origin=route_message ]
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	Processing req from bl460g1n6
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	bl460g1n6 has a better generation number than the current max bl460g1n8
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	Max generation   <generation_tuple epoch="7" num_updates="1" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:18 2013" update-origin="bl460g1n6" update-client="crmd" have-quorum="1" dc-uuid="3232261592"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	Their generation   <generation_tuple epoch="8" num_updates="1" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:18 2013" update-origin="bl460g1n6" update-client="crmd" have-quorum="1" dc-uuid="3232261592"/>
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	join-4: Welcoming node bl460g1n6 (ref join_request-crmd-1382322018-27)
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	do_dc_join_filter_offer: Node bl460g1n6[3232261592] - join-4 phase 1 -> 2
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	2 nodes have been integrated into join-4
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: check_join_state: 	Invoked by do_dc_join_filter_offer in state: S_INTEGRATION
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	join-4: Still waiting on 1 outstanding offers
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: crm_client_new: 	Connecting 0xbe7fc0 for uid=0 gid=0 pid=7784 id=2067dde6-069d-4a84-805c-75a474fa1a43
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: handle_new_connection: 	IPC credentials authenticated (7680-7784-14)
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: qb_ipcs_shm_connect: 	connecting to client [7784]
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/cibadmin/2, version=0.8.1)
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: qb_ipcs_dispatch_connection_request: 	HUP conn (7680-7784-14)
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(7680-7784-14) state:2
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: crm_client_destroy: 	Destroying 0 events
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-7680-7784-14-header
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-7680-7784-14-header
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-7680-7784-14-header
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: election_count_vote: 	Election 4 (current: 4, owner: 3232261592): Processed no-vote from bl460g1n7 (Recorded)
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: do_election_check: 	Ignore election check: we not in an election
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_JOIN_REQUEST: [ state=S_INTEGRATION cause=C_HA_MESSAGE origin=route_message ]
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	Processing req from bl460g1n7
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	join-4: Welcoming node bl460g1n7 (ref join_request-crmd-1382322018-9)
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	do_dc_join_filter_offer: Node bl460g1n7[3232261593] - join-4 phase 1 -> 2
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	3 nodes have been integrated into join-4
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: check_join_state: 	Invoked by do_dc_join_filter_offer in state: S_INTEGRATION
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: check_join_state: 	join-4: Integration of 3 peers complete: do_dc_join_filter_offer
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_INTEGRATED: [ state=S_INTEGRATION cause=C_FSA_INTERNAL origin=check_join_state ]
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: do_state_transition: 	State transition S_INTEGRATION -> S_FINALIZE_JOIN [ input=I_INTEGRATED cause=C_FSA_INTERNAL origin=check_join_state ]
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: do_state_transition: 	All 3 cluster nodes responded to the join offer.
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: crm_timer_start: 	Started Finalization Timer (I_ELECTION:1800000ms), src=54
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: do_dc_join_finalize: 	Finializing join-4 for 3 clients
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: crmd_join_phase_log: 	join-4: bl460g1n7=integrated
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: crmd_join_phase_log: 	join-4: bl460g1n8=integrated
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: crmd_join_phase_log: 	join-4: bl460g1n6=integrated
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: do_dc_join_finalize: 	join-4: Syncing our CIB to the rest of the cluster
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: do_dc_join_finalize: 	Requested version   <generation_tuple epoch="8" num_updates="1" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:18 2013" update-origin="bl460g1n6" update-client="crmd" have-quorum="1" dc-uuid="3232261592"/>
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: sync_our_cib: 	Syncing CIB to all peers
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_sync operation for section 'all': OK (rc=0, origin=local/crmd/46, version=0.8.1)
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: check_join_state: 	Invoked by finalize_sync_callback in state: S_FINALIZE_JOIN
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: check_join_state: 	join-4: Still waiting on 3 integrated nodes
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: crmd_join_phase_log: 	join-4: bl460g1n7=integrated
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: crmd_join_phase_log: 	join-4: bl460g1n8=integrated
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: crmd_join_phase_log: 	join-4: bl460g1n6=integrated
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: finalize_sync_callback: 	Notifying 3 clients of join-4 results
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: finalize_join_for: 	join-4: ACK'ing join request from bl460g1n7
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	finalize_join_for: Node bl460g1n7[3232261593] - join-4 phase 2 -> 3
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: finalize_join_for: 	join-4: ACK'ing join request from bl460g1n8
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	finalize_join_for: Node bl460g1n8[3232261594] - join-4 phase 2 -> 3
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: finalize_join_for: 	join-4: ACK'ing join request from bl460g1n6
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	finalize_join_for: Node bl460g1n6[3232261592] - join-4 phase 2 -> 3
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section nodes: OK (rc=0, origin=local/crmd/47, version=0.8.1)
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section nodes: OK (rc=0, origin=local/crmd/48, version=0.8.1)
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section nodes: OK (rc=0, origin=local/crmd/49, version=0.8.1)
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: handle_request: 	Raising I_JOIN_RESULT: join-4
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_JOIN_RESULT: [ state=S_FINALIZE_JOIN cause=C_HA_MESSAGE origin=route_message ]
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: do_cl_join_finalize_respond: 	Confirming join join-4: join_ack_nack
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: do_cl_join_finalize_respond: 	join-4: Join complete.  Sending local LRM status to bl460g1n6
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: do_dc_join_ack: 	Ignoring op=join_ack_nack message from bl460g1n6
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_JOIN_RESULT: [ state=S_FINALIZE_JOIN cause=C_HA_MESSAGE origin=route_message ]
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	do_dc_join_ack: Node bl460g1n7[3232261593] - join-4 phase 3 -> 4
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: do_dc_join_ack: 	join-4: Updating node state to member for bl460g1n7
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: erase_status_tag: 	Deleting xpath: //node_state[@uname='bl460g1n7']/lrm
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: do_dc_join_ack: 	join-4: Registered callback for LRM update 51
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: cib_process_xpath: 	Processing cib_delete op for //node_state[@uname='bl460g1n7']/lrm (/cib/status/node_state[3]/lrm)
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_JOIN_RESULT: [ state=S_FINALIZE_JOIN cause=C_HA_MESSAGE origin=route_message ]
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	do_dc_join_ack: Node bl460g1n8[3232261594] - join-4 phase 3 -> 4
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: do_dc_join_ack: 	join-4: Updating node state to member for bl460g1n8
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: erase_status_tag: 	Deleting xpath: //node_state[@uname='bl460g1n8']/lrm
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: do_dc_join_ack: 	join-4: Registered callback for LRM update 53
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_JOIN_RESULT: [ state=S_FINALIZE_JOIN cause=C_HA_MESSAGE origin=route_message ]
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	do_dc_join_ack: Node bl460g1n6[3232261592] - join-4 phase 3 -> 4
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: do_dc_join_ack: 	join-4: Updating node state to member for bl460g1n6
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: erase_status_tag: 	Deleting xpath: //node_state[@uname='bl460g1n6']/lrm
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: do_dc_join_ack: 	join-4: Registered callback for LRM update 55
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_delete operation for section //node_state[@uname='bl460g1n7']/lrm: OK (rc=0, origin=local/crmd/50, version=0.8.2)
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/crmd/51, version=0.8.3)
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: cib_process_xpath: 	Processing cib_delete op for //node_state[@uname='bl460g1n8']/lrm (/cib/status/node_state[1]/lrm)
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_delete operation for section //node_state[@uname='bl460g1n8']/lrm: OK (rc=0, origin=local/crmd/52, version=0.8.4)
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/crmd/53, version=0.8.5)
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: cib_process_xpath: 	Processing cib_delete op for //node_state[@uname='bl460g1n6']/lrm (/cib/status/node_state[2]/lrm)
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_delete operation for section //node_state[@uname='bl460g1n6']/lrm: OK (rc=0, origin=local/crmd/54, version=0.8.6)
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/crmd/55, version=0.8.7)
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: erase_xpath_callback: 	Deletion of "//node_state[@uname='bl460g1n7']/lrm": OK (rc=0)
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: join_update_complete_callback: 	Join update 51 complete
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: check_join_state: 	Invoked by join_update_complete_callback in state: S_FINALIZE_JOIN
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: check_join_state: 	join-4 complete: join_update_complete_callback
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_FINALIZED: [ state=S_FINALIZE_JOIN cause=C_FSA_INTERNAL origin=check_join_state ]
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: do_state_transition: 	State transition S_FINALIZE_JOIN -> S_POLICY_ENGINE [ input=I_FINALIZED cause=C_FSA_INTERNAL origin=check_join_state ]
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: do_state_transition: 	All 3 cluster nodes are eligible to run resources.
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: do_dc_join_final: 	Ensuring DC, quorum and node attributes are up-to-date
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: attrd_update_delegate: 	Sent update: (null)=(null) for localhost
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: crm_update_quorum: 	Updating quorum status to true (call=58)
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: do_te_invoke: 	Cancelling the transition: inactive
Oct 21 11:20:18 [7685] bl460g1n6       crmd:     info: abort_transition_graph: 	do_te_invoke:151 - Triggered transition abort (complete=1) : Peer Cancelled
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: crm_timer_start: 	Started New Transition Timer (I_PE_CALC:2000ms), src=65
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: erase_xpath_callback: 	Deletion of "//node_state[@uname='bl460g1n8']/lrm": OK (rc=0)
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: join_update_complete_callback: 	Join update 53 complete
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: check_join_state: 	Invoked by join_update_complete_callback in state: S_POLICY_ENGINE
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: erase_xpath_callback: 	Deletion of "//node_state[@uname='bl460g1n6']/lrm": OK (rc=0)
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section nodes: OK (rc=0, origin=local/crmd/56, version=0.8.7)
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: join_update_complete_callback: 	Join update 55 complete
Oct 21 11:20:18 [7685] bl460g1n6       crmd:    debug: check_join_state: 	Invoked by join_update_complete_callback in state: S_POLICY_ENGINE
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/crmd/57, version=0.8.7)
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: write_cib_contents: 	Archived previous version as /var/lib/pacemaker/cib/cib-2.raw
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: write_cib_contents: 	Writing CIB to disk
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section cib: OK (rc=0, origin=local/crmd/58, version=0.8.7)
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: write_cib_contents: 	Wrote version 0.7.0 of the CIB to disk (digest: a7e604e59f9a2dfc32eb6b09eaaef938)
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: write_cib_contents: 	Wrote digest a7e604e59f9a2dfc32eb6b09eaaef938 to disk
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: retrieveCib: 	Reading cluster configuration from: /var/lib/pacemaker/cib/cib.9Jg1Lh (digest: /var/lib/pacemaker/cib/cib.I8guix)
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: write_cib_contents: 	Activating /var/lib/pacemaker/cib/cib.9Jg1Lh
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: write_cib_contents: 	Archived previous version as /var/lib/pacemaker/cib/cib-3.raw
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: write_cib_contents: 	Writing CIB to disk
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: write_cib_contents: 	Wrote version 0.8.0 of the CIB to disk (digest: 691cb7e31283f65808b970435c0d8e48)
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: write_cib_contents: 	Wrote digest 691cb7e31283f65808b970435c0d8e48 to disk
Oct 21 11:20:18 [7680] bl460g1n6        cib:     info: retrieveCib: 	Reading cluster configuration from: /var/lib/pacemaker/cib/cib.t2Px4s (digest: /var/lib/pacemaker/cib/cib.ecvwWI)
Oct 21 11:20:18 [7680] bl460g1n6        cib:    debug: write_cib_contents: 	Activating /var/lib/pacemaker/cib/cib.t2Px4s
Oct 21 11:20:19 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	result = 0
Oct 21 11:20:19 [7681] bl460g1n6 stonith-ng:   notice: stonith_device_register: 	Added 'prmStonith6-2' to the device list (1 active devices)
Oct 21 11:20:19 [7681] bl460g1n6 stonith-ng:     info: cib_device_update: 	Device prmStonith7-1 is allowed on bl460g1n6: score=0
Oct 21 11:20:19 [7681] bl460g1n6 stonith-ng:     info: stonith_action_create: 	Initiating action metadata for agent fence_legacy (target=(null))
Oct 21 11:20:19 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	forking
Oct 21 11:20:19 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	sending args
Oct 21 11:20:20 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	result = 0
Oct 21 11:20:20 [7681] bl460g1n6 stonith-ng:   notice: stonith_device_register: 	Added 'prmStonith7-1' to the device list (2 active devices)
Oct 21 11:20:20 [7681] bl460g1n6 stonith-ng:     info: cib_device_update: 	Device prmStonith7-2 is allowed on bl460g1n6: score=0
Oct 21 11:20:20 [7681] bl460g1n6 stonith-ng:     info: stonith_action_create: 	Initiating action metadata for agent fence_legacy (target=(null))
Oct 21 11:20:20 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	forking
Oct 21 11:20:20 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	sending args
Oct 21 11:20:20 [7685] bl460g1n6       crmd:     info: crm_timer_popped: 	New Transition Timer (I_PE_CALC) just popped (2000ms)
Oct 21 11:20:20 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_PE_CALC: [ state=S_POLICY_ENGINE cause=C_TIMER_POPPED origin=crm_timer_popped ]
Oct 21 11:20:20 [7685] bl460g1n6       crmd:    debug: do_pe_invoke: 	Query 59: Requesting the current CIB: S_POLICY_ENGINE
Oct 21 11:20:20 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crmd/59, version=0.8.7)
Oct 21 11:20:20 [7685] bl460g1n6       crmd:    debug: do_pe_invoke_callback: 	Invoking the PE: query=59, ref=pe_calc-dc-1382322020-32, seq=16, quorate=1
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: unpack_config: 	STONITH timeout: 60000
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: unpack_config: 	STONITH of failed nodes is enabled
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: unpack_config: 	Stop all active resources: false
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: unpack_config: 	Default stickiness: 0
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: unpack_config: 	On loss of CCM Quorum: Freeze resources
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: unpack_domains: 	Unpacking domains
Oct 21 11:20:20 [7684] bl460g1n6    pengine:     info: determine_online_status_fencing: 	Node bl460g1n8 is active
Oct 21 11:20:20 [7684] bl460g1n6    pengine:     info: determine_online_status: 	Node bl460g1n8 is online
Oct 21 11:20:20 [7684] bl460g1n6    pengine:     info: determine_online_status_fencing: 	Node bl460g1n6 is active
Oct 21 11:20:20 [7684] bl460g1n6    pengine:     info: determine_online_status: 	Node bl460g1n6 is online
Oct 21 11:20:20 [7684] bl460g1n6    pengine:     info: determine_online_status_fencing: 	Node bl460g1n7 is active
Oct 21 11:20:20 [7684] bl460g1n6    pengine:     info: determine_online_status: 	Node bl460g1n7 is online
Oct 21 11:20:20 [7684] bl460g1n6    pengine:     info: native_print: 	prmVM1	(ocf::heartbeat:VirtualDomain):	Stopped 
Oct 21 11:20:20 [7684] bl460g1n6    pengine:     info: native_print: 	prmVM2	(ocf::heartbeat:VirtualDomain):	Stopped 
Oct 21 11:20:20 [7684] bl460g1n6    pengine:     info: native_print: 	prmVM3	(ocf::heartbeat:VirtualDomain):	Stopped 
Oct 21 11:20:20 [7684] bl460g1n6    pengine:     info: group_print: 	 Resource Group: grpStonith6
Oct 21 11:20:20 [7684] bl460g1n6    pengine:     info: native_print: 	     prmStonith6-1	(stonith:external/ipmi):	Stopped 
Oct 21 11:20:20 [7684] bl460g1n6    pengine:     info: native_print: 	     prmStonith6-2	(stonith:external/ssh):	Stopped 
Oct 21 11:20:20 [7684] bl460g1n6    pengine:     info: group_print: 	 Resource Group: grpStonith7
Oct 21 11:20:20 [7684] bl460g1n6    pengine:     info: native_print: 	     prmStonith7-1	(stonith:external/ipmi):	Stopped 
Oct 21 11:20:20 [7684] bl460g1n6    pengine:     info: native_print: 	     prmStonith7-2	(stonith:external/ssh):	Stopped 
Oct 21 11:20:20 [7684] bl460g1n6    pengine:     info: group_print: 	 Resource Group: grpStonith8
Oct 21 11:20:20 [7684] bl460g1n6    pengine:     info: native_print: 	     prmStonith8-1	(stonith:external/ipmi):	Stopped 
Oct 21 11:20:20 [7684] bl460g1n6    pengine:     info: native_print: 	     prmStonith8-2	(stonith:external/ssh):	Stopped 
Oct 21 11:20:20 [7684] bl460g1n6    pengine:     info: clone_print: 	 Clone Set: clnPing [prmPing]
Oct 21 11:20:20 [7684] bl460g1n6    pengine:     info: short_print: 	     Stopped: [ bl460g1n6 bl460g1n7 bl460g1n8 ]
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: group_rsc_location: 	Processing rsc_location lo8-rule for grpStonith8
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: group_rsc_location: 	Processing rsc_location lo7-rule for grpStonith7
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: group_rsc_location: 	Processing rsc_location lo6-rule for grpStonith6
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: native_assign_node: 	Assigning bl460g1n6 to prmPing:0
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: native_assign_node: 	Assigning bl460g1n7 to prmPing:1
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: native_assign_node: 	Assigning bl460g1n8 to prmPing:2
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: clone_color: 	Allocated 3 clnPing instances of a possible 3
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: native_assign_node: 	Assigning bl460g1n6 to prmVM1
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: native_assign_node: 	Assigning bl460g1n6 to prmVM2
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: native_assign_node: 	Assigning bl460g1n6 to prmVM3
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: native_assign_node: 	Assigning bl460g1n7 to prmStonith6-1
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: native_assign_node: 	Assigning bl460g1n7 to prmStonith6-2
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: native_assign_node: 	Assigning bl460g1n8 to prmStonith7-1
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: native_assign_node: 	Assigning bl460g1n8 to prmStonith7-2
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: native_assign_node: 	Assigning bl460g1n7 to prmStonith8-1
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: native_assign_node: 	Assigning bl460g1n7 to prmStonith8-2
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmVM1 on bl460g1n6 (Stopped)
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmVM2 on bl460g1n6 (Stopped)
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmVM3 on bl460g1n6 (Stopped)
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmStonith6-1 on bl460g1n6 (Stopped)
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmStonith6-2 on bl460g1n6 (Stopped)
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmStonith7-1 on bl460g1n6 (Stopped)
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmStonith7-2 on bl460g1n6 (Stopped)
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmStonith8-1 on bl460g1n6 (Stopped)
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmStonith8-2 on bl460g1n6 (Stopped)
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmPing:0 on bl460g1n6 (Stopped)
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmVM1 on bl460g1n7 (Stopped)
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmVM2 on bl460g1n7 (Stopped)
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmVM3 on bl460g1n7 (Stopped)
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmStonith6-1 on bl460g1n7 (Stopped)
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmStonith6-2 on bl460g1n7 (Stopped)
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmStonith7-1 on bl460g1n7 (Stopped)
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmStonith7-2 on bl460g1n7 (Stopped)
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmStonith8-1 on bl460g1n7 (Stopped)
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmStonith8-2 on bl460g1n7 (Stopped)
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmPing:1 on bl460g1n7 (Stopped)
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmVM1 on bl460g1n8 (Stopped)
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmVM2 on bl460g1n8 (Stopped)
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmVM3 on bl460g1n8 (Stopped)
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmStonith6-1 on bl460g1n8 (Stopped)
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmStonith6-2 on bl460g1n8 (Stopped)
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmStonith7-1 on bl460g1n8 (Stopped)
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmStonith7-2 on bl460g1n8 (Stopped)
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmStonith8-1 on bl460g1n8 (Stopped)
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmStonith8-2 on bl460g1n8 (Stopped)
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmPing:2 on bl460g1n8 (Stopped)
Oct 21 11:20:20 [7684] bl460g1n6    pengine:     info: RecurringOp: 	 Start recurring monitor (10s) for prmVM1 on bl460g1n6
Oct 21 11:20:20 [7684] bl460g1n6    pengine:     info: RecurringOp: 	 Start recurring monitor (10s) for prmVM2 on bl460g1n6
Oct 21 11:20:20 [7684] bl460g1n6    pengine:     info: RecurringOp: 	 Start recurring monitor (10s) for prmVM3 on bl460g1n6
Oct 21 11:20:20 [7684] bl460g1n6    pengine:     info: RecurringOp: 	 Start recurring monitor (360s) for prmStonith6-1 on bl460g1n7
Oct 21 11:20:20 [7684] bl460g1n6    pengine:     info: RecurringOp: 	 Start recurring monitor (10s) for prmStonith6-2 on bl460g1n7
Oct 21 11:20:20 [7684] bl460g1n6    pengine:     info: RecurringOp: 	 Start recurring monitor (360s) for prmStonith7-1 on bl460g1n8
Oct 21 11:20:20 [7684] bl460g1n6    pengine:     info: RecurringOp: 	 Start recurring monitor (10s) for prmStonith7-2 on bl460g1n8
Oct 21 11:20:20 [7684] bl460g1n6    pengine:     info: RecurringOp: 	 Start recurring monitor (360s) for prmStonith8-1 on bl460g1n7
Oct 21 11:20:20 [7684] bl460g1n6    pengine:     info: RecurringOp: 	 Start recurring monitor (10s) for prmStonith8-2 on bl460g1n7
Oct 21 11:20:20 [7684] bl460g1n6    pengine:     info: RecurringOp: 	 Start recurring monitor (10s) for prmPing:0 on bl460g1n6
Oct 21 11:20:20 [7684] bl460g1n6    pengine:     info: RecurringOp: 	 Start recurring monitor (10s) for prmPing:1 on bl460g1n7
Oct 21 11:20:20 [7684] bl460g1n6    pengine:     info: RecurringOp: 	 Start recurring monitor (10s) for prmPing:2 on bl460g1n8
Oct 21 11:20:20 [7684] bl460g1n6    pengine:   notice: LogActions: 	Start   prmVM1	(bl460g1n6)
Oct 21 11:20:20 [7684] bl460g1n6    pengine:   notice: LogActions: 	Start   prmVM2	(bl460g1n6)
Oct 21 11:20:20 [7684] bl460g1n6    pengine:   notice: LogActions: 	Start   prmVM3	(bl460g1n6)
Oct 21 11:20:20 [7684] bl460g1n6    pengine:   notice: LogActions: 	Start   prmStonith6-1	(bl460g1n7)
Oct 21 11:20:20 [7684] bl460g1n6    pengine:   notice: LogActions: 	Start   prmStonith6-2	(bl460g1n7)
Oct 21 11:20:20 [7684] bl460g1n6    pengine:   notice: LogActions: 	Start   prmStonith7-1	(bl460g1n8)
Oct 21 11:20:20 [7684] bl460g1n6    pengine:   notice: LogActions: 	Start   prmStonith7-2	(bl460g1n8)
Oct 21 11:20:20 [7684] bl460g1n6    pengine:   notice: LogActions: 	Start   prmStonith8-1	(bl460g1n7)
Oct 21 11:20:20 [7684] bl460g1n6    pengine:   notice: LogActions: 	Start   prmStonith8-2	(bl460g1n7)
Oct 21 11:20:20 [7684] bl460g1n6    pengine:   notice: LogActions: 	Start   prmPing:0	(bl460g1n6)
Oct 21 11:20:20 [7684] bl460g1n6    pengine:   notice: LogActions: 	Start   prmPing:1	(bl460g1n7)
Oct 21 11:20:20 [7684] bl460g1n6    pengine:   notice: LogActions: 	Start   prmPing:2	(bl460g1n8)
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: action2xml: 	Using anonymous clone name prmPing for prmPing:0 (aka. (null))
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: action2xml: 	Using anonymous clone name prmPing for prmPing:0 (aka. (null))
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: action2xml: 	Using anonymous clone name prmPing for prmPing:0 (aka. (null))
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: action2xml: 	Using anonymous clone name prmPing for prmPing:1 (aka. (null))
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: action2xml: 	Using anonymous clone name prmPing for prmPing:1 (aka. (null))
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: action2xml: 	Using anonymous clone name prmPing for prmPing:1 (aka. (null))
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: action2xml: 	Using anonymous clone name prmPing for prmPing:2 (aka. (null))
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: action2xml: 	Using anonymous clone name prmPing for prmPing:2 (aka. (null))
Oct 21 11:20:20 [7684] bl460g1n6    pengine:    debug: action2xml: 	Using anonymous clone name prmPing for prmPing:2 (aka. (null))
Oct 21 11:20:20 [7684] bl460g1n6    pengine:   notice: process_pe_message: 	Calculated Transition 1: /var/lib/pacemaker/pengine/pe-input-1.bz2
Oct 21 11:20:20 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_PE_SUCCESS: [ state=S_POLICY_ENGINE cause=C_IPC_MESSAGE origin=handle_response ]
Oct 21 11:20:20 [7685] bl460g1n6       crmd:     info: do_state_transition: 	State transition S_POLICY_ENGINE -> S_TRANSITION_ENGINE [ input=I_PE_SUCCESS cause=C_IPC_MESSAGE origin=handle_response ]
Oct 21 11:20:20 [7685] bl460g1n6       crmd:    debug: unpack_graph: 	Unpacked transition 1: 66 actions in 66 synapses
Oct 21 11:20:20 [7685] bl460g1n6       crmd:     info: do_te_invoke: 	Processing graph 1 (ref=pe_calc-dc-1382322020-32) derived from /var/lib/pacemaker/pengine/pe-input-1.bz2
Oct 21 11:20:20 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 26: monitor prmVM1_monitor_0 on bl460g1n8
Oct 21 11:20:20 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 15: monitor prmVM1_monitor_0 on bl460g1n7
Oct 21 11:20:20 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 4: monitor prmVM1_monitor_0 on bl460g1n6 (local)
Oct 21 11:20:20 [7682] bl460g1n6       lrmd:     info: process_lrmd_get_rsc_info: 	Resource 'prmVM1' not found (0 active resources)
Oct 21 11:20:20 [7682] bl460g1n6       lrmd:    debug: process_lrmd_message: 	Processed lrmd_rsc_info operation from 666c66a8-da92-4298-beea-12fd671d2b0d: rc=0, reply=0, notify=0, exit=4201864
Oct 21 11:20:20 [7682] bl460g1n6       lrmd:     info: process_lrmd_rsc_register: 	Added 'prmVM1' to the rsc list (1 active resources)
Oct 21 11:20:20 [7682] bl460g1n6       lrmd:    debug: process_lrmd_message: 	Processed lrmd_rsc_register operation from 666c66a8-da92-4298-beea-12fd671d2b0d: rc=0, reply=1, notify=1, exit=4201864
Oct 21 11:20:20 [7682] bl460g1n6       lrmd:    debug: process_lrmd_message: 	Processed lrmd_rsc_info operation from 666c66a8-da92-4298-beea-12fd671d2b0d: rc=0, reply=0, notify=0, exit=4201864
Oct 21 11:20:20 [7685] bl460g1n6       crmd:     info: do_lrm_rsc_op: 	Performing key=4:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5 op=prmVM1_monitor_0
Oct 21 11:20:20 [7682] bl460g1n6       lrmd:    debug: process_lrmd_message: 	Processed lrmd_rsc_exec operation from 666c66a8-da92-4298-beea-12fd671d2b0d: rc=5, reply=1, notify=0, exit=4201864
Oct 21 11:20:20 [7682] bl460g1n6       lrmd:    debug: log_execute: 	executing - rsc:prmVM1 action:monitor call_id:5
Oct 21 11:20:20 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 27: monitor prmVM2_monitor_0 on bl460g1n8
Oct 21 11:20:20 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 16: monitor prmVM2_monitor_0 on bl460g1n7
Oct 21 11:20:20 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 5: monitor prmVM2_monitor_0 on bl460g1n6 (local)
Oct 21 11:20:20 [7682] bl460g1n6       lrmd:     info: process_lrmd_get_rsc_info: 	Resource 'prmVM2' not found (1 active resources)
Oct 21 11:20:20 [7682] bl460g1n6       lrmd:    debug: process_lrmd_message: 	Processed lrmd_rsc_info operation from 666c66a8-da92-4298-beea-12fd671d2b0d: rc=0, reply=0, notify=0, exit=4201864
Oct 21 11:20:20 [7682] bl460g1n6       lrmd:     info: process_lrmd_rsc_register: 	Added 'prmVM2' to the rsc list (2 active resources)
Oct 21 11:20:20 [7682] bl460g1n6       lrmd:    debug: process_lrmd_message: 	Processed lrmd_rsc_register operation from 666c66a8-da92-4298-beea-12fd671d2b0d: rc=0, reply=1, notify=1, exit=4201864
Oct 21 11:20:20 [7682] bl460g1n6       lrmd:    debug: process_lrmd_message: 	Processed lrmd_rsc_info operation from 666c66a8-da92-4298-beea-12fd671d2b0d: rc=0, reply=0, notify=0, exit=4201864
Oct 21 11:20:20 [7685] bl460g1n6       crmd:     info: do_lrm_rsc_op: 	Performing key=5:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5 op=prmVM2_monitor_0
Oct 21 11:20:20 [7682] bl460g1n6       lrmd:    debug: process_lrmd_message: 	Processed lrmd_rsc_exec operation from 666c66a8-da92-4298-beea-12fd671d2b0d: rc=9, reply=1, notify=0, exit=4201864
Oct 21 11:20:20 [7682] bl460g1n6       lrmd:    debug: log_execute: 	executing - rsc:prmVM2 action:monitor call_id:9
Oct 21 11:20:20 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 28: monitor prmVM3_monitor_0 on bl460g1n8
Oct 21 11:20:20 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 17: monitor prmVM3_monitor_0 on bl460g1n7
Oct 21 11:20:20 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 6: monitor prmVM3_monitor_0 on bl460g1n6 (local)
Oct 21 11:20:20 [7682] bl460g1n6       lrmd:     info: process_lrmd_get_rsc_info: 	Resource 'prmVM3' not found (2 active resources)
Oct 21 11:20:20 [7682] bl460g1n6       lrmd:    debug: process_lrmd_message: 	Processed lrmd_rsc_info operation from 666c66a8-da92-4298-beea-12fd671d2b0d: rc=0, reply=0, notify=0, exit=4201864
Oct 21 11:20:20 [7682] bl460g1n6       lrmd:     info: process_lrmd_rsc_register: 	Added 'prmVM3' to the rsc list (3 active resources)
Oct 21 11:20:20 [7682] bl460g1n6       lrmd:    debug: process_lrmd_message: 	Processed lrmd_rsc_register operation from 666c66a8-da92-4298-beea-12fd671d2b0d: rc=0, reply=1, notify=1, exit=4201864
Oct 21 11:20:20 [7682] bl460g1n6       lrmd:    debug: process_lrmd_message: 	Processed lrmd_rsc_info operation from 666c66a8-da92-4298-beea-12fd671d2b0d: rc=0, reply=0, notify=0, exit=4201864
Oct 21 11:20:20 [7685] bl460g1n6       crmd:     info: do_lrm_rsc_op: 	Performing key=6:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5 op=prmVM3_monitor_0
Oct 21 11:20:20 [7682] bl460g1n6       lrmd:    debug: process_lrmd_message: 	Processed lrmd_rsc_exec operation from 666c66a8-da92-4298-beea-12fd671d2b0d: rc=13, reply=1, notify=0, exit=4201864
Oct 21 11:20:20 [7682] bl460g1n6       lrmd:    debug: log_execute: 	executing - rsc:prmVM3 action:monitor call_id:13
Oct 21 11:20:20 [7685] bl460g1n6       crmd:    debug: te_pseudo_action: 	Pseudo action 46 fired and confirmed
Oct 21 11:20:20 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 29: monitor prmStonith6-1_monitor_0 on bl460g1n8
Oct 21 11:20:20 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 18: monitor prmStonith6-1_monitor_0 on bl460g1n7
Oct 21 11:20:20 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 7: monitor prmStonith6-1_monitor_0 on bl460g1n6 (local)
Oct 21 11:20:20 [7682] bl460g1n6       lrmd:     info: process_lrmd_get_rsc_info: 	Resource 'prmStonith6-1' not found (3 active resources)
Oct 21 11:20:20 [7682] bl460g1n6       lrmd:    debug: process_lrmd_message: 	Processed lrmd_rsc_info operation from 666c66a8-da92-4298-beea-12fd671d2b0d: rc=0, reply=0, notify=0, exit=4201864
Oct 21 11:20:20 [7682] bl460g1n6       lrmd:     info: process_lrmd_rsc_register: 	Added 'prmStonith6-1' to the rsc list (4 active resources)
Oct 21 11:20:20 [7682] bl460g1n6       lrmd:    debug: process_lrmd_message: 	Processed lrmd_rsc_register operation from 666c66a8-da92-4298-beea-12fd671d2b0d: rc=0, reply=1, notify=1, exit=4201864
Oct 21 11:20:20 [7682] bl460g1n6       lrmd:    debug: process_lrmd_message: 	Processed lrmd_rsc_info operation from 666c66a8-da92-4298-beea-12fd671d2b0d: rc=0, reply=0, notify=0, exit=4201864
Oct 21 11:20:20 [7685] bl460g1n6       crmd:     info: do_lrm_rsc_op: 	Performing key=7:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5 op=prmStonith6-1_monitor_0
Oct 21 11:20:20 [7682] bl460g1n6       lrmd:    debug: process_lrmd_message: 	Processed lrmd_rsc_exec operation from 666c66a8-da92-4298-beea-12fd671d2b0d: rc=17, reply=1, notify=0, exit=4201864
Oct 21 11:20:20 [7682] bl460g1n6       lrmd:    debug: log_execute: 	executing - rsc:prmStonith6-1 action:monitor call_id:17
Oct 21 11:20:20 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 30: monitor prmStonith6-2_monitor_0 on bl460g1n8
Oct 21 11:20:20 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 19: monitor prmStonith6-2_monitor_0 on bl460g1n7
Oct 21 11:20:20 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 8: monitor prmStonith6-2_monitor_0 on bl460g1n6 (local)
VirtualDomain(prmVM1)[7789]:	2013/10/21_11:20:20 DEBUG: Virtual domain vm1 is currently running.
Oct 21 11:20:20 [7680] bl460g1n6        cib:    debug: activateCibXml: 	Triggering CIB write for cib_modify op
Oct 21 11:20:20 [7680] bl460g1n6        cib:   notice: cib:diff: 	Diff: --- 0.8.7
Oct 21 11:20:20 [7680] bl460g1n6        cib:   notice: cib:diff: 	Diff: +++ 0.9.1 d8a6e5052646e205343278a9d1e7e1ca
Oct 21 11:20:20 [7680] bl460g1n6        cib:   notice: cib:diff: 	-- <cib admin_epoch="0" epoch="8" num_updates="7"/>
Oct 21 11:20:20 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         <utilization id="prmVM1-utilization">
Oct 21 11:20:20 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <nvpair id="prmVM1-utilization-cpu" name="cpu" value="1"/>
Oct 21 11:20:20 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         </utilization>
Oct 21 11:20:20 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section resources: OK (rc=0, origin=bl460g1n8/crm_resource/5, version=0.9.1)
Oct 21 11:20:20 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section resources: OK (rc=0, origin=bl460g1n7/crm_resource/5, version=0.9.1)
VirtualDomain(prmVM2)[7790]:	2013/10/21_11:20:20 DEBUG: Virtual domain vm2 is currently running.
Oct 21 11:20:20 [7680] bl460g1n6        cib:    debug: activateCibXml: 	Triggering CIB write for cib_modify op
Oct 21 11:20:20 [7680] bl460g1n6        cib:   notice: log_cib_diff: 	cib:diff: Local-only Change: 0.10.1
Oct 21 11:20:20 [7680] bl460g1n6        cib:   notice: cib:diff: 	-- <cib admin_epoch="0" epoch="9" num_updates="1"/>
Oct 21 11:20:20 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         <utilization id="prmVM2-utilization">
Oct 21 11:20:20 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <nvpair id="prmVM2-utilization-cpu" name="cpu" value="1"/>
Oct 21 11:20:20 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         </utilization>
Oct 21 11:20:20 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section resources: OK (rc=0, origin=bl460g1n8/crm_resource/5, version=0.10.1)
VirtualDomain(prmVM3)[7791]:	2013/10/21_11:20:20 DEBUG: Virtual domain vm3 is currently running.
Oct 21 11:20:20 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section resources: OK (rc=0, origin=bl460g1n7/crm_resource/5, version=0.10.1)
Oct 21 11:20:20 [7680] bl460g1n6        cib:    debug: activateCibXml: 	Triggering CIB write for cib_modify op
Oct 21 11:20:20 [7680] bl460g1n6        cib:   notice: log_cib_diff: 	cib:diff: Local-only Change: 0.11.1
Oct 21 11:20:20 [7680] bl460g1n6        cib:   notice: cib:diff: 	-- <cib admin_epoch="0" epoch="10" num_updates="1"/>
Oct 21 11:20:20 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         <utilization id="prmVM3-utilization">
Oct 21 11:20:20 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <nvpair id="prmVM3-utilization-cpu" name="cpu" value="1"/>
Oct 21 11:20:20 [7680] bl460g1n6        cib:   notice: cib:diff: 	++         </utilization>
Oct 21 11:20:20 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section resources: OK (rc=0, origin=bl460g1n7/crm_resource/5, version=0.11.1)
Oct 21 11:20:20 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section resources: OK (rc=0, origin=bl460g1n8/crm_resource/5, version=0.11.1)
Oct 21 11:20:21 [7680] bl460g1n6        cib:     info: write_cib_contents: 	Archived previous version as /var/lib/pacemaker/cib/cib-4.raw
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: write_cib_contents: 	Writing CIB to disk
Oct 21 11:20:21 [7680] bl460g1n6        cib:     info: crm_client_new: 	Connecting 0xc0f470 for uid=0 gid=0 pid=7884 id=db3d3d26-61b2-48f3-a13c-d06d0d5852bd
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: handle_new_connection: 	IPC credentials authenticated (7680-7884-14)
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_ipcs_shm_connect: 	connecting to client [7884]
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:21 [7884] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:21 [7884] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:21 [7884] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:21 [7884] bl460g1n6 crm_resource:    debug: cib_native_signon_raw: 	Connection to CIB successful
Oct 21 11:20:21 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crm_resource/2, version=0.11.1)
Oct 21 11:20:21 [7884] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH timeout: 60000
Oct 21 11:20:21 [7884] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH of failed nodes is enabled
Oct 21 11:20:21 [7884] bl460g1n6 crm_resource:    debug: unpack_config: 	Stop all active resources: false
Oct 21 11:20:21 [7884] bl460g1n6 crm_resource:    debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Oct 21 11:20:21 [7884] bl460g1n6 crm_resource:    debug: unpack_config: 	Default stickiness: 0
Oct 21 11:20:21 [7884] bl460g1n6 crm_resource:    debug: unpack_config: 	On loss of CCM Quorum: Freeze resources
Oct 21 11:20:21 [7884] bl460g1n6 crm_resource:    debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Oct 21 11:20:21 [7884] bl460g1n6 crm_resource:  warning: handle_startup_fencing: 	Blind faith: not fencing unseen nodes
Oct 21 11:20:21 [7884] bl460g1n6 crm_resource:    debug: unpack_domains: 	Unpacking domains
Oct 21 11:20:21 [7884] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n8 is active
Oct 21 11:20:21 [7884] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n8 is online
Oct 21 11:20:21 [7884] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n6 is active
Oct 21 11:20:21 [7884] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n6 is online
Oct 21 11:20:21 [7884] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n7 is active
Oct 21 11:20:21 [7884] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n7 is online
Oct 21 11:20:21 [7884] bl460g1n6 crm_resource:    debug: dump_resource_attr: 	Looking up cpu in prmVM1
Oct 21 11:20:21 [7884] bl460g1n6 crm_resource:    debug: cib_native_signoff: 	Signing out of the CIB Service
Oct 21 11:20:21 [7884] bl460g1n6 crm_resource:    debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Oct 21 11:20:21 [7884] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-request-7680-7884-14-header
Oct 21 11:20:21 [7884] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-response-7680-7884-14-header
Oct 21 11:20:21 [7884] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-event-7680-7884-14-header
Oct 21 11:20:21 [7884] bl460g1n6 crm_resource:     info: crm_xml_cleanup: 	Cleaning up memory from libxml2
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: activateCibXml: 	Triggering CIB write for cib_modify op
Oct 21 11:20:21 [7680] bl460g1n6        cib:     info: write_cib_contents: 	Wrote version 0.11.0 of the CIB to disk (digest: 386307e745636a0ab3808ef7dcbad794)
Oct 21 11:20:21 [7680] bl460g1n6        cib:   notice: log_cib_diff: 	cib:diff: Local-only Change: 0.12.1
Oct 21 11:20:21 [7680] bl460g1n6        cib:   notice: cib:diff: 	-- <cib admin_epoch="0" epoch="11" num_updates="1"/>
Oct 21 11:20:21 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <nvpair id="prmVM1-utilization-hv_memory" name="hv_memory" value="2048"/>
Oct 21 11:20:21 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section resources: OK (rc=0, origin=bl460g1n8/crm_resource/5, version=0.12.1)
Oct 21 11:20:21 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section resources: OK (rc=0, origin=bl460g1n7/crm_resource/5, version=0.12.1)
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: activateCibXml: 	Triggering CIB write for cib_modify op
Oct 21 11:20:21 [7680] bl460g1n6        cib:   notice: log_cib_diff: 	cib:diff: Local-only Change: 0.13.1
Oct 21 11:20:21 [7680] bl460g1n6        cib:   notice: cib:diff: 	-- <cib admin_epoch="0" epoch="12" num_updates="1"/>
Oct 21 11:20:21 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <nvpair id="prmVM2-utilization-hv_memory" name="hv_memory" value="2048"/>
Oct 21 11:20:21 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section resources: OK (rc=0, origin=bl460g1n7/crm_resource/5, version=0.13.1)
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: activateCibXml: 	Triggering CIB write for cib_modify op
Oct 21 11:20:21 [7680] bl460g1n6        cib:   notice: log_cib_diff: 	cib:diff: Local-only Change: 0.14.1
Oct 21 11:20:21 [7680] bl460g1n6        cib:   notice: cib:diff: 	-- <cib admin_epoch="0" epoch="13" num_updates="1"/>
Oct 21 11:20:21 [7680] bl460g1n6        cib:   notice: cib:diff: 	++           <nvpair id="prmVM3-utilization-hv_memory" name="hv_memory" value="2048"/>
Oct 21 11:20:21 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section resources: OK (rc=0, origin=bl460g1n7/crm_resource/5, version=0.14.1)
Oct 21 11:20:21 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section resources: OK (rc=0, origin=bl460g1n8/crm_resource/5, version=0.14.1)
Oct 21 11:20:21 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section resources: OK (rc=0, origin=bl460g1n8/crm_resource/5, version=0.14.1)
Oct 21 11:20:21 [7680] bl460g1n6        cib:     info: crm_client_new: 	Connecting 0xc124a0 for uid=0 gid=0 pid=7888 id=0ac898d7-d0fd-46bd-9729-bb6bae9be2b7
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: handle_new_connection: 	IPC credentials authenticated (7680-7888-15)
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_ipcs_shm_connect: 	connecting to client [7888]
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: write_cib_contents: 	Wrote digest 386307e745636a0ab3808ef7dcbad794 to disk
Oct 21 11:20:21 [7680] bl460g1n6        cib:     info: retrieveCib: 	Reading cluster configuration from: /var/lib/pacemaker/cib/cib.vdhi3v (digest: /var/lib/pacemaker/cib/cib.QWxpGR)
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_ipcs_dispatch_connection_request: 	HUP conn (7680-7884-14)
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(7680-7884-14) state:2
Oct 21 11:20:21 [7680] bl460g1n6        cib:     info: crm_client_destroy: 	Destroying 0 events
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-7680-7884-14-header
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-7680-7884-14-header
Oct 21 11:20:21 [7888] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-7680-7884-14-header
Oct 21 11:20:21 [7888] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:21 [7680] bl460g1n6        cib:     info: crm_client_new: 	Connecting 0xc10e00 for uid=0 gid=0 pid=7886 id=20edd739-8cbb-4cac-a289-e0f21c580f0c
Oct 21 11:20:21 [7888] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: handle_new_connection: 	IPC credentials authenticated (7680-7886-14)
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_ipcs_shm_connect: 	connecting to client [7886]
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:21 [7886] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:21 [7886] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:21 [7886] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:21 [7888] bl460g1n6 crm_resource:    debug: cib_native_signon_raw: 	Connection to CIB successful
Oct 21 11:20:21 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crm_resource/2, version=0.14.1)
Oct 21 11:20:21 [7886] bl460g1n6 crm_resource:    debug: cib_native_signon_raw: 	Connection to CIB successful
Oct 21 11:20:21 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crm_resource/2, version=0.14.1)
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: write_cib_contents: 	Activating /var/lib/pacemaker/cib/cib.vdhi3v
Oct 21 11:20:21 [7888] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH timeout: 60000
Oct 21 11:20:21 [7888] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH of failed nodes is enabled
Oct 21 11:20:21 [7886] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH timeout: 60000
Oct 21 11:20:21 [7888] bl460g1n6 crm_resource:    debug: unpack_config: 	Stop all active resources: false
Oct 21 11:20:21 [7886] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH of failed nodes is enabled
Oct 21 11:20:21 [7888] bl460g1n6 crm_resource:    debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Oct 21 11:20:21 [7886] bl460g1n6 crm_resource:    debug: unpack_config: 	Stop all active resources: false
Oct 21 11:20:21 [7888] bl460g1n6 crm_resource:    debug: unpack_config: 	Default stickiness: 0
Oct 21 11:20:21 [7886] bl460g1n6 crm_resource:    debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Oct 21 11:20:21 [7888] bl460g1n6 crm_resource:    debug: unpack_config: 	On loss of CCM Quorum: Freeze resources
Oct 21 11:20:21 [7886] bl460g1n6 crm_resource:    debug: unpack_config: 	Default stickiness: 0
Oct 21 11:20:21 [7888] bl460g1n6 crm_resource:    debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Oct 21 11:20:21 [7886] bl460g1n6 crm_resource:    debug: unpack_config: 	On loss of CCM Quorum: Freeze resources
Oct 21 11:20:21 [7886] bl460g1n6 crm_resource:    debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Oct 21 11:20:21 [7888] bl460g1n6 crm_resource:  warning: handle_startup_fencing: 	Blind faith: not fencing unseen nodes
Oct 21 11:20:21 [7888] bl460g1n6 crm_resource:    debug: unpack_domains: 	Unpacking domains
Oct 21 11:20:21 [7886] bl460g1n6 crm_resource:  warning: handle_startup_fencing: 	Blind faith: not fencing unseen nodes
Oct 21 11:20:21 [7886] bl460g1n6 crm_resource:    debug: unpack_domains: 	Unpacking domains
Oct 21 11:20:21 [7888] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n8 is active
Oct 21 11:20:21 [7888] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n8 is online
Oct 21 11:20:21 [7888] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n6 is active
Oct 21 11:20:21 [7888] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n6 is online
Oct 21 11:20:21 [7888] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n7 is active
Oct 21 11:20:21 [7888] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n7 is online
Oct 21 11:20:21 [7888] bl460g1n6 crm_resource:    debug: dump_resource_attr: 	Looking up cpu in prmVM3
Oct 21 11:20:21 [7886] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n8 is active
Oct 21 11:20:21 [7886] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n8 is online
Oct 21 11:20:21 [7886] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n6 is active
Oct 21 11:20:21 [7886] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n6 is online
Oct 21 11:20:21 [7886] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n7 is active
Oct 21 11:20:21 [7886] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n7 is online
Oct 21 11:20:21 [7886] bl460g1n6 crm_resource:    debug: dump_resource_attr: 	Looking up cpu in prmVM2
Oct 21 11:20:21 [7888] bl460g1n6 crm_resource:    debug: cib_native_signoff: 	Signing out of the CIB Service
Oct 21 11:20:21 [7888] bl460g1n6 crm_resource:    debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_ipcs_dispatch_connection_request: 	HUP conn (7680-7888-15)
Oct 21 11:20:21 [7888] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-request-7680-7888-15-header
Oct 21 11:20:21 [7886] bl460g1n6 crm_resource:    debug: cib_native_signoff: 	Signing out of the CIB Service
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(7680-7888-15) state:2
Oct 21 11:20:21 [7888] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-response-7680-7888-15-header
Oct 21 11:20:21 [7886] bl460g1n6 crm_resource:    debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Oct 21 11:20:21 [7888] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-event-7680-7888-15-header
Oct 21 11:20:21 [7886] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-request-7680-7886-14-header
Oct 21 11:20:21 [7886] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-response-7680-7886-14-header
Oct 21 11:20:21 [7888] bl460g1n6 crm_resource:     info: crm_xml_cleanup: 	Cleaning up memory from libxml2
Oct 21 11:20:21 [7680] bl460g1n6        cib:     info: crm_client_destroy: 	Destroying 0 events
Oct 21 11:20:21 [7886] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-event-7680-7886-14-header
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-7680-7888-15-header
Oct 21 11:20:21 [7886] bl460g1n6 crm_resource:     info: crm_xml_cleanup: 	Cleaning up memory from libxml2
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-7680-7888-15-header
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-7680-7888-15-header
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_ipcs_dispatch_connection_request: 	HUP conn (7680-7886-14)
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(7680-7886-14) state:2
Oct 21 11:20:21 [7680] bl460g1n6        cib:     info: crm_client_destroy: 	Destroying 0 events
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-7680-7886-14-header
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-7680-7886-14-header
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-7680-7886-14-header
Oct 21 11:20:21 [7680] bl460g1n6        cib:     info: crm_client_new: 	Connecting 0xc124a0 for uid=0 gid=0 pid=7894 id=00863edc-a4c7-4493-9f76-bf7ec2152f32
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: handle_new_connection: 	IPC credentials authenticated (7680-7894-14)
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_ipcs_shm_connect: 	connecting to client [7894]
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:21 [7894] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:21 [7894] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:21 [7894] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:21 [7894] bl460g1n6 crm_resource:    debug: cib_native_signon_raw: 	Connection to CIB successful
Oct 21 11:20:21 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crm_resource/2, version=0.14.1)
Oct 21 11:20:21 [7894] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH timeout: 60000
Oct 21 11:20:21 [7894] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH of failed nodes is enabled
Oct 21 11:20:21 [7894] bl460g1n6 crm_resource:    debug: unpack_config: 	Stop all active resources: false
Oct 21 11:20:21 [7894] bl460g1n6 crm_resource:    debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Oct 21 11:20:21 [7894] bl460g1n6 crm_resource:    debug: unpack_config: 	Default stickiness: 0
Oct 21 11:20:21 [7894] bl460g1n6 crm_resource:    debug: unpack_config: 	On loss of CCM Quorum: Freeze resources
Oct 21 11:20:21 [7894] bl460g1n6 crm_resource:    debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Oct 21 11:20:21 [7894] bl460g1n6 crm_resource:  warning: handle_startup_fencing: 	Blind faith: not fencing unseen nodes
Oct 21 11:20:21 [7894] bl460g1n6 crm_resource:    debug: unpack_domains: 	Unpacking domains
Oct 21 11:20:21 [7894] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n8 is active
Oct 21 11:20:21 [7894] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n8 is online
Oct 21 11:20:21 [7894] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n6 is active
Oct 21 11:20:21 [7894] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n6 is online
Oct 21 11:20:21 [7894] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n7 is active
Oct 21 11:20:21 [7894] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n7 is online
Oct 21 11:20:21 [7894] bl460g1n6 crm_resource:    debug: dump_resource_attr: 	Looking up hv_memory in prmVM1
Oct 21 11:20:21 [7894] bl460g1n6 crm_resource:    debug: cib_native_signoff: 	Signing out of the CIB Service
Oct 21 11:20:21 [7894] bl460g1n6 crm_resource:    debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_ipcs_dispatch_connection_request: 	HUP conn (7680-7894-14)
Oct 21 11:20:21 [7894] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-request-7680-7894-14-header
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(7680-7894-14) state:2
Oct 21 11:20:21 [7894] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-response-7680-7894-14-header
Oct 21 11:20:21 [7894] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-event-7680-7894-14-header
Oct 21 11:20:21 [7894] bl460g1n6 crm_resource:     info: crm_xml_cleanup: 	Cleaning up memory from libxml2
Oct 21 11:20:21 [7680] bl460g1n6        cib:     info: crm_client_destroy: 	Destroying 0 events
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-7680-7894-14-header
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-7680-7894-14-header
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-7680-7894-14-header
Oct 21 11:20:21 [7680] bl460g1n6        cib:     info: write_cib_contents: 	Archived previous version as /var/lib/pacemaker/cib/cib-5.raw
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: write_cib_contents: 	Writing CIB to disk
Oct 21 11:20:21 [7680] bl460g1n6        cib:     info: crm_client_new: 	Connecting 0xc124a0 for uid=0 gid=0 pid=7905 id=5172bb1f-d640-4d95-8d01-dde495147f99
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: handle_new_connection: 	IPC credentials authenticated (7680-7905-14)
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_ipcs_shm_connect: 	connecting to client [7905]
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:21 [7905] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:21 [7905] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:21 [7905] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:21 [7905] bl460g1n6 crm_resource:    debug: cib_native_signon_raw: 	Connection to CIB successful
Oct 21 11:20:21 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crm_resource/2, version=0.14.1)
Oct 21 11:20:21 [7680] bl460g1n6        cib:     info: crm_client_new: 	Connecting 0xc10e00 for uid=0 gid=0 pid=7907 id=4ec8d76f-7cf3-4767-a9ef-8b21cd4edd95
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: handle_new_connection: 	IPC credentials authenticated (7680-7907-15)
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_ipcs_shm_connect: 	connecting to client [7907]
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:21 [7907] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:21 [7907] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:21 [7907] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:21 [7680] bl460g1n6        cib:     info: write_cib_contents: 	Wrote version 0.14.0 of the CIB to disk (digest: d03db20616fa0f9347c494c5a518bf99)
Oct 21 11:20:21 [7905] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH timeout: 60000
Oct 21 11:20:21 [7905] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH of failed nodes is enabled
Oct 21 11:20:21 [7905] bl460g1n6 crm_resource:    debug: unpack_config: 	Stop all active resources: false
Oct 21 11:20:21 [7905] bl460g1n6 crm_resource:    debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Oct 21 11:20:21 [7905] bl460g1n6 crm_resource:    debug: unpack_config: 	Default stickiness: 0
Oct 21 11:20:21 [7905] bl460g1n6 crm_resource:    debug: unpack_config: 	On loss of CCM Quorum: Freeze resources
Oct 21 11:20:21 [7905] bl460g1n6 crm_resource:    debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Oct 21 11:20:21 [7905] bl460g1n6 crm_resource:  warning: handle_startup_fencing: 	Blind faith: not fencing unseen nodes
Oct 21 11:20:21 [7905] bl460g1n6 crm_resource:    debug: unpack_domains: 	Unpacking domains
Oct 21 11:20:21 [7907] bl460g1n6 crm_resource:    debug: cib_native_signon_raw: 	Connection to CIB successful
Oct 21 11:20:21 [7905] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n8 is active
Oct 21 11:20:21 [7905] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n8 is online
Oct 21 11:20:21 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crm_resource/2, version=0.14.1)
Oct 21 11:20:21 [7905] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n6 is active
Oct 21 11:20:21 [7905] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n6 is online
Oct 21 11:20:21 [7905] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n7 is active
Oct 21 11:20:21 [7905] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n7 is online
Oct 21 11:20:21 [7905] bl460g1n6 crm_resource:    debug: dump_resource_attr: 	Looking up hv_memory in prmVM2
Oct 21 11:20:21 [7905] bl460g1n6 crm_resource:    debug: cib_native_signoff: 	Signing out of the CIB Service
Oct 21 11:20:21 [7905] bl460g1n6 crm_resource:    debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Oct 21 11:20:21 [7905] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-request-7680-7905-14-header
Oct 21 11:20:21 [7905] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-response-7680-7905-14-header
Oct 21 11:20:21 [7905] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-event-7680-7905-14-header
Oct 21 11:20:21 [7905] bl460g1n6 crm_resource:     info: crm_xml_cleanup: 	Cleaning up memory from libxml2
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_ipcs_dispatch_connection_request: 	HUP conn (7680-7905-14)
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(7680-7905-14) state:2
Oct 21 11:20:21 [7680] bl460g1n6        cib:     info: crm_client_destroy: 	Destroying 0 events
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-7680-7905-14-header
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-7680-7905-14-header
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-7680-7905-14-header
Oct 21 11:20:21 [7907] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH timeout: 60000
Oct 21 11:20:21 [7907] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH of failed nodes is enabled
Oct 21 11:20:21 [7907] bl460g1n6 crm_resource:    debug: unpack_config: 	Stop all active resources: false
Oct 21 11:20:21 [7907] bl460g1n6 crm_resource:    debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Oct 21 11:20:21 [7907] bl460g1n6 crm_resource:    debug: unpack_config: 	Default stickiness: 0
Oct 21 11:20:21 [7907] bl460g1n6 crm_resource:    debug: unpack_config: 	On loss of CCM Quorum: Freeze resources
Oct 21 11:20:21 [7907] bl460g1n6 crm_resource:    debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Oct 21 11:20:21 [7907] bl460g1n6 crm_resource:  warning: handle_startup_fencing: 	Blind faith: not fencing unseen nodes
Oct 21 11:20:21 [7907] bl460g1n6 crm_resource:    debug: unpack_domains: 	Unpacking domains
Oct 21 11:20:21 [7907] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n8 is active
Oct 21 11:20:21 [7907] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n8 is online
Oct 21 11:20:21 [7907] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n6 is active
Oct 21 11:20:21 [7907] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n6 is online
Oct 21 11:20:21 [7907] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n7 is active
Oct 21 11:20:21 [7907] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n7 is online
Oct 21 11:20:21 [7907] bl460g1n6 crm_resource:    debug: dump_resource_attr: 	Looking up hv_memory in prmVM3
Oct 21 11:20:21 [7907] bl460g1n6 crm_resource:    debug: cib_native_signoff: 	Signing out of the CIB Service
Oct 21 11:20:21 [7907] bl460g1n6 crm_resource:    debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Oct 21 11:20:21 [7907] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-request-7680-7907-15-header
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_ipcs_dispatch_connection_request: 	HUP conn (7680-7907-15)
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(7680-7907-15) state:2
Oct 21 11:20:21 [7907] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-response-7680-7907-15-header
Oct 21 11:20:21 [7680] bl460g1n6        cib:     info: crm_client_destroy: 	Destroying 0 events
Oct 21 11:20:21 [7907] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-event-7680-7907-15-header
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-7680-7907-15-header
Oct 21 11:20:21 [7907] bl460g1n6 crm_resource:     info: crm_xml_cleanup: 	Cleaning up memory from libxml2
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-7680-7907-15-header
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-7680-7907-15-header
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: write_cib_contents: 	Wrote digest d03db20616fa0f9347c494c5a518bf99 to disk
Oct 21 11:20:21 [7680] bl460g1n6        cib:     info: retrieveCib: 	Reading cluster configuration from: /var/lib/pacemaker/cib/cib.QkJPGH (digest: /var/lib/pacemaker/cib/cib.2KqzF3)
Oct 21 11:20:21 [7680] bl460g1n6        cib:    debug: write_cib_contents: 	Activating /var/lib/pacemaker/cib/cib.QkJPGH
Oct 21 11:20:21 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	result = 0
Oct 21 11:20:21 [7681] bl460g1n6 stonith-ng:   notice: stonith_device_register: 	Added 'prmStonith7-2' to the device list (3 active devices)
Oct 21 11:20:21 [7681] bl460g1n6 stonith-ng:     info: cib_device_update: 	Device prmStonith8-1 is allowed on bl460g1n6: score=0
Oct 21 11:20:21 [7681] bl460g1n6 stonith-ng:     info: stonith_action_create: 	Initiating action metadata for agent fence_legacy (target=(null))
Oct 21 11:20:21 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	forking
Oct 21 11:20:21 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	sending args
Oct 21 11:20:22 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	result = 0
Oct 21 11:20:22 [7681] bl460g1n6 stonith-ng:   notice: stonith_device_register: 	Added 'prmStonith8-1' to the device list (4 active devices)
Oct 21 11:20:22 [7681] bl460g1n6 stonith-ng:     info: cib_device_update: 	Device prmStonith8-2 is allowed on bl460g1n6: score=0
Oct 21 11:20:22 [7681] bl460g1n6 stonith-ng:     info: stonith_action_create: 	Initiating action metadata for agent fence_legacy (target=(null))
Oct 21 11:20:22 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	forking
Oct 21 11:20:22 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	sending args
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	result = 0
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:   notice: stonith_device_register: 	Added 'prmStonith8-2' to the device list (5 active devices)
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: log_cib_diff: 	Config update: Local-only Change: 0.7.1
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib admin_epoch="0" epoch="6" num_updates="1"/>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="7" num_updates="1" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:18 2013" update-origin="bl460g1n6" update-client="crmd" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <configuration>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <crm_config>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <cluster_property_set id="cib-bootstrap-options">
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <nvpair id="cib-bootstrap-options-dc-version" name="dc-version" value="1.1.11-0.302.b6d42ed.git.el6-b6d42ed"/>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </cluster_property_set>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </crm_config>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </configuration>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: log_cib_diff: 	Config update: Local-only Change: 0.8.1
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib admin_epoch="0" epoch="7" num_updates="1"/>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="8" num_updates="1" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:18 2013" update-origin="bl460g1n6" update-client="crmd" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <configuration>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <crm_config>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <cluster_property_set id="cib-bootstrap-options">
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <nvpair id="cib-bootstrap-options-cluster-infrastructure" name="cluster-infrastructure" value="corosync"/>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </cluster_property_set>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </crm_config>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </configuration>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.8.1
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.8.2 0c9c2f47074174763f1dad3c7cac669e
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-  <cib num_updates="1">
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-    <status>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-      <node_state id="3232261593">
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--       <lrm id="3232261593">
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--         <lrm_resources/>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--       </lrm>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-      </node_state>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-    </status>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-  </cib>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++ <cib epoch="8" num_updates="2" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:18 2013" update-origin="bl460g1n6" update-client="crmd" have-quorum="1" dc-uuid="3232261592"/>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.8.2
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.8.3 328a1a6255a2308500487efd987910c5
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="2"/>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="8" num_updates="3" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:18 2013" update-origin="bl460g1n6" update-client="crmd" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261593" uname="bl460g1n7" in_ccm="true" crmd="online" crm-debug-origin="do_lrm_query_internal" join="member" expected="member">
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       <lrm id="3232261593">
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <lrm_resources/>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       </lrm>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.8.3
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.8.4 2e14df1e12b10df2f063131723387905
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-  <cib num_updates="3">
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-    <status>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-      <node_state id="3232261594">
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--       <lrm id="3232261594">
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--         <lrm_resources/>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--       </lrm>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-      </node_state>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-    </status>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-  </cib>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++ <cib epoch="8" num_updates="4" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:18 2013" update-origin="bl460g1n6" update-client="crmd" have-quorum="1" dc-uuid="3232261592"/>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.8.4
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.8.5 c3f23330b4e7ee013bf1a0628d958e61
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="4"/>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="8" num_updates="5" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:18 2013" update-origin="bl460g1n6" update-client="crmd" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261594" uname="bl460g1n8" in_ccm="true" crmd="online" crm-debug-origin="do_lrm_query_internal" join="member" expected="member">
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       <lrm id="3232261594">
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <lrm_resources/>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       </lrm>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.8.5
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.8.6 627e955db188390e6d560728d1c0ef56
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-  <cib num_updates="5">
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-    <status>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-      <node_state id="3232261592">
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--       <lrm id="3232261592">
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--         <lrm_resources/>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--       </lrm>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-      </node_state>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-    </status>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-  </cib>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++ <cib epoch="8" num_updates="6" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:18 2013" update-origin="bl460g1n6" update-client="crmd" have-quorum="1" dc-uuid="3232261592"/>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.8.6
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.8.7 a334019f732d60ac13a4a0118da6c041
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="6"/>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="8" num_updates="7" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:18 2013" update-origin="bl460g1n6" update-client="crmd" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261592" uname="bl460g1n6" in_ccm="true" crmd="online" crm-debug-origin="do_lrm_query_internal" join="member" expected="member">
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       <lrm id="3232261592">
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <lrm_resources/>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       </lrm>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.8.7
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.9.1 d8a6e5052646e205343278a9d1e7e1ca
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib admin_epoch="0" epoch="8" num_updates="7"/>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="9" num_updates="1" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:20 2013" update-origin="bl460g1n8" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <configuration>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <resources>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <primitive id="prmVM1" class="ocf" provider="heartbeat" type="VirtualDomain">
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <utilization id="prmVM1-utilization">
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <nvpair id="prmVM1-utilization-cpu" name="cpu" value="1"/>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         </utilization>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </primitive>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </resources>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </configuration>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:     info: update_cib_stonith_devices: 	Updating device list from the cib: new resource
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	STONITH timeout: 60000
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	STONITH of failed nodes is enabled
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	Stop all active resources: false
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	Default stickiness: 0
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	On loss of CCM Quorum: Freeze resources
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: unpack_domains: 	Unpacking domains
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: group_rsc_location: 	Processing rsc_location lo8-rule for grpStonith8
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: group_rsc_location: 	Processing rsc_location lo7-rule for grpStonith7
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: group_rsc_location: 	Processing rsc_location lo6-rule for grpStonith6
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:     info: cib_device_update: 	Device prmStonith6-1 has been disabled on bl460g1n6: score=-INFINITY
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:     info: stonith_device_remove: 	Removed 'prmStonith6-2' from the device list (4 active devices)
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:     info: cib_device_update: 	Device prmStonith6-2 is allowed on bl460g1n6: score=0
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:     info: stonith_action_create: 	Initiating action metadata for agent fence_legacy (target=(null))
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	forking
Oct 21 11:20:23 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	sending args
Oct 21 11:20:24 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	result = 0
Oct 21 11:20:24 [7681] bl460g1n6 stonith-ng:   notice: stonith_device_register: 	Added 'prmStonith6-2' to the device list (5 active devices)
Oct 21 11:20:24 [7681] bl460g1n6 stonith-ng:     info: stonith_device_remove: 	Removed 'prmStonith7-1' from the device list (4 active devices)
Oct 21 11:20:24 [7681] bl460g1n6 stonith-ng:     info: cib_device_update: 	Device prmStonith7-1 is allowed on bl460g1n6: score=0
Oct 21 11:20:24 [7681] bl460g1n6 stonith-ng:     info: stonith_action_create: 	Initiating action metadata for agent fence_legacy (target=(null))
Oct 21 11:20:24 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	forking
Oct 21 11:20:24 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	sending args
Oct 21 11:20:25 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	result = 0
Oct 21 11:20:25 [7681] bl460g1n6 stonith-ng:   notice: stonith_device_register: 	Added 'prmStonith7-1' to the device list (5 active devices)
Oct 21 11:20:25 [7681] bl460g1n6 stonith-ng:     info: stonith_device_remove: 	Removed 'prmStonith7-2' from the device list (4 active devices)
Oct 21 11:20:25 [7681] bl460g1n6 stonith-ng:     info: cib_device_update: 	Device prmStonith7-2 is allowed on bl460g1n6: score=0
Oct 21 11:20:25 [7681] bl460g1n6 stonith-ng:     info: stonith_action_create: 	Initiating action metadata for agent fence_legacy (target=(null))
Oct 21 11:20:25 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	forking
Oct 21 11:20:25 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	sending args
Oct 21 11:20:26 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	result = 0
Oct 21 11:20:26 [7681] bl460g1n6 stonith-ng:   notice: stonith_device_register: 	Added 'prmStonith7-2' to the device list (5 active devices)
Oct 21 11:20:26 [7681] bl460g1n6 stonith-ng:     info: stonith_device_remove: 	Removed 'prmStonith8-1' from the device list (4 active devices)
Oct 21 11:20:26 [7681] bl460g1n6 stonith-ng:     info: cib_device_update: 	Device prmStonith8-1 is allowed on bl460g1n6: score=0
Oct 21 11:20:26 [7681] bl460g1n6 stonith-ng:     info: stonith_action_create: 	Initiating action metadata for agent fence_legacy (target=(null))
Oct 21 11:20:26 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	forking
Oct 21 11:20:26 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	sending args
Oct 21 11:20:27 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	result = 0
Oct 21 11:20:27 [7681] bl460g1n6 stonith-ng:   notice: stonith_device_register: 	Added 'prmStonith8-1' to the device list (5 active devices)
Oct 21 11:20:27 [7681] bl460g1n6 stonith-ng:     info: stonith_device_remove: 	Removed 'prmStonith8-2' from the device list (4 active devices)
Oct 21 11:20:27 [7681] bl460g1n6 stonith-ng:     info: cib_device_update: 	Device prmStonith8-2 is allowed on bl460g1n6: score=0
Oct 21 11:20:27 [7681] bl460g1n6 stonith-ng:     info: stonith_action_create: 	Initiating action metadata for agent fence_legacy (target=(null))
Oct 21 11:20:27 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	forking
Oct 21 11:20:27 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	sending args
Oct 21 11:20:28 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	result = 0
Oct 21 11:20:28 [7681] bl460g1n6 stonith-ng:   notice: stonith_device_register: 	Added 'prmStonith8-2' to the device list (5 active devices)
Oct 21 11:20:28 [7681] bl460g1n6 stonith-ng:    debug: log_cib_diff: 	Config update: Local-only Change: 0.10.1
Oct 21 11:20:28 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib admin_epoch="0" epoch="9" num_updates="1"/>
Oct 21 11:20:28 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="10" num_updates="1" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:20 2013" update-origin="bl460g1n8" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:28 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <configuration>
Oct 21 11:20:28 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <resources>
Oct 21 11:20:28 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <primitive id="prmVM2" class="ocf" provider="heartbeat" type="VirtualDomain">
Oct 21 11:20:28 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <utilization id="prmVM2-utilization">
Oct 21 11:20:28 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <nvpair id="prmVM2-utilization-cpu" name="cpu" value="1"/>
Oct 21 11:20:28 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         </utilization>
Oct 21 11:20:28 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </primitive>
Oct 21 11:20:28 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </resources>
Oct 21 11:20:28 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </configuration>
Oct 21 11:20:28 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:28 [7681] bl460g1n6 stonith-ng:     info: update_cib_stonith_devices: 	Updating device list from the cib: new resource
Oct 21 11:20:28 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	STONITH timeout: 60000
Oct 21 11:20:28 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	STONITH of failed nodes is enabled
Oct 21 11:20:28 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	Stop all active resources: false
Oct 21 11:20:28 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Oct 21 11:20:28 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	Default stickiness: 0
Oct 21 11:20:28 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	On loss of CCM Quorum: Freeze resources
Oct 21 11:20:28 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Oct 21 11:20:28 [7681] bl460g1n6 stonith-ng:    debug: unpack_domains: 	Unpacking domains
Oct 21 11:20:28 [7681] bl460g1n6 stonith-ng:    debug: group_rsc_location: 	Processing rsc_location lo8-rule for grpStonith8
Oct 21 11:20:28 [7681] bl460g1n6 stonith-ng:    debug: group_rsc_location: 	Processing rsc_location lo7-rule for grpStonith7
Oct 21 11:20:28 [7681] bl460g1n6 stonith-ng:    debug: group_rsc_location: 	Processing rsc_location lo6-rule for grpStonith6
Oct 21 11:20:28 [7681] bl460g1n6 stonith-ng:     info: cib_device_update: 	Device prmStonith6-1 has been disabled on bl460g1n6: score=-INFINITY
Oct 21 11:20:28 [7681] bl460g1n6 stonith-ng:     info: stonith_device_remove: 	Removed 'prmStonith6-2' from the device list (4 active devices)
Oct 21 11:20:28 [7681] bl460g1n6 stonith-ng:     info: cib_device_update: 	Device prmStonith6-2 is allowed on bl460g1n6: score=0
Oct 21 11:20:28 [7681] bl460g1n6 stonith-ng:     info: stonith_action_create: 	Initiating action metadata for agent fence_legacy (target=(null))
Oct 21 11:20:28 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	forking
Oct 21 11:20:28 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	sending args
Oct 21 11:20:29 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	result = 0
Oct 21 11:20:29 [7681] bl460g1n6 stonith-ng:   notice: stonith_device_register: 	Added 'prmStonith6-2' to the device list (5 active devices)
Oct 21 11:20:29 [7681] bl460g1n6 stonith-ng:     info: stonith_device_remove: 	Removed 'prmStonith7-1' from the device list (4 active devices)
Oct 21 11:20:29 [7681] bl460g1n6 stonith-ng:     info: cib_device_update: 	Device prmStonith7-1 is allowed on bl460g1n6: score=0
Oct 21 11:20:29 [7681] bl460g1n6 stonith-ng:     info: stonith_action_create: 	Initiating action metadata for agent fence_legacy (target=(null))
Oct 21 11:20:29 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	forking
Oct 21 11:20:29 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	sending args
Oct 21 11:20:30 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	result = 0
Oct 21 11:20:30 [7681] bl460g1n6 stonith-ng:   notice: stonith_device_register: 	Added 'prmStonith7-1' to the device list (5 active devices)
Oct 21 11:20:30 [7681] bl460g1n6 stonith-ng:     info: stonith_device_remove: 	Removed 'prmStonith7-2' from the device list (4 active devices)
Oct 21 11:20:30 [7681] bl460g1n6 stonith-ng:     info: cib_device_update: 	Device prmStonith7-2 is allowed on bl460g1n6: score=0
Oct 21 11:20:30 [7681] bl460g1n6 stonith-ng:     info: stonith_action_create: 	Initiating action metadata for agent fence_legacy (target=(null))
Oct 21 11:20:30 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	forking
Oct 21 11:20:30 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	sending args
Oct 21 11:20:31 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	result = 0
Oct 21 11:20:31 [7681] bl460g1n6 stonith-ng:   notice: stonith_device_register: 	Added 'prmStonith7-2' to the device list (5 active devices)
Oct 21 11:20:31 [7681] bl460g1n6 stonith-ng:     info: stonith_device_remove: 	Removed 'prmStonith8-1' from the device list (4 active devices)
Oct 21 11:20:31 [7681] bl460g1n6 stonith-ng:     info: cib_device_update: 	Device prmStonith8-1 is allowed on bl460g1n6: score=0
Oct 21 11:20:31 [7681] bl460g1n6 stonith-ng:     info: stonith_action_create: 	Initiating action metadata for agent fence_legacy (target=(null))
Oct 21 11:20:31 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	forking
Oct 21 11:20:31 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	sending args
Oct 21 11:20:32 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	result = 0
Oct 21 11:20:32 [7681] bl460g1n6 stonith-ng:   notice: stonith_device_register: 	Added 'prmStonith8-1' to the device list (5 active devices)
Oct 21 11:20:32 [7681] bl460g1n6 stonith-ng:     info: stonith_device_remove: 	Removed 'prmStonith8-2' from the device list (4 active devices)
Oct 21 11:20:32 [7681] bl460g1n6 stonith-ng:     info: cib_device_update: 	Device prmStonith8-2 is allowed on bl460g1n6: score=0
Oct 21 11:20:32 [7681] bl460g1n6 stonith-ng:     info: stonith_action_create: 	Initiating action metadata for agent fence_legacy (target=(null))
Oct 21 11:20:32 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	forking
Oct 21 11:20:32 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	sending args
Oct 21 11:20:33 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	result = 0
Oct 21 11:20:33 [7681] bl460g1n6 stonith-ng:   notice: stonith_device_register: 	Added 'prmStonith8-2' to the device list (5 active devices)
Oct 21 11:20:33 [7681] bl460g1n6 stonith-ng:    debug: log_cib_diff: 	Config update: Local-only Change: 0.11.1
Oct 21 11:20:33 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib admin_epoch="0" epoch="10" num_updates="1"/>
Oct 21 11:20:33 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="11" num_updates="1" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:20 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:33 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <configuration>
Oct 21 11:20:33 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <resources>
Oct 21 11:20:33 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <primitive id="prmVM3" class="ocf" provider="heartbeat" type="VirtualDomain">
Oct 21 11:20:33 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <utilization id="prmVM3-utilization">
Oct 21 11:20:33 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <nvpair id="prmVM3-utilization-cpu" name="cpu" value="1"/>
Oct 21 11:20:33 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         </utilization>
Oct 21 11:20:33 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </primitive>
Oct 21 11:20:33 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </resources>
Oct 21 11:20:33 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </configuration>
Oct 21 11:20:33 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:33 [7681] bl460g1n6 stonith-ng:     info: update_cib_stonith_devices: 	Updating device list from the cib: new resource
Oct 21 11:20:33 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	STONITH timeout: 60000
Oct 21 11:20:33 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	STONITH of failed nodes is enabled
Oct 21 11:20:33 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	Stop all active resources: false
Oct 21 11:20:33 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Oct 21 11:20:33 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	Default stickiness: 0
Oct 21 11:20:33 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	On loss of CCM Quorum: Freeze resources
Oct 21 11:20:33 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Oct 21 11:20:33 [7681] bl460g1n6 stonith-ng:    debug: unpack_domains: 	Unpacking domains
Oct 21 11:20:33 [7681] bl460g1n6 stonith-ng:    debug: group_rsc_location: 	Processing rsc_location lo8-rule for grpStonith8
Oct 21 11:20:33 [7681] bl460g1n6 stonith-ng:    debug: group_rsc_location: 	Processing rsc_location lo7-rule for grpStonith7
Oct 21 11:20:33 [7681] bl460g1n6 stonith-ng:    debug: group_rsc_location: 	Processing rsc_location lo6-rule for grpStonith6
Oct 21 11:20:33 [7681] bl460g1n6 stonith-ng:     info: cib_device_update: 	Device prmStonith6-1 has been disabled on bl460g1n6: score=-INFINITY
Oct 21 11:20:33 [7681] bl460g1n6 stonith-ng:     info: stonith_device_remove: 	Removed 'prmStonith6-2' from the device list (4 active devices)
Oct 21 11:20:33 [7681] bl460g1n6 stonith-ng:     info: cib_device_update: 	Device prmStonith6-2 is allowed on bl460g1n6: score=0
Oct 21 11:20:33 [7681] bl460g1n6 stonith-ng:     info: stonith_action_create: 	Initiating action metadata for agent fence_legacy (target=(null))
Oct 21 11:20:33 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	forking
Oct 21 11:20:33 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	sending args
Oct 21 11:20:34 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	result = 0
Oct 21 11:20:34 [7681] bl460g1n6 stonith-ng:   notice: stonith_device_register: 	Added 'prmStonith6-2' to the device list (5 active devices)
Oct 21 11:20:34 [7681] bl460g1n6 stonith-ng:     info: stonith_device_remove: 	Removed 'prmStonith7-1' from the device list (4 active devices)
Oct 21 11:20:34 [7681] bl460g1n6 stonith-ng:     info: cib_device_update: 	Device prmStonith7-1 is allowed on bl460g1n6: score=0
Oct 21 11:20:34 [7681] bl460g1n6 stonith-ng:     info: stonith_action_create: 	Initiating action metadata for agent fence_legacy (target=(null))
Oct 21 11:20:34 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	forking
Oct 21 11:20:34 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	sending args
Oct 21 11:20:35 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	result = 0
Oct 21 11:20:35 [7681] bl460g1n6 stonith-ng:   notice: stonith_device_register: 	Added 'prmStonith7-1' to the device list (5 active devices)
Oct 21 11:20:35 [7681] bl460g1n6 stonith-ng:     info: stonith_device_remove: 	Removed 'prmStonith7-2' from the device list (4 active devices)
Oct 21 11:20:35 [7681] bl460g1n6 stonith-ng:     info: cib_device_update: 	Device prmStonith7-2 is allowed on bl460g1n6: score=0
Oct 21 11:20:35 [7681] bl460g1n6 stonith-ng:     info: stonith_action_create: 	Initiating action metadata for agent fence_legacy (target=(null))
Oct 21 11:20:35 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	forking
Oct 21 11:20:35 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	sending args
Oct 21 11:20:36 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	result = 0
Oct 21 11:20:36 [7681] bl460g1n6 stonith-ng:   notice: stonith_device_register: 	Added 'prmStonith7-2' to the device list (5 active devices)
Oct 21 11:20:36 [7681] bl460g1n6 stonith-ng:     info: stonith_device_remove: 	Removed 'prmStonith8-1' from the device list (4 active devices)
Oct 21 11:20:36 [7681] bl460g1n6 stonith-ng:     info: cib_device_update: 	Device prmStonith8-1 is allowed on bl460g1n6: score=0
Oct 21 11:20:36 [7681] bl460g1n6 stonith-ng:     info: stonith_action_create: 	Initiating action metadata for agent fence_legacy (target=(null))
Oct 21 11:20:36 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	forking
Oct 21 11:20:36 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	sending args
Oct 21 11:20:37 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	result = 0
Oct 21 11:20:37 [7681] bl460g1n6 stonith-ng:   notice: stonith_device_register: 	Added 'prmStonith8-1' to the device list (5 active devices)
Oct 21 11:20:37 [7681] bl460g1n6 stonith-ng:     info: stonith_device_remove: 	Removed 'prmStonith8-2' from the device list (4 active devices)
Oct 21 11:20:37 [7681] bl460g1n6 stonith-ng:     info: cib_device_update: 	Device prmStonith8-2 is allowed on bl460g1n6: score=0
Oct 21 11:20:37 [7681] bl460g1n6 stonith-ng:     info: stonith_action_create: 	Initiating action metadata for agent fence_legacy (target=(null))
Oct 21 11:20:37 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	forking
Oct 21 11:20:37 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	sending args
Oct 21 11:20:38 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	result = 0
Oct 21 11:20:38 [7681] bl460g1n6 stonith-ng:   notice: stonith_device_register: 	Added 'prmStonith8-2' to the device list (5 active devices)
Oct 21 11:20:38 [7681] bl460g1n6 stonith-ng:    debug: log_cib_diff: 	Config update: Local-only Change: 0.12.1
Oct 21 11:20:38 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib admin_epoch="0" epoch="11" num_updates="1"/>
Oct 21 11:20:38 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="12" num_updates="1" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n8" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:38 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <configuration>
Oct 21 11:20:38 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <resources>
Oct 21 11:20:38 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <primitive id="prmVM1" class="ocf" provider="heartbeat" type="VirtualDomain">
Oct 21 11:20:38 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <utilization id="prmVM1-utilization">
Oct 21 11:20:38 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <nvpair id="prmVM1-utilization-hv_memory" name="hv_memory" value="2048"/>
Oct 21 11:20:38 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </utilization>
Oct 21 11:20:38 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </primitive>
Oct 21 11:20:38 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </resources>
Oct 21 11:20:38 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </configuration>
Oct 21 11:20:38 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:38 [7681] bl460g1n6 stonith-ng:     info: update_cib_stonith_devices: 	Updating device list from the cib: new resource
Oct 21 11:20:38 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	STONITH timeout: 60000
Oct 21 11:20:38 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	STONITH of failed nodes is enabled
Oct 21 11:20:38 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	Stop all active resources: false
Oct 21 11:20:38 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Oct 21 11:20:38 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	Default stickiness: 0
Oct 21 11:20:38 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	On loss of CCM Quorum: Freeze resources
Oct 21 11:20:38 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Oct 21 11:20:38 [7681] bl460g1n6 stonith-ng:    debug: unpack_domains: 	Unpacking domains
Oct 21 11:20:38 [7681] bl460g1n6 stonith-ng:    debug: group_rsc_location: 	Processing rsc_location lo8-rule for grpStonith8
Oct 21 11:20:38 [7681] bl460g1n6 stonith-ng:    debug: group_rsc_location: 	Processing rsc_location lo7-rule for grpStonith7
Oct 21 11:20:38 [7681] bl460g1n6 stonith-ng:    debug: group_rsc_location: 	Processing rsc_location lo6-rule for grpStonith6
Oct 21 11:20:38 [7681] bl460g1n6 stonith-ng:     info: cib_device_update: 	Device prmStonith6-1 has been disabled on bl460g1n6: score=-INFINITY
Oct 21 11:20:38 [7681] bl460g1n6 stonith-ng:     info: stonith_device_remove: 	Removed 'prmStonith6-2' from the device list (4 active devices)
Oct 21 11:20:38 [7681] bl460g1n6 stonith-ng:     info: cib_device_update: 	Device prmStonith6-2 is allowed on bl460g1n6: score=0
Oct 21 11:20:38 [7681] bl460g1n6 stonith-ng:     info: stonith_action_create: 	Initiating action metadata for agent fence_legacy (target=(null))
Oct 21 11:20:38 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	forking
Oct 21 11:20:38 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	sending args
Oct 21 11:20:39 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	result = 0
Oct 21 11:20:39 [7681] bl460g1n6 stonith-ng:   notice: stonith_device_register: 	Added 'prmStonith6-2' to the device list (5 active devices)
Oct 21 11:20:39 [7681] bl460g1n6 stonith-ng:     info: stonith_device_remove: 	Removed 'prmStonith7-1' from the device list (4 active devices)
Oct 21 11:20:39 [7681] bl460g1n6 stonith-ng:     info: cib_device_update: 	Device prmStonith7-1 is allowed on bl460g1n6: score=0
Oct 21 11:20:39 [7681] bl460g1n6 stonith-ng:     info: stonith_action_create: 	Initiating action metadata for agent fence_legacy (target=(null))
Oct 21 11:20:39 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	forking
Oct 21 11:20:39 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	sending args
Oct 21 11:20:40 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	result = 0
Oct 21 11:20:40 [7681] bl460g1n6 stonith-ng:   notice: stonith_device_register: 	Added 'prmStonith7-1' to the device list (5 active devices)
Oct 21 11:20:40 [7681] bl460g1n6 stonith-ng:     info: stonith_device_remove: 	Removed 'prmStonith7-2' from the device list (4 active devices)
Oct 21 11:20:40 [7681] bl460g1n6 stonith-ng:     info: cib_device_update: 	Device prmStonith7-2 is allowed on bl460g1n6: score=0
Oct 21 11:20:40 [7681] bl460g1n6 stonith-ng:     info: stonith_action_create: 	Initiating action metadata for agent fence_legacy (target=(null))
Oct 21 11:20:40 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	forking
Oct 21 11:20:40 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	sending args
Oct 21 11:20:41 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	result = 0
Oct 21 11:20:41 [7681] bl460g1n6 stonith-ng:   notice: stonith_device_register: 	Added 'prmStonith7-2' to the device list (5 active devices)
Oct 21 11:20:41 [7681] bl460g1n6 stonith-ng:     info: stonith_device_remove: 	Removed 'prmStonith8-1' from the device list (4 active devices)
Oct 21 11:20:41 [7681] bl460g1n6 stonith-ng:     info: cib_device_update: 	Device prmStonith8-1 is allowed on bl460g1n6: score=0
Oct 21 11:20:41 [7681] bl460g1n6 stonith-ng:     info: stonith_action_create: 	Initiating action metadata for agent fence_legacy (target=(null))
Oct 21 11:20:41 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	forking
Oct 21 11:20:41 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	sending args
Oct 21 11:20:42 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	result = 0
Oct 21 11:20:42 [7681] bl460g1n6 stonith-ng:   notice: stonith_device_register: 	Added 'prmStonith8-1' to the device list (5 active devices)
Oct 21 11:20:42 [7681] bl460g1n6 stonith-ng:     info: stonith_device_remove: 	Removed 'prmStonith8-2' from the device list (4 active devices)
Oct 21 11:20:42 [7681] bl460g1n6 stonith-ng:     info: cib_device_update: 	Device prmStonith8-2 is allowed on bl460g1n6: score=0
Oct 21 11:20:42 [7681] bl460g1n6 stonith-ng:     info: stonith_action_create: 	Initiating action metadata for agent fence_legacy (target=(null))
Oct 21 11:20:42 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	forking
Oct 21 11:20:42 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	sending args
Oct 21 11:20:43 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	result = 0
Oct 21 11:20:43 [7681] bl460g1n6 stonith-ng:   notice: stonith_device_register: 	Added 'prmStonith8-2' to the device list (5 active devices)
Oct 21 11:20:43 [7681] bl460g1n6 stonith-ng:    debug: log_cib_diff: 	Config update: Local-only Change: 0.13.1
Oct 21 11:20:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib admin_epoch="0" epoch="12" num_updates="1"/>
Oct 21 11:20:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="13" num_updates="1" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <configuration>
Oct 21 11:20:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <resources>
Oct 21 11:20:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <primitive id="prmVM2" class="ocf" provider="heartbeat" type="VirtualDomain">
Oct 21 11:20:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <utilization id="prmVM2-utilization">
Oct 21 11:20:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <nvpair id="prmVM2-utilization-hv_memory" name="hv_memory" value="2048"/>
Oct 21 11:20:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </utilization>
Oct 21 11:20:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </primitive>
Oct 21 11:20:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </resources>
Oct 21 11:20:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </configuration>
Oct 21 11:20:43 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:43 [7681] bl460g1n6 stonith-ng:     info: update_cib_stonith_devices: 	Updating device list from the cib: new resource
Oct 21 11:20:43 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	STONITH timeout: 60000
Oct 21 11:20:43 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	STONITH of failed nodes is enabled
Oct 21 11:20:43 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	Stop all active resources: false
Oct 21 11:20:43 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Oct 21 11:20:43 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	Default stickiness: 0
Oct 21 11:20:43 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	On loss of CCM Quorum: Freeze resources
Oct 21 11:20:43 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Oct 21 11:20:43 [7681] bl460g1n6 stonith-ng:    debug: unpack_domains: 	Unpacking domains
Oct 21 11:20:43 [7681] bl460g1n6 stonith-ng:    debug: group_rsc_location: 	Processing rsc_location lo8-rule for grpStonith8
Oct 21 11:20:43 [7681] bl460g1n6 stonith-ng:    debug: group_rsc_location: 	Processing rsc_location lo7-rule for grpStonith7
Oct 21 11:20:43 [7681] bl460g1n6 stonith-ng:    debug: group_rsc_location: 	Processing rsc_location lo6-rule for grpStonith6
Oct 21 11:20:43 [7681] bl460g1n6 stonith-ng:     info: cib_device_update: 	Device prmStonith6-1 has been disabled on bl460g1n6: score=-INFINITY
Oct 21 11:20:43 [7681] bl460g1n6 stonith-ng:     info: stonith_device_remove: 	Removed 'prmStonith6-2' from the device list (4 active devices)
Oct 21 11:20:43 [7681] bl460g1n6 stonith-ng:     info: cib_device_update: 	Device prmStonith6-2 is allowed on bl460g1n6: score=0
Oct 21 11:20:43 [7681] bl460g1n6 stonith-ng:     info: stonith_action_create: 	Initiating action metadata for agent fence_legacy (target=(null))
Oct 21 11:20:43 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	forking
Oct 21 11:20:43 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	sending args
Oct 21 11:20:44 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	result = 0
Oct 21 11:20:44 [7681] bl460g1n6 stonith-ng:   notice: stonith_device_register: 	Added 'prmStonith6-2' to the device list (5 active devices)
Oct 21 11:20:44 [7681] bl460g1n6 stonith-ng:     info: stonith_device_remove: 	Removed 'prmStonith7-1' from the device list (4 active devices)
Oct 21 11:20:44 [7681] bl460g1n6 stonith-ng:     info: cib_device_update: 	Device prmStonith7-1 is allowed on bl460g1n6: score=0
Oct 21 11:20:44 [7681] bl460g1n6 stonith-ng:     info: stonith_action_create: 	Initiating action metadata for agent fence_legacy (target=(null))
Oct 21 11:20:44 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	forking
Oct 21 11:20:44 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	sending args
Oct 21 11:20:45 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	result = 0
Oct 21 11:20:45 [7681] bl460g1n6 stonith-ng:   notice: stonith_device_register: 	Added 'prmStonith7-1' to the device list (5 active devices)
Oct 21 11:20:45 [7681] bl460g1n6 stonith-ng:     info: stonith_device_remove: 	Removed 'prmStonith7-2' from the device list (4 active devices)
Oct 21 11:20:45 [7681] bl460g1n6 stonith-ng:     info: cib_device_update: 	Device prmStonith7-2 is allowed on bl460g1n6: score=0
Oct 21 11:20:45 [7681] bl460g1n6 stonith-ng:     info: stonith_action_create: 	Initiating action metadata for agent fence_legacy (target=(null))
Oct 21 11:20:45 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	forking
Oct 21 11:20:45 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	sending args
Oct 21 11:20:46 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	result = 0
Oct 21 11:20:46 [7681] bl460g1n6 stonith-ng:   notice: stonith_device_register: 	Added 'prmStonith7-2' to the device list (5 active devices)
Oct 21 11:20:46 [7681] bl460g1n6 stonith-ng:     info: stonith_device_remove: 	Removed 'prmStonith8-1' from the device list (4 active devices)
Oct 21 11:20:46 [7681] bl460g1n6 stonith-ng:     info: cib_device_update: 	Device prmStonith8-1 is allowed on bl460g1n6: score=0
Oct 21 11:20:46 [7681] bl460g1n6 stonith-ng:     info: stonith_action_create: 	Initiating action metadata for agent fence_legacy (target=(null))
Oct 21 11:20:46 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	forking
Oct 21 11:20:46 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	sending args
Oct 21 11:20:47 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	result = 0
Oct 21 11:20:47 [7681] bl460g1n6 stonith-ng:   notice: stonith_device_register: 	Added 'prmStonith8-1' to the device list (5 active devices)
Oct 21 11:20:47 [7681] bl460g1n6 stonith-ng:     info: stonith_device_remove: 	Removed 'prmStonith8-2' from the device list (4 active devices)
Oct 21 11:20:47 [7681] bl460g1n6 stonith-ng:     info: cib_device_update: 	Device prmStonith8-2 is allowed on bl460g1n6: score=0
Oct 21 11:20:47 [7681] bl460g1n6 stonith-ng:     info: stonith_action_create: 	Initiating action metadata for agent fence_legacy (target=(null))
Oct 21 11:20:47 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	forking
Oct 21 11:20:47 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	sending args
Oct 21 11:20:48 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	result = 0
Oct 21 11:20:48 [7681] bl460g1n6 stonith-ng:   notice: stonith_device_register: 	Added 'prmStonith8-2' to the device list (5 active devices)
Oct 21 11:20:48 [7681] bl460g1n6 stonith-ng:    debug: log_cib_diff: 	Config update: Local-only Change: 0.14.1
Oct 21 11:20:48 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib admin_epoch="0" epoch="13" num_updates="1"/>
Oct 21 11:20:48 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="1" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:48 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <configuration>
Oct 21 11:20:48 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <resources>
Oct 21 11:20:48 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <primitive id="prmVM3" class="ocf" provider="heartbeat" type="VirtualDomain">
Oct 21 11:20:48 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <utilization id="prmVM3-utilization">
Oct 21 11:20:48 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <nvpair id="prmVM3-utilization-hv_memory" name="hv_memory" value="2048"/>
Oct 21 11:20:48 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </utilization>
Oct 21 11:20:48 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </primitive>
Oct 21 11:20:48 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </resources>
Oct 21 11:20:48 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </configuration>
Oct 21 11:20:48 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:48 [7681] bl460g1n6 stonith-ng:     info: update_cib_stonith_devices: 	Updating device list from the cib: new resource
Oct 21 11:20:48 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	STONITH timeout: 60000
Oct 21 11:20:48 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	STONITH of failed nodes is enabled
Oct 21 11:20:48 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	Stop all active resources: false
Oct 21 11:20:48 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Oct 21 11:20:48 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	Default stickiness: 0
Oct 21 11:20:48 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	On loss of CCM Quorum: Freeze resources
Oct 21 11:20:48 [7681] bl460g1n6 stonith-ng:    debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Oct 21 11:20:48 [7681] bl460g1n6 stonith-ng:    debug: unpack_domains: 	Unpacking domains
Oct 21 11:20:48 [7681] bl460g1n6 stonith-ng:    debug: group_rsc_location: 	Processing rsc_location lo8-rule for grpStonith8
Oct 21 11:20:48 [7681] bl460g1n6 stonith-ng:    debug: group_rsc_location: 	Processing rsc_location lo7-rule for grpStonith7
Oct 21 11:20:48 [7681] bl460g1n6 stonith-ng:    debug: group_rsc_location: 	Processing rsc_location lo6-rule for grpStonith6
Oct 21 11:20:48 [7681] bl460g1n6 stonith-ng:     info: cib_device_update: 	Device prmStonith6-1 has been disabled on bl460g1n6: score=-INFINITY
Oct 21 11:20:48 [7681] bl460g1n6 stonith-ng:     info: stonith_device_remove: 	Removed 'prmStonith6-2' from the device list (4 active devices)
Oct 21 11:20:48 [7681] bl460g1n6 stonith-ng:     info: cib_device_update: 	Device prmStonith6-2 is allowed on bl460g1n6: score=0
Oct 21 11:20:48 [7681] bl460g1n6 stonith-ng:     info: stonith_action_create: 	Initiating action metadata for agent fence_legacy (target=(null))
Oct 21 11:20:48 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	forking
Oct 21 11:20:48 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	sending args
Oct 21 11:20:49 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	result = 0
Oct 21 11:20:49 [7681] bl460g1n6 stonith-ng:   notice: stonith_device_register: 	Added 'prmStonith6-2' to the device list (5 active devices)
Oct 21 11:20:49 [7681] bl460g1n6 stonith-ng:     info: stonith_device_remove: 	Removed 'prmStonith7-1' from the device list (4 active devices)
Oct 21 11:20:49 [7681] bl460g1n6 stonith-ng:     info: cib_device_update: 	Device prmStonith7-1 is allowed on bl460g1n6: score=0
Oct 21 11:20:49 [7681] bl460g1n6 stonith-ng:     info: stonith_action_create: 	Initiating action metadata for agent fence_legacy (target=(null))
Oct 21 11:20:49 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	forking
Oct 21 11:20:49 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	sending args
Oct 21 11:20:50 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	result = 0
Oct 21 11:20:50 [7681] bl460g1n6 stonith-ng:   notice: stonith_device_register: 	Added 'prmStonith7-1' to the device list (5 active devices)
Oct 21 11:20:50 [7681] bl460g1n6 stonith-ng:     info: stonith_device_remove: 	Removed 'prmStonith7-2' from the device list (4 active devices)
Oct 21 11:20:50 [7681] bl460g1n6 stonith-ng:     info: cib_device_update: 	Device prmStonith7-2 is allowed on bl460g1n6: score=0
Oct 21 11:20:50 [7681] bl460g1n6 stonith-ng:     info: stonith_action_create: 	Initiating action metadata for agent fence_legacy (target=(null))
Oct 21 11:20:50 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	forking
Oct 21 11:20:50 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	sending args
Oct 21 11:20:51 [7685] bl460g1n6       crmd:  warning: crm_ipc_send: 	Request 90 to lrmd (0x1eacdd0) failed: OK (0)
Oct 21 11:20:51 [7685] bl460g1n6       crmd:    error: crm_element_value: 	Couldn't find lrmd_callid in NULL
Oct 21 11:20:51 [7685] bl460g1n6       crmd:    error: crm_abort: 	crm_element_value: Triggered assert at xml.c:3336 : data != NULL
Oct 21 11:20:51 [7685] bl460g1n6       crmd:    error: crm_element_value: 	Couldn't find lrmd_rc in NULL
Oct 21 11:20:51 [7685] bl460g1n6       crmd:    error: crm_abort: 	crm_element_value: Triggered assert at xml.c:3336 : data != NULL
Oct 21 11:20:51 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	result = 0
Oct 21 11:20:51 [7681] bl460g1n6 stonith-ng:   notice: stonith_device_register: 	Added 'prmStonith7-2' to the device list (5 active devices)
Oct 21 11:20:51 [7681] bl460g1n6 stonith-ng:     info: stonith_device_remove: 	Removed 'prmStonith8-1' from the device list (4 active devices)
Oct 21 11:20:51 [7681] bl460g1n6 stonith-ng:     info: cib_device_update: 	Device prmStonith8-1 is allowed on bl460g1n6: score=0
Oct 21 11:20:51 [7681] bl460g1n6 stonith-ng:     info: stonith_action_create: 	Initiating action metadata for agent fence_legacy (target=(null))
Oct 21 11:20:51 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	forking
Oct 21 11:20:51 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	sending args
Oct 21 11:20:52 [7663] bl460g1n6 corosync debug   [CPG   ] cpg.c:message_handler_req_exec_cpg_procleave:1276 got procleave message from cluster node -1062705702
Oct 21 11:20:52 [7663] bl460g1n6 corosync debug   [CPG   ] cpg.c:message_handler_req_exec_cpg_procleave:1276 got procleave message from cluster node -1062705703
Oct 21 11:20:52 [7676] bl460g1n6 pacemakerd:    debug: update_node_processes: 	Node bl460g1n8 now has process list: 00000000000000000000000000111110 (was 00000000000000000000000000111310)
Oct 21 11:20:52 [7676] bl460g1n6 pacemakerd:    debug: update_node_processes: 	Node bl460g1n8 now has process list: 00000000000000000000000000111110 (was 00000000000000000000000000111310)
Oct 21 11:20:52 [7676] bl460g1n6 pacemakerd:    debug: update_node_processes: 	Node bl460g1n8 now has process list: 00000000000000000000000000111310 (was 00000000000000000000000000111110)
Oct 21 11:20:52 [7676] bl460g1n6 pacemakerd:    debug: update_node_processes: 	Node bl460g1n8 now has process list: 00000000000000000000000000111310 (was 00000000000000000000000000111110)
Oct 21 11:20:52 [7676] bl460g1n6 pacemakerd:    debug: update_node_processes: 	Node bl460g1n7 now has process list: 00000000000000000000000000111110 (was 00000000000000000000000000111310)
Oct 21 11:20:52 [7676] bl460g1n6 pacemakerd:    debug: update_node_processes: 	Node bl460g1n7 now has process list: 00000000000000000000000000111110 (was 00000000000000000000000000111310)
Oct 21 11:20:52 [7676] bl460g1n6 pacemakerd:    debug: update_node_processes: 	Node bl460g1n7 now has process list: 00000000000000000000000000111310 (was 00000000000000000000000000111110)
Oct 21 11:20:52 [7676] bl460g1n6 pacemakerd:    debug: update_node_processes: 	Node bl460g1n7 now has process list: 00000000000000000000000000111310 (was 00000000000000000000000000111110)
Oct 21 11:20:52 [7663] bl460g1n6 corosync debug   [CPG   ] cpg.c:message_handler_req_exec_cpg_procjoin:1260 got procjoin message from cluster node -1062705702 (r(0) ip(192.168.101.218) r(1) ip(192.168.102.218) ) for pid 1841
Oct 21 11:20:52 [7663] bl460g1n6 corosync debug   [CPG   ] cpg.c:message_handler_req_exec_cpg_procjoin:1260 got procjoin message from cluster node -1062705703 (r(0) ip(192.168.101.217) r(1) ip(192.168.102.217) ) for pid 12703
Oct 21 11:20:52 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	result = 0
Oct 21 11:20:52 [7681] bl460g1n6 stonith-ng:   notice: stonith_device_register: 	Added 'prmStonith8-1' to the device list (5 active devices)
Oct 21 11:20:52 [7681] bl460g1n6 stonith-ng:     info: stonith_device_remove: 	Removed 'prmStonith8-2' from the device list (4 active devices)
Oct 21 11:20:52 [7681] bl460g1n6 stonith-ng:     info: cib_device_update: 	Device prmStonith8-2 is allowed on bl460g1n6: score=0
Oct 21 11:20:52 [7681] bl460g1n6 stonith-ng:     info: stonith_action_create: 	Initiating action metadata for agent fence_legacy (target=(null))
Oct 21 11:20:52 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	forking
Oct 21 11:20:52 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	sending args
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: internal_stonith_action_execute: 	result = 0
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:   notice: stonith_device_register: 	Added 'prmStonith8-2' to the device list (5 active devices)
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:     info: crm_client_new: 	Connecting 0x1ea7c30 for uid=0 gid=0 pid=7682 id=85093e05-8bcc-40ba-adfe-1cdbcb846770
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: handle_new_connection: 	IPC credentials authenticated (7681-7682-10)
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: qb_ipcs_shm_connect: 	connecting to client [7682]
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: stonith_command: 	Processing register 1 from lrmd.7682 (               0)
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:     info: stonith_command: 	Processed register from lrmd.7682: OK (0)
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: stonith_api_signon: 	Connection to STONITH successful
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: stonith_command: 	Processing st_notify 2 from lrmd.7682 (               0)
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: handle_request: 	Setting st_notify_disconnect callbacks for lrmd.7682 (85093e05-8bcc-40ba-adfe-1cdbcb846770): ON
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:     info: stonith_command: 	Processed st_notify from lrmd.7682: OK (0)
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: log_finished: 	finished - rsc:prmStonith6-1 action:monitor call_id:17  exit-code:7 exec-time:33067ms queue-time:0ms
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmVM1_monitor_0:7789 - exited with rc=0
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmVM1_monitor_0:7789:stderr [ -- empty -- ]
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmVM1_monitor_0:7789:stdout [ -- empty -- ]
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: log_finished: 	finished - rsc:prmVM1 action:monitor call_id:5 pid:7789 exit-code:0 exec-time:33075ms queue-time:0ms
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmVM2_monitor_0:7790 - exited with rc=0
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmVM2_monitor_0:7790:stderr [ -- empty -- ]
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmVM2_monitor_0:7790:stdout [ -- empty -- ]
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: log_finished: 	finished - rsc:prmVM2 action:monitor call_id:9 pid:7790 exit-code:0 exec-time:33073ms queue-time:0ms
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmVM3_monitor_0:7791 - exited with rc=0
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmVM3_monitor_0:7791:stderr [ -- empty -- ]
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmVM3_monitor_0:7791:stdout [ -- empty -- ]
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: log_finished: 	finished - rsc:prmVM3 action:monitor call_id:13 pid:7791 exit-code:0 exec-time:33070ms queue-time:1ms
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:     info: process_lrmd_get_rsc_info: 	Resource 'prmStonith6-2' not found (4 active resources)
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: process_lrmd_message: 	Processed lrmd_rsc_info operation from 666c66a8-da92-4298-beea-12fd671d2b0d: rc=0, reply=0, notify=0, exit=4201864
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:     info: process_lrmd_rsc_register: 	Added 'prmStonith6-2' to the rsc list (5 active resources)
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: process_lrmd_message: 	Processed lrmd_rsc_register operation from 666c66a8-da92-4298-beea-12fd671d2b0d: rc=0, reply=1, notify=1, exit=4201864
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    error: internal_ipc_get_reply: 	Discarding old reply 90 (need 91)
Oct 21 11:20:53 [7685] bl460g1n6       crmd:   notice: internal_ipc_get_reply: 	OldIpcReply   <lrmd_reply lrmd_origin="process_lrmd_get_rsc_info" lrmd_rc="-19" lrmd_callid="18"/>
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: process_lrmd_message: 	Processed lrmd_rsc_info operation from 666c66a8-da92-4298-beea-12fd671d2b0d: rc=0, reply=0, notify=0, exit=4201864
Oct 21 11:20:53 [7685] bl460g1n6       crmd:     info: do_lrm_rsc_op: 	Performing key=8:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5 op=prmStonith6-2_monitor_0
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: process_lrmd_message: 	Processed lrmd_rsc_exec operation from 666c66a8-da92-4298-beea-12fd671d2b0d: rc=21, reply=1, notify=0, exit=4201864
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: log_execute: 	executing - rsc:prmStonith6-2 action:monitor call_id:21
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: log_finished: 	finished - rsc:prmStonith6-2 action:monitor call_id:21  exit-code:7 exec-time:0ms queue-time:0ms
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: te_pseudo_action: 	Pseudo action 54 fired and confirmed
Oct 21 11:20:53 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 31: monitor prmStonith7-1_monitor_0 on bl460g1n8
Oct 21 11:20:53 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 20: monitor prmStonith7-1_monitor_0 on bl460g1n7
Oct 21 11:20:53 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 9: monitor prmStonith7-1_monitor_0 on bl460g1n6 (local)
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:     info: process_lrmd_get_rsc_info: 	Resource 'prmStonith7-1' not found (5 active resources)
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: process_lrmd_message: 	Processed lrmd_rsc_info operation from 666c66a8-da92-4298-beea-12fd671d2b0d: rc=0, reply=0, notify=0, exit=4201864
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:     info: process_lrmd_rsc_register: 	Added 'prmStonith7-1' to the rsc list (6 active resources)
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: process_lrmd_message: 	Processed lrmd_rsc_register operation from 666c66a8-da92-4298-beea-12fd671d2b0d: rc=0, reply=1, notify=1, exit=4201864
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: process_lrmd_message: 	Processed lrmd_rsc_info operation from 666c66a8-da92-4298-beea-12fd671d2b0d: rc=0, reply=0, notify=0, exit=4201864
Oct 21 11:20:53 [7685] bl460g1n6       crmd:     info: do_lrm_rsc_op: 	Performing key=9:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5 op=prmStonith7-1_monitor_0
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: process_lrmd_message: 	Processed lrmd_rsc_exec operation from 666c66a8-da92-4298-beea-12fd671d2b0d: rc=25, reply=1, notify=0, exit=4201864
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: log_execute: 	executing - rsc:prmStonith7-1 action:monitor call_id:25
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: log_finished: 	finished - rsc:prmStonith7-1 action:monitor call_id:25  exit-code:7 exec-time:0ms queue-time:0ms
Oct 21 11:20:53 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 32: monitor prmStonith7-2_monitor_0 on bl460g1n8
Oct 21 11:20:53 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 21: monitor prmStonith7-2_monitor_0 on bl460g1n7
Oct 21 11:20:53 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 10: monitor prmStonith7-2_monitor_0 on bl460g1n6 (local)
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:     info: process_lrmd_get_rsc_info: 	Resource 'prmStonith7-2' not found (6 active resources)
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: process_lrmd_message: 	Processed lrmd_rsc_info operation from 666c66a8-da92-4298-beea-12fd671d2b0d: rc=0, reply=0, notify=0, exit=4201864
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:     info: process_lrmd_rsc_register: 	Added 'prmStonith7-2' to the rsc list (7 active resources)
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: process_lrmd_message: 	Processed lrmd_rsc_register operation from 666c66a8-da92-4298-beea-12fd671d2b0d: rc=0, reply=1, notify=1, exit=4201864
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: process_lrmd_message: 	Processed lrmd_rsc_info operation from 666c66a8-da92-4298-beea-12fd671d2b0d: rc=0, reply=0, notify=0, exit=4201864
Oct 21 11:20:53 [7685] bl460g1n6       crmd:     info: do_lrm_rsc_op: 	Performing key=10:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5 op=prmStonith7-2_monitor_0
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: process_lrmd_message: 	Processed lrmd_rsc_exec operation from 666c66a8-da92-4298-beea-12fd671d2b0d: rc=29, reply=1, notify=0, exit=4201864
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: log_execute: 	executing - rsc:prmStonith7-2 action:monitor call_id:29
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: log_finished: 	finished - rsc:prmStonith7-2 action:monitor call_id:29  exit-code:7 exec-time:0ms queue-time:0ms
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: te_pseudo_action: 	Pseudo action 62 fired and confirmed
Oct 21 11:20:53 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 33: monitor prmStonith8-1_monitor_0 on bl460g1n8
Oct 21 11:20:53 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 22: monitor prmStonith8-1_monitor_0 on bl460g1n7
Oct 21 11:20:53 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 11: monitor prmStonith8-1_monitor_0 on bl460g1n6 (local)
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:     info: process_lrmd_get_rsc_info: 	Resource 'prmStonith8-1' not found (7 active resources)
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: process_lrmd_message: 	Processed lrmd_rsc_info operation from 666c66a8-da92-4298-beea-12fd671d2b0d: rc=0, reply=0, notify=0, exit=4201864
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:     info: process_lrmd_rsc_register: 	Added 'prmStonith8-1' to the rsc list (8 active resources)
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: process_lrmd_message: 	Processed lrmd_rsc_register operation from 666c66a8-da92-4298-beea-12fd671d2b0d: rc=0, reply=1, notify=1, exit=4201864
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: process_lrmd_message: 	Processed lrmd_rsc_info operation from 666c66a8-da92-4298-beea-12fd671d2b0d: rc=0, reply=0, notify=0, exit=4201864
Oct 21 11:20:53 [7685] bl460g1n6       crmd:     info: do_lrm_rsc_op: 	Performing key=11:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5 op=prmStonith8-1_monitor_0
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: process_lrmd_message: 	Processed lrmd_rsc_exec operation from 666c66a8-da92-4298-beea-12fd671d2b0d: rc=33, reply=1, notify=0, exit=4201864
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: log_execute: 	executing - rsc:prmStonith8-1 action:monitor call_id:33
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: log_finished: 	finished - rsc:prmStonith8-1 action:monitor call_id:33  exit-code:7 exec-time:0ms queue-time:0ms
Oct 21 11:20:53 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 34: monitor prmStonith8-2_monitor_0 on bl460g1n8
Oct 21 11:20:53 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 23: monitor prmStonith8-2_monitor_0 on bl460g1n7
Oct 21 11:20:53 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 12: monitor prmStonith8-2_monitor_0 on bl460g1n6 (local)
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:     info: process_lrmd_get_rsc_info: 	Resource 'prmStonith8-2' not found (8 active resources)
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: process_lrmd_message: 	Processed lrmd_rsc_info operation from 666c66a8-da92-4298-beea-12fd671d2b0d: rc=0, reply=0, notify=0, exit=4201864
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:     info: process_lrmd_rsc_register: 	Added 'prmStonith8-2' to the rsc list (9 active resources)
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: process_lrmd_message: 	Processed lrmd_rsc_register operation from 666c66a8-da92-4298-beea-12fd671d2b0d: rc=0, reply=1, notify=1, exit=4201864
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: process_lrmd_message: 	Processed lrmd_rsc_info operation from 666c66a8-da92-4298-beea-12fd671d2b0d: rc=0, reply=0, notify=0, exit=4201864
Oct 21 11:20:53 [7685] bl460g1n6       crmd:     info: do_lrm_rsc_op: 	Performing key=12:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5 op=prmStonith8-2_monitor_0
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: process_lrmd_message: 	Processed lrmd_rsc_exec operation from 666c66a8-da92-4298-beea-12fd671d2b0d: rc=37, reply=1, notify=0, exit=4201864
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: log_execute: 	executing - rsc:prmStonith8-2 action:monitor call_id:37
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: log_finished: 	finished - rsc:prmStonith8-2 action:monitor call_id:37  exit-code:7 exec-time:0ms queue-time:0ms
Oct 21 11:20:53 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 13: monitor prmPing:0_monitor_0 on bl460g1n6 (local)
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:     info: process_lrmd_get_rsc_info: 	Resource 'prmPing' not found (9 active resources)
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: process_lrmd_message: 	Processed lrmd_rsc_info operation from 666c66a8-da92-4298-beea-12fd671d2b0d: rc=0, reply=0, notify=0, exit=4201864
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:     info: process_lrmd_get_rsc_info: 	Resource 'prmPing:0' not found (9 active resources)
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: process_lrmd_message: 	Processed lrmd_rsc_info operation from 666c66a8-da92-4298-beea-12fd671d2b0d: rc=0, reply=0, notify=0, exit=4201864
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:     info: process_lrmd_rsc_register: 	Added 'prmPing' to the rsc list (10 active resources)
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: process_lrmd_message: 	Processed lrmd_rsc_register operation from 666c66a8-da92-4298-beea-12fd671d2b0d: rc=0, reply=1, notify=1, exit=4201864
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: process_lrmd_message: 	Processed lrmd_rsc_info operation from 666c66a8-da92-4298-beea-12fd671d2b0d: rc=0, reply=0, notify=0, exit=4201864
Oct 21 11:20:53 [7685] bl460g1n6       crmd:     info: do_lrm_rsc_op: 	Performing key=13:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5 op=prmPing_monitor_0
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: process_lrmd_message: 	Processed lrmd_rsc_exec operation from 666c66a8-da92-4298-beea-12fd671d2b0d: rc=42, reply=1, notify=0, exit=4201864
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: log_execute: 	executing - rsc:prmPing action:monitor call_id:42
Oct 21 11:20:53 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 24: monitor prmPing:1_monitor_0 on bl460g1n7
Oct 21 11:20:53 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 35: monitor prmPing:2_monitor_0 on bl460g1n8
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: run_graph: 	Throttling output: batch limit (30) reached
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 1 (Complete=0, Pending=30, Fired=33, Skipped=0, Incomplete=27, Source=/var/lib/pacemaker/pengine/pe-input-1.bz2): In-progress
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.8.7 -> 0.9.1 (S_TRANSITION_ENGINE)
Oct 21 11:20:53 [7685] bl460g1n6       crmd:     info: abort_transition_graph: 	te_update_diff:126 - Triggered transition abort (complete=0, node=, tag=diff, id=(null), magic=NA, cib=0.9.1) : Non-status change
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause   <diff crm_feature_set="3.0.7" digest="d8a6e5052646e205343278a9d1e7e1ca">
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause     <diff-removed admin_epoch="0" epoch="8" num_updates="7">
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause       <cib admin_epoch="0" epoch="8" num_updates="7"/>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause     </diff-removed>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause     <diff-added>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause       <cib epoch="9" num_updates="1" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:20 2013" update-origin="bl460g1n8" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause         <configuration>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause           <resources>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             <primitive id="prmVM1" class="ocf" provider="heartbeat" type="VirtualDomain">
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               <utilization id="prmVM1-utilization" __crm_diff_marker__="added:top">
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <nvpair id="prmVM1-utilization-cpu" name="cpu" value="1"/>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               </utilization>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             </primitive>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause           </resources>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause         </configuration>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause       </cib>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause     </diff-added>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause   </diff>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: update_abort_priority: 	Abort priority upgraded from 0 to 1000000
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: update_abort_priority: 	Abort action done superceeded by restart
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.9.1 -> 0.10.1 (S_TRANSITION_ENGINE)
Oct 21 11:20:53 [7685] bl460g1n6       crmd:     info: abort_transition_graph: 	te_update_diff:126 - Triggered transition abort (complete=0, node=, tag=diff, id=(null), magic=NA, cib=0.10.1) : Non-status change
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause   <diff crm_feature_set="3.0.7" digest="44410f64dd923810788e44f3f2b39418">
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause     <diff-removed admin_epoch="0" epoch="9" num_updates="1">
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause       <cib admin_epoch="0" epoch="9" num_updates="1"/>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause     </diff-removed>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause     <diff-added>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause       <cib epoch="10" num_updates="1" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:20 2013" update-origin="bl460g1n8" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause         <configuration>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause           <resources>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             <primitive id="prmVM2" class="ocf" provider="heartbeat" type="VirtualDomain">
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               <utilization id="prmVM2-utilization" __crm_diff_marker__="added:top">
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <nvpair id="prmVM2-utilization-cpu" name="cpu" value="1"/>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               </utilization>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             </primitive>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause           </resources>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause         </configuration>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause       </cib>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause     </diff-added>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause   </diff>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.10.1 -> 0.11.1 (S_TRANSITION_ENGINE)
Oct 21 11:20:53 [7685] bl460g1n6       crmd:     info: abort_transition_graph: 	te_update_diff:126 - Triggered transition abort (complete=0, node=, tag=diff, id=(null), magic=NA, cib=0.11.1) : Non-status change
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause   <diff crm_feature_set="3.0.7" digest="470b067daaa58c5251c3aa91969e4bf4">
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause     <diff-removed admin_epoch="0" epoch="10" num_updates="1">
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause       <cib admin_epoch="0" epoch="10" num_updates="1"/>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause     </diff-removed>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause     <diff-added>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause       <cib epoch="11" num_updates="1" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:20 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause         <configuration>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause           <resources>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             <primitive id="prmVM3" class="ocf" provider="heartbeat" type="VirtualDomain">
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               <utilization id="prmVM3-utilization" __crm_diff_marker__="added:top">
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <nvpair id="prmVM3-utilization-cpu" name="cpu" value="1"/>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               </utilization>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             </primitive>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause           </resources>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause         </configuration>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause       </cib>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause     </diff-added>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause   </diff>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.11.1 -> 0.12.1 (S_TRANSITION_ENGINE)
Oct 21 11:20:53 [7685] bl460g1n6       crmd:     info: abort_transition_graph: 	te_update_diff:126 - Triggered transition abort (complete=0, node=, tag=diff, id=(null), magic=NA, cib=0.12.1) : Non-status change
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause   <diff crm_feature_set="3.0.7" digest="6832c5685d4afeccbc2cf16f88987da6">
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause     <diff-removed admin_epoch="0" epoch="11" num_updates="1">
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause       <cib admin_epoch="0" epoch="11" num_updates="1"/>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause     </diff-removed>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause     <diff-added>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause       <cib epoch="12" num_updates="1" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n8" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause         <configuration>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause           <resources>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             <primitive id="prmVM1" class="ocf" provider="heartbeat" type="VirtualDomain">
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               <utilization id="prmVM1-utilization">
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <nvpair id="prmVM1-utilization-hv_memory" name="hv_memory" value="2048" __crm_diff_marker__="added:top"/>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               </utilization>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             </primitive>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause           </resources>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause         </configuration>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause       </cib>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause     </diff-added>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause   </diff>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.12.1 -> 0.13.1 (S_TRANSITION_ENGINE)
Oct 21 11:20:53 [7685] bl460g1n6       crmd:     info: abort_transition_graph: 	te_update_diff:126 - Triggered transition abort (complete=0, node=, tag=diff, id=(null), magic=NA, cib=0.13.1) : Non-status change
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause   <diff crm_feature_set="3.0.7" digest="63ee27ddf3ff33e330094c1b6b4564c1">
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause     <diff-removed admin_epoch="0" epoch="12" num_updates="1">
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause       <cib admin_epoch="0" epoch="12" num_updates="1"/>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause     </diff-removed>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause     <diff-added>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause       <cib epoch="13" num_updates="1" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause         <configuration>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause           <resources>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             <primitive id="prmVM2" class="ocf" provider="heartbeat" type="VirtualDomain">
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               <utilization id="prmVM2-utilization">
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <nvpair id="prmVM2-utilization-hv_memory" name="hv_memory" value="2048" __crm_diff_marker__="added:top"/>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               </utilization>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             </primitive>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause           </resources>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause         </configuration>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause       </cib>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause     </diff-added>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause   </diff>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.13.1 -> 0.14.1 (S_TRANSITION_ENGINE)
Oct 21 11:20:53 [7685] bl460g1n6       crmd:     info: abort_transition_graph: 	te_update_diff:126 - Triggered transition abort (complete=0, node=, tag=diff, id=(null), magic=NA, cib=0.14.1) : Non-status change
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause   <diff crm_feature_set="3.0.7" digest="8f80fc0e8ac375c07cf3f640126098ed">
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause     <diff-removed admin_epoch="0" epoch="13" num_updates="1">
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause       <cib admin_epoch="0" epoch="13" num_updates="1"/>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause     </diff-removed>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause     <diff-added>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause       <cib epoch="14" num_updates="1" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause         <configuration>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause           <resources>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             <primitive id="prmVM3" class="ocf" provider="heartbeat" type="VirtualDomain">
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               <utilization id="prmVM3-utilization">
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause                 <nvpair id="prmVM3-utilization-hv_memory" name="hv_memory" value="2048" __crm_diff_marker__="added:top"/>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause               </utilization>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause             </primitive>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause           </resources>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause         </configuration>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause       </cib>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause     </diff-added>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause   </diff>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: create_operation_update: 	do_update_resource: Updating resource prmStonith6-1 after monitor op complete (interval=0)
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmPing_monitor_0:7961 - exited with rc=7
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmPing_monitor_0:7961:stderr [ -- empty -- ]
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmPing_monitor_0:7961:stdout [ -- empty -- ]
Oct 21 11:20:53 [7682] bl460g1n6       lrmd:    debug: log_finished: 	finished - rsc:prmPing action:monitor call_id:42 pid:7961 exit-code:7 exec-time:24ms queue-time:0ms
Oct 21 11:20:53 [7685] bl460g1n6       crmd:     info: process_lrm_event: 	LRM operation prmStonith6-1_monitor_0 (call=17, rc=7, cib-update=60, confirmed=true) not running
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: update_history_cache: 	Updating history for 'prmStonith6-1' with monitor op
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: create_operation_update: 	do_update_resource: Updating resource prmVM1 after monitor op complete (interval=0)
Oct 21 11:20:53 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/crmd/60, version=0.14.2)
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.1
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.2 22005875acbc815a128998134c8a0773
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="1"/>
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="2" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261592" uname="bl460g1n6" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261592">
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <lrm_resource id="prmStonith6-1" type="external/ipmi" class="stonith">
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmStonith6-1_last_0" operation_key="prmStonith6-1_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="7:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;7:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="17" rc-code="7" op-status="0" interval="0" last-run="1382322020" last-rc-change="1382322020" exec-time="33067" queue-time="0" op-d
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </lrm_resource>
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:     info: services_os_action_execute: 	Managed VirtualDomain_meta-data_0 process 7977 exited with rc=0
Oct 21 11:20:53 [7685] bl460g1n6       crmd:   notice: process_lrm_event: 	LRM operation prmVM1_monitor_0 (call=5, rc=0, cib-update=61, confirmed=true) ok
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: update_history_cache: 	Updating history for 'prmVM1' with monitor op
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: create_operation_update: 	do_update_resource: Updating resource prmVM2 after monitor op complete (interval=0)
Oct 21 11:20:53 [7685] bl460g1n6       crmd:   notice: process_lrm_event: 	LRM operation prmVM2_monitor_0 (call=9, rc=0, cib-update=62, confirmed=true) ok
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: update_history_cache: 	Updating history for 'prmVM2' with monitor op
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: create_operation_update: 	do_update_resource: Updating resource prmVM3 after monitor op complete (interval=0)
Oct 21 11:20:53 [7685] bl460g1n6       crmd:   notice: process_lrm_event: 	LRM operation prmVM3_monitor_0 (call=13, rc=0, cib-update=63, confirmed=true) ok
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: update_history_cache: 	Updating history for 'prmVM3' with monitor op
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: create_operation_update: 	do_update_resource: Updating resource prmStonith6-2 after monitor op complete (interval=0)
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.2
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.3 403d33e0364bd6d8e417c90884958d86
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="2"/>
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="3" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261592" uname="bl460g1n6" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261592">
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <lrm_resource id="prmVM1" type="VirtualDomain" class="ocf" provider="heartbeat">
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmVM1_last_failure_0" operation_key="prmVM1_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="4:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:0;4:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="5" rc-code="0" op-status="0" interval="0" last-run="1382322020" last-rc-change="1382322020" exec-time="33075" queue-time="0" op-digest="
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </lrm_resource>
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:53 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/crmd/61, version=0.14.3)
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.3
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.4 4c2c85927c5be977e7ff8f1bee33ef22
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="3"/>
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="4" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261592" uname="bl460g1n6" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261592">
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <lrm_resource id="prmVM2" type="VirtualDomain" class="ocf" provider="heartbeat">
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmVM2_last_failure_0" operation_key="prmVM2_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="5:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:0;5:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="9" rc-code="0" op-status="0" interval="0" last-run="1382322020" last-rc-change="1382322020" exec-time="33073" queue-time="0" op-digest="
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </lrm_resource>
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:53 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/crmd/62, version=0.14.4)
Oct 21 11:20:53 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/crmd/63, version=0.14.5)
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.4
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.5 e416018b95651c0d5c359697d597f8bb
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="4"/>
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="5" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261592" uname="bl460g1n6" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261592">
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <lrm_resource id="prmVM3" type="VirtualDomain" class="ocf" provider="heartbeat">
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmVM3_last_failure_0" operation_key="prmVM3_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="6:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:0;6:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="13" rc-code="0" op-status="0" interval="0" last-run="1382322020" last-rc-change="1382322020" exec-time="33070" queue-time="1" op-digest=
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </lrm_resource>
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:     info: process_lrm_event: 	LRM operation prmStonith6-2_monitor_0 (call=21, rc=7, cib-update=64, confirmed=true) not running
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: update_history_cache: 	Updating history for 'prmStonith6-2' with monitor op
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.1 -> 0.14.2 (S_TRANSITION_ENGINE)
Oct 21 11:20:53 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmStonith6-1_monitor_0 (7) confirmed on bl460g1n6 (rc=0)
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.2 -> 0.14.3 (S_TRANSITION_ENGINE)
Oct 21 11:20:53 [7685] bl460g1n6       crmd:  warning: status_from_rc: 	Action 4 (prmVM1_monitor_0) on bl460g1n6 failed (target: 7 vs. rc: 0): Error
Oct 21 11:20:53 [7685] bl460g1n6       crmd:     info: abort_transition_graph: 	match_graph_event:369 - Triggered transition abort (complete=0, node=bl460g1n6, tag=lrm_rsc_op, id=prmVM1_last_failure_0, magic=0:0;4:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5, cib=0.14.3) : Event failed
Oct 21 11:20:53 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmVM1_monitor_0 (4) confirmed on bl460g1n6 (rc=4)
Oct 21 11:20:53 [7685] bl460g1n6       crmd:     info: process_graph_event: 	Detected action (1.4) prmVM1_monitor_0.5=ok: failed
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.3 -> 0.14.4 (S_TRANSITION_ENGINE)
Oct 21 11:20:53 [7685] bl460g1n6       crmd:  warning: status_from_rc: 	Action 5 (prmVM2_monitor_0) on bl460g1n6 failed (target: 7 vs. rc: 0): Error
Oct 21 11:20:53 [7685] bl460g1n6       crmd:     info: abort_transition_graph: 	match_graph_event:369 - Triggered transition abort (complete=0, node=bl460g1n6, tag=lrm_rsc_op, id=prmVM2_last_failure_0, magic=0:0;5:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5, cib=0.14.4) : Event failed
Oct 21 11:20:53 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmVM2_monitor_0 (5) confirmed on bl460g1n6 (rc=4)
Oct 21 11:20:53 [7685] bl460g1n6       crmd:     info: process_graph_event: 	Detected action (1.5) prmVM2_monitor_0.9=ok: failed
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.4 -> 0.14.5 (S_TRANSITION_ENGINE)
Oct 21 11:20:53 [7685] bl460g1n6       crmd:  warning: status_from_rc: 	Action 6 (prmVM3_monitor_0) on bl460g1n6 failed (target: 7 vs. rc: 0): Error
Oct 21 11:20:53 [7685] bl460g1n6       crmd:     info: abort_transition_graph: 	match_graph_event:369 - Triggered transition abort (complete=0, node=bl460g1n6, tag=lrm_rsc_op, id=prmVM3_last_failure_0, magic=0:0;6:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5, cib=0.14.5) : Event failed
Oct 21 11:20:53 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmVM3_monitor_0 (6) confirmed on bl460g1n6 (rc=4)
Oct 21 11:20:53 [7685] bl460g1n6       crmd:     info: process_graph_event: 	Detected action (1.6) prmVM3_monitor_0.13=ok: failed
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.5 -> 0.14.6 (S_TRANSITION_ENGINE)
Oct 21 11:20:53 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmStonith6-2_monitor_0 (8) confirmed on bl460g1n6 (rc=0)
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.5
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.6 c84ab35d9d5a27d85ae0ecc82c2fe910
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="5"/>
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="6" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: create_operation_update: 	do_update_resource: Updating resource prmStonith7-1 after monitor op complete (interval=0)
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261592" uname="bl460g1n6" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261592">
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <lrm_resource id="prmStonith6-2" type="external/ssh" class="stonith">
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmStonith6-2_last_0" operation_key="prmStonith6-2_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="8:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;8:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="21" rc-code="7" op-status="0" interval="0" last-run="1382322053" last-rc-change="1382322053" exec-time="0" queue-time="0" op-diges
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </lrm_resource>
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:20:53 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/crmd/64, version=0.14.6)
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:53 [7685] bl460g1n6       crmd:     info: process_lrm_event: 	LRM operation prmStonith7-1_monitor_0 (call=25, rc=7, cib-update=65, confirmed=true) not running
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: update_history_cache: 	Updating history for 'prmStonith7-1' with monitor op
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: create_operation_update: 	do_update_resource: Updating resource prmStonith7-2 after monitor op complete (interval=0)
Oct 21 11:20:53 [7685] bl460g1n6       crmd:     info: process_lrm_event: 	LRM operation prmStonith7-2_monitor_0 (call=29, rc=7, cib-update=66, confirmed=true) not running
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: update_history_cache: 	Updating history for 'prmStonith7-2' with monitor op
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: create_operation_update: 	do_update_resource: Updating resource prmStonith8-1 after monitor op complete (interval=0)
Oct 21 11:20:53 [7685] bl460g1n6       crmd:     info: process_lrm_event: 	LRM operation prmStonith8-1_monitor_0 (call=33, rc=7, cib-update=67, confirmed=true) not running
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: update_history_cache: 	Updating history for 'prmStonith8-1' with monitor op
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: create_operation_update: 	do_update_resource: Updating resource prmStonith8-2 after monitor op complete (interval=0)
Oct 21 11:20:53 [7685] bl460g1n6       crmd:     info: process_lrm_event: 	LRM operation prmStonith8-2_monitor_0 (call=37, rc=7, cib-update=68, confirmed=true) not running
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: update_history_cache: 	Updating history for 'prmStonith8-2' with monitor op
Oct 21 11:20:53 [7685] bl460g1n6       crmd:    debug: create_operation_update: 	do_update_resource: Updating resource prmPing after monitor op complete (interval=0)
Oct 21 11:20:53 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/crmd/65, version=0.14.7)
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.6
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.7 370cb933e9f9793fc4f2608bde6fc923
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="6"/>
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="7" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261592" uname="bl460g1n6" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261592">
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <lrm_resource id="prmStonith7-1" type="external/ipmi" class="stonith">
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmStonith7-1_last_0" operation_key="prmStonith7-1_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="9:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;9:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="25" rc-code="7" op-status="0" interval="0" last-run="1382322053" last-rc-change="1382322053" exec-time="0" queue-time="0" op-diges
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </lrm_resource>
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:20:53 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:54 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/crmd/66, version=0.14.8)
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.7
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.8 94501f81b3e88376d9b530c37bf46f50
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="7"/>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="8" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261592" uname="bl460g1n6" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261592">
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <lrm_resource id="prmStonith7-2" type="external/ssh" class="stonith">
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmStonith7-2_last_0" operation_key="prmStonith7-2_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="10:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;10:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="29" rc-code="7" op-status="0" interval="0" last-run="1382322053" last-rc-change="1382322053" exec-time="0" queue-time="0" op-dig
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </lrm_resource>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:54 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/crmd/67, version=0.14.9)
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.8
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.9 41d1a8bf9831d5840c8b842db17fe1c4
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="8"/>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="9" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261592" uname="bl460g1n6" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261592">
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <lrm_resource id="prmStonith8-1" type="external/ipmi" class="stonith">
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmStonith8-1_last_0" operation_key="prmStonith8-1_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="11:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;11:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="33" rc-code="7" op-status="0" interval="0" last-run="1382322053" last-rc-change="1382322053" exec-time="0" queue-time="0" op-dig
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </lrm_resource>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.9
Oct 21 11:20:54 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/crmd/68, version=0.14.10)
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.10 9773a8013f7bd6a2f9eb29ae48c1f801
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="9"/>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="10" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261592" uname="bl460g1n6" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261592">
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <lrm_resource id="prmStonith8-2" type="external/ssh" class="stonith">
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmStonith8-2_last_0" operation_key="prmStonith8-2_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="12:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;12:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="37" rc-code="7" op-status="0" interval="0" last-run="1382322053" last-rc-change="1382322053" exec-time="0" queue-time="0" op-dig
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </lrm_resource>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: services_os_action_execute: 	Managed ping_meta-data_0 process 7996 exited with rc=0
Oct 21 11:20:54 [7685] bl460g1n6       crmd:   notice: process_lrm_event: 	LRM operation prmPing_monitor_0 (call=42, rc=7, cib-update=69, confirmed=true) not running
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: update_history_cache: 	Updating history for 'prmPing' with monitor op
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.6 -> 0.14.7 (S_TRANSITION_ENGINE)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmStonith7-1_monitor_0 (9) confirmed on bl460g1n6 (rc=0)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.7 -> 0.14.8 (S_TRANSITION_ENGINE)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmStonith7-2_monitor_0 (10) confirmed on bl460g1n6 (rc=0)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.8 -> 0.14.9 (S_TRANSITION_ENGINE)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmStonith8-1_monitor_0 (11) confirmed on bl460g1n6 (rc=0)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.9 -> 0.14.10 (S_TRANSITION_ENGINE)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmStonith8-2_monitor_0 (12) confirmed on bl460g1n6 (rc=0)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: pcmk_cpg_membership: 	Left[2.0] crmd.3232261594 
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: crm_update_peer_proc: 	pcmk_cpg_membership: Node bl460g1n8[3232261594] - corosync-cpg is now offline
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: peer_update_callback: 	Client bl460g1n8/peer now has status [offline] (DC=true)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:  warning: match_down_event: 	No match for shutdown action on 3232261594
Oct 21 11:20:54 [7685] bl460g1n6       crmd:   notice: peer_update_callback: 	Stonith/shutdown of bl460g1n8 not matched
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	peer_update_callback: Node bl460g1n8[3232261594] - join-4 phase 4 -> 0
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: check_join_state: 	Invoked by peer_update_callback in state: S_TRANSITION_ENGINE
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: abort_transition_graph: 	peer_update_callback:217 - Triggered transition abort (complete=0) : Node failure
Oct 21 11:20:54 [7685] bl460g1n6       crmd:   notice: fail_incompletable_actions: 	Action 26 (26) was pending on 3232261594 (offline)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:   notice: fail_incompletable_actions: 	Action 27 (27) was pending on 3232261594 (offline)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:   notice: fail_incompletable_actions: 	Action 28 (28) was pending on 3232261594 (offline)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:   notice: fail_incompletable_actions: 	Action 29 (29) was pending on 3232261594 (offline)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:   notice: fail_incompletable_actions: 	Action 30 (30) was pending on 3232261594 (offline)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:   notice: fail_incompletable_actions: 	Action 51 (51) is scheduled for 3232261594 (offline)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:   notice: fail_incompletable_actions: 	Action 50 (50) is scheduled for 3232261594 (offline)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:   notice: fail_incompletable_actions: 	Action 31 (31) was pending on 3232261594 (offline)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:   notice: fail_incompletable_actions: 	Action 53 (53) is scheduled for 3232261594 (offline)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:   notice: fail_incompletable_actions: 	Action 52 (52) is scheduled for 3232261594 (offline)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:   notice: fail_incompletable_actions: 	Action 32 (32) was pending on 3232261594 (offline)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:   notice: fail_incompletable_actions: 	Action 33 (33) was pending on 3232261594 (offline)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:   notice: fail_incompletable_actions: 	Action 34 (34) was pending on 3232261594 (offline)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:   notice: fail_incompletable_actions: 	Action 71 (71) is scheduled for 3232261594 (offline)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:   notice: fail_incompletable_actions: 	Action 70 (70) is scheduled for 3232261594 (offline)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:   notice: fail_incompletable_actions: 	Action 35 (35) was pending on 3232261594 (offline)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:   notice: fail_incompletable_actions: 	Action 25 (25) is scheduled for 3232261594 (offline)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:  warning: fail_incompletable_actions: 	Node 3232261594 shutdown resulted in un-runnable actions
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: abort_transition_graph: 	fail_incompletable_actions:93 - Triggered transition abort (complete=0, node=, tag=rsc_op, id=25, magic=NA) : Node failure
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause   <rsc_op id="25" operation="probe_complete" operation_key="probe_complete" on_node="bl460g1n8" on_node_uuid="3232261594">
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause     <attributes CRM_meta_op_no_wait="true" crm_feature_set="3.0.7"/>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.10
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause   </rsc_op>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.11 3418ad294fa213e15827bc260f012c1e
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="10"/>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="11" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: pcmk_cpg_membership: 	Member[2.0] crmd.3232261592 
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: pcmk_cpg_membership: 	Member[2.1] crmd.3232261593 
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261592" uname="bl460g1n6" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261592">
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <lrm_resource id="prmPing" type="ping" class="ocf" provider="pacemaker">
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmPing_last_0" operation_key="prmPing_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="13:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;13:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="42" rc-code="7" op-status="0" interval="0" last-run="1382322053" last-rc-change="1382322053" exec-time="24" queue-time="0" op-digest="bc586d
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </lrm_resource>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: pcmk_cpg_membership: 	Left[3.0] crmd.3232261593 
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: crm_update_peer_proc: 	pcmk_cpg_membership: Node bl460g1n7[3232261593] - corosync-cpg is now offline
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: peer_update_callback: 	Client bl460g1n7/peer now has status [offline] (DC=true)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:  warning: match_down_event: 	No match for shutdown action on 3232261593
Oct 21 11:20:54 [7685] bl460g1n6       crmd:   notice: peer_update_callback: 	Stonith/shutdown of bl460g1n7 not matched
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	peer_update_callback: Node bl460g1n7[3232261593] - join-4 phase 4 -> 0
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: check_join_state: 	Invoked by peer_update_callback in state: S_TRANSITION_ENGINE
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: abort_transition_graph: 	peer_update_callback:217 - Triggered transition abort (complete=0) : Node failure
Oct 21 11:20:54 [7685] bl460g1n6       crmd:   notice: fail_incompletable_actions: 	Action 15 (15) was pending on 3232261593 (offline)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:   notice: fail_incompletable_actions: 	Action 16 (16) was pending on 3232261593 (offline)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:   notice: fail_incompletable_actions: 	Action 17 (17) was pending on 3232261593 (offline)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:   notice: fail_incompletable_actions: 	Action 43 (43) is scheduled for 3232261593 (offline)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:   notice: fail_incompletable_actions: 	Action 42 (42) is scheduled for 3232261593 (offline)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:   notice: fail_incompletable_actions: 	Action 18 (18) was pending on 3232261593 (offline)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:   notice: fail_incompletable_actions: 	Action 45 (45) is scheduled for 3232261593 (offline)
Oct 21 11:20:54 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/crmd/69, version=0.14.11)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:   notice: fail_incompletable_actions: 	Action 44 (44) is scheduled for 3232261593 (offline)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:   notice: fail_incompletable_actions: 	Action 19 (19) was pending on 3232261593 (offline)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:   notice: fail_incompletable_actions: 	Action 20 (20) was pending on 3232261593 (offline)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:   notice: fail_incompletable_actions: 	Action 21 (21) was pending on 3232261593 (offline)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:   notice: fail_incompletable_actions: 	Action 59 (59) is scheduled for 3232261593 (offline)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:   notice: fail_incompletable_actions: 	Action 58 (58) is scheduled for 3232261593 (offline)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:   notice: fail_incompletable_actions: 	Action 22 (22) was pending on 3232261593 (offline)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:   notice: fail_incompletable_actions: 	Action 61 (61) is scheduled for 3232261593 (offline)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:   notice: fail_incompletable_actions: 	Action 60 (60) is scheduled for 3232261593 (offline)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:   notice: fail_incompletable_actions: 	Action 23 (23) was pending on 3232261593 (offline)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:   notice: fail_incompletable_actions: 	Action 69 (69) is scheduled for 3232261593 (offline)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:   notice: fail_incompletable_actions: 	Action 68 (68) is scheduled for 3232261593 (offline)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:   notice: fail_incompletable_actions: 	Action 24 (24) was pending on 3232261593 (offline)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:   notice: fail_incompletable_actions: 	Action 14 (14) is scheduled for 3232261593 (offline)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:  warning: fail_incompletable_actions: 	Node 3232261593 shutdown resulted in un-runnable actions
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: abort_transition_graph: 	fail_incompletable_actions:93 - Triggered transition abort (complete=0, node=, tag=rsc_op, id=14, magic=NA) : Node failure
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause   <rsc_op id="14" operation="probe_complete" operation_key="probe_complete" on_node="bl460g1n7" on_node_uuid="3232261593">
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause     <attributes CRM_meta_op_no_wait="true" crm_feature_set="3.0.7"/>
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause   </rsc_op>
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: pcmk_cpg_membership: 	Member[3.0] crmd.3232261592 
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: pcmk_cpg_membership: 	Joined[4.0] crmd.3232261594 
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: pcmk_cpg_membership: 	Member[4.0] crmd.3232261592 
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: pcmk_cpg_membership: 	Member[4.1] crmd.3232261594 
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: crm_update_peer_proc: 	pcmk_cpg_membership: Node bl460g1n8[3232261594] - corosync-cpg is now online
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: peer_update_callback: 	Client bl460g1n8/peer now has status [online] (DC=true)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: pcmk_cpg_membership: 	Joined[5.0] crmd.3232261593 
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: pcmk_cpg_membership: 	Member[5.0] crmd.3232261592 
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: pcmk_cpg_membership: 	Member[5.1] crmd.3232261593 
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: crm_update_peer_proc: 	pcmk_cpg_membership: Node bl460g1n7[3232261593] - corosync-cpg is now online
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: peer_update_callback: 	Client bl460g1n7/peer now has status [online] (DC=true)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: pcmk_cpg_membership: 	Member[5.2] crmd.3232261594 
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_NODE_JOIN: [ state=S_TRANSITION_ENGINE cause=C_FSA_INTERNAL origin=peer_update_callback ]
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: do_state_transition: 	State transition S_TRANSITION_ENGINE -> S_INTEGRATION [ input=I_NODE_JOIN cause=C_FSA_INTERNAL origin=peer_update_callback ]
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: crm_timer_start: 	Started Integration Timer (I_INTEGRATED:180000ms), src=107
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: do_dc_join_offer_one: 	An unknown node joined - (re-)offer to any unconfirmed nodes
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: join_make_offer: 	join-4: Sending offer to bl460g1n7
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	join_make_offer: Node bl460g1n7[3232261593] - join-4 phase 0 -> 1
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: join_make_offer: 	join-4: Sending offer to bl460g1n8
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	join_make_offer: Node bl460g1n8[3232261594] - join-4 phase 0 -> 1
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: join_make_offer: 	Skipping bl460g1n6: already known 4
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: check_join_state: 	Invoked by do_dc_join_offer_one in state: S_INTEGRATION
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: do_te_invoke: 	Halting the transition: active
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: abort_transition_graph: 	do_te_invoke:158 - Triggered transition abort (complete=0) : Peer Halt
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: register_fsa_input_adv: 	Stalling the FSA pending further input: source=do_te_invoke cause=C_FSA_INTERNAL data=(nil) queue=1
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Exiting the FSA: queue=1, fsa_actions=0x1000000000000, stalled=true
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: fsa_dump_queue: 	queue[0.51]: input I_NODE_JOIN raised by peer_update_callback((nil).0)	(cause=C_FSA_INTERNAL)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.10 -> 0.14.11 (S_INTEGRATION)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmPing_monitor_0 (13) confirmed on bl460g1n6 (rc=0)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 3: probe_complete probe_complete on bl460g1n6 (local) - no waiting
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: attrd_update_delegate: 	Sent update: probe_complete=true for bl460g1n6
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: te_rsc_command: 	Action 3 confirmed - no wait
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 1 (Complete=13, Pending=0, Fired=1, Skipped=51, Incomplete=1, Source=/var/lib/pacemaker/pengine/pe-input-1.bz2): In-progress
Oct 21 11:20:54 [7683] bl460g1n6      attrd:     info: attrd_client_message: 	Broadcasting probe_complete[bl460g1n6] = true (writer)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:   notice: run_graph: 	Transition 1 (Complete=14, Pending=0, Fired=0, Skipped=51, Incomplete=1, Source=/var/lib/pacemaker/pengine/pe-input-1.bz2): Stopped
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: te_graph_trigger: 	Transition 1 is now complete
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: notify_crmd: 	Processing transition completion in state S_INTEGRATION
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: notify_crmd: 	Transition 1 status: restart - Non-status change
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_NODE_JOIN: [ state=S_INTEGRATION cause=C_FSA_INTERNAL origin=peer_update_callback ]
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: do_dc_join_offer_one: 	An unknown node joined - (re-)offer to any unconfirmed nodes
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: join_make_offer: 	Skipping bl460g1n7: already known 1
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: join_make_offer: 	Skipping bl460g1n8: already known 1
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: join_make_offer: 	Skipping bl460g1n6: already known 4
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: check_join_state: 	Invoked by do_dc_join_offer_one in state: S_INTEGRATION
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: do_te_invoke: 	Halting the transition: inactive
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: abort_transition_graph: 	do_te_invoke:158 - Triggered transition abort (complete=1) : Peer Halt
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: crm_timer_start: 	Started New Transition Timer (I_PE_CALC:2000ms), src=108
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.11
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.12 db72e1e759b28143eab830427d505dfa
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-  <cib num_updates="11">
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-    <status>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--     <node_state crmd="online" id="3232261594"/>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-    </status>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-  </cib>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="12" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++     <node_state id="3232261594" uname="bl460g1n8" in_ccm="true" crmd="offline" crm-debug-origin="peer_update_callback" join="member" expected="member"/>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:54 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/crmd/70, version=0.14.12)
Oct 21 11:20:54 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/crmd/71, version=0.14.13)
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.12
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.13 27b8c2cf9b3f13af917fec213d73b055
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-  <cib num_updates="12">
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-    <status>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--     <node_state crmd="online" id="3232261593"/>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-    </status>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-  </cib>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="13" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++     <node_state id="3232261593" uname="bl460g1n7" in_ccm="true" crmd="offline" crm-debug-origin="peer_update_callback" join="member" expected="member"/>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:54 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/crmd/72, version=0.14.14)
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.13
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.14 ad629840f8a1c1a5d12f7187c6d4e763
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-  <cib num_updates="13">
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-    <status>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--     <node_state crmd="offline" id="3232261594"/>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-    </status>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-  </cib>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="14" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++     <node_state id="3232261594" uname="bl460g1n8" in_ccm="true" crmd="online" crm-debug-origin="peer_update_callback" join="member" expected="member"/>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:54 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/crmd/73, version=0.14.15)
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.14
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.15 9e817daca05ff13f2c868ff292f7703a
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-  <cib num_updates="14">
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-    <status>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--     <node_state crmd="offline" id="3232261593"/>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-    </status>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-  </cib>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="15" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++     <node_state id="3232261593" uname="bl460g1n7" in_ccm="true" crmd="online" crm-debug-origin="peer_update_callback" join="member" expected="member"/>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:20:54 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_NODE_JOIN: [ state=S_INTEGRATION cause=C_HA_MESSAGE origin=route_message ]
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: do_dc_join_offer_one: 	join-4: Processing join_announce request from bl460g1n8 in state S_INTEGRATION
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	do_dc_join_offer_one: Node bl460g1n8[3232261594] - join-4 phase 1 -> 0
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: join_make_offer: 	join-4: Sending offer to bl460g1n8
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	join_make_offer: Node bl460g1n8[3232261594] - join-4 phase 0 -> 1
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	join_make_offer: Node bl460g1n6[3232261592] - join-4 phase 4 -> 0
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: join_make_offer: 	join-4: Sending offer to bl460g1n6
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	join_make_offer: Node bl460g1n6[3232261592] - join-4 phase 0 -> 1
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: abort_transition_graph: 	do_dc_join_offer_one:242 - Triggered transition abort (complete=1) : Node join
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: crm_timer_start: 	Started New Transition Timer (I_PE_CALC:2000ms), src=109
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: do_dc_join_offer_one: 	Waiting on 3 outstanding join acks for join-4
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: handle_request: 	Raising I_JOIN_OFFER: join-4
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_JOIN_OFFER: [ state=S_INTEGRATION cause=C_HA_MESSAGE origin=route_message ]
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: do_cl_join_offer_respond: 	do_cl_join_offer_respond added action A_DC_TIMER_STOP to the FSA
Oct 21 11:20:54 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crmd/74, version=0.14.15)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: join_query_callback: 	Respond to join offer join-4
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: join_query_callback: 	Acknowledging bl460g1n6 as our DC
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_JOIN_REQUEST: [ state=S_INTEGRATION cause=C_HA_MESSAGE origin=route_message ]
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	Processing req from bl460g1n6
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	bl460g1n6 has a better generation number than the current max bl460g1n6
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	Max generation   <generation_tuple epoch="8" num_updates="1" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:18 2013" update-origin="bl460g1n6" update-client="crmd" have-quorum="1" dc-uuid="3232261592"/>
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	Their generation   <generation_tuple epoch="14" num_updates="15" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592"/>
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	join-4: Welcoming node bl460g1n6 (ref join_request-crmd-1382322054-68)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	do_dc_join_filter_offer: Node bl460g1n6[3232261592] - join-4 phase 1 -> 2
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	1 nodes have been integrated into join-4
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: check_join_state: 	Invoked by do_dc_join_filter_offer in state: S_INTEGRATION
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	join-4: Still waiting on 2 outstanding offers
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: process_te_message: 	Processing (N)ACK lrm_invoke-lrmd-1382322054-2 from bl460g1n8
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: abort_transition_graph: 	process_graph_event:583 - Triggered transition abort (complete=1, node=bl460g1n8, tag=lrm_rsc_op, id=prmStonith7-1_last_failure_0, magic=4:99;31:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5) : Inactive graph
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: crm_timer_start: 	Started New Transition Timer (I_PE_CALC:2000ms), src=111
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: process_graph_event: 	Detected action (1.31) prmStonith7-1_monitor_0.0=unknown: arrived late
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: process_te_message: 	Processing (N)ACK lrm_invoke-lrmd-1382322054-3 from bl460g1n8
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: abort_transition_graph: 	process_graph_event:583 - Triggered transition abort (complete=1, node=bl460g1n8, tag=lrm_rsc_op, id=prmStonith7-2_last_failure_0, magic=4:99;32:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5) : Inactive graph
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: crm_timer_start: 	Started New Transition Timer (I_PE_CALC:2000ms), src=112
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: process_graph_event: 	Detected action (1.32) prmStonith7-2_monitor_0.0=unknown: arrived late
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: process_te_message: 	Processing (N)ACK lrm_invoke-lrmd-1382322054-4 from bl460g1n8
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: abort_transition_graph: 	process_graph_event:583 - Triggered transition abort (complete=1, node=bl460g1n8, tag=lrm_rsc_op, id=prmStonith8-1_last_failure_0, magic=4:99;33:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5) : Inactive graph
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: crm_timer_start: 	Started New Transition Timer (I_PE_CALC:2000ms), src=113
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: process_graph_event: 	Detected action (1.33) prmStonith8-1_monitor_0.0=unknown: arrived late
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_NODE_JOIN: [ state=S_INTEGRATION cause=C_HA_MESSAGE origin=route_message ]
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: do_dc_join_offer_one: 	join-4: Processing join_announce request from bl460g1n7 in state S_INTEGRATION
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	do_dc_join_offer_one: Node bl460g1n7[3232261593] - join-4 phase 1 -> 0
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: join_make_offer: 	join-4: Sending offer to bl460g1n7
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	join_make_offer: Node bl460g1n7[3232261593] - join-4 phase 0 -> 1
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	join_make_offer: Node bl460g1n6[3232261592] - join-4 phase 2 -> 0
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: join_make_offer: 	join-4: Sending offer to bl460g1n6
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	join_make_offer: Node bl460g1n6[3232261592] - join-4 phase 0 -> 1
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: abort_transition_graph: 	do_dc_join_offer_one:242 - Triggered transition abort (complete=1) : Node join
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: crm_timer_start: 	Started New Transition Timer (I_PE_CALC:2000ms), src=114
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: do_dc_join_offer_one: 	Waiting on 3 outstanding join acks for join-4
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: process_te_message: 	Processing (N)ACK lrm_invoke-lrmd-1382322054-5 from bl460g1n8
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: abort_transition_graph: 	process_graph_event:583 - Triggered transition abort (complete=1, node=bl460g1n8, tag=lrm_rsc_op, id=prmStonith8-2_last_failure_0, magic=4:99;34:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5) : Inactive graph
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: crm_timer_start: 	Started New Transition Timer (I_PE_CALC:2000ms), src=115
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: process_graph_event: 	Detected action (1.34) prmStonith8-2_monitor_0.0=unknown: arrived late
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: handle_request: 	Raising I_JOIN_OFFER: join-4
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_JOIN_OFFER: [ state=S_INTEGRATION cause=C_HA_MESSAGE origin=route_message ]
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: do_cl_join_offer_respond: 	do_cl_join_offer_respond added action A_DC_TIMER_STOP to the FSA
Oct 21 11:20:54 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crmd/75, version=0.14.15)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: join_query_callback: 	Respond to join offer join-4
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: join_query_callback: 	Acknowledging bl460g1n6 as our DC
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_JOIN_REQUEST: [ state=S_INTEGRATION cause=C_HA_MESSAGE origin=route_message ]
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	Processing req from bl460g1n6
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	bl460g1n6 has a better generation number than the current max bl460g1n6
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	Max generation   <generation_tuple epoch="14" num_updates="15" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592"/>
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	Their generation   <generation_tuple epoch="14" num_updates="15" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592"/>
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	join-4: Welcoming node bl460g1n6 (ref join_request-crmd-1382322054-71)
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	do_dc_join_filter_offer: Node bl460g1n6[3232261592] - join-4 phase 1 -> 2
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	1 nodes have been integrated into join-4
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: check_join_state: 	Invoked by do_dc_join_filter_offer in state: S_INTEGRATION
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	join-4: Still waiting on 2 outstanding offers
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: process_te_message: 	Processing (N)ACK lrm_invoke-lrmd-1382322054-6 from bl460g1n8
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: abort_transition_graph: 	process_graph_event:583 - Triggered transition abort (complete=1, node=bl460g1n8, tag=lrm_rsc_op, id=prmPing_last_failure_0, magic=4:99;35:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5) : Inactive graph
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: crm_timer_start: 	Started New Transition Timer (I_PE_CALC:2000ms), src=117
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: process_graph_event: 	Detected action (1.35) prmPing_monitor_0.0=unknown: arrived late
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: process_te_message: 	Processing (N)ACK lrm_invoke-lrmd-1382322054-2 from bl460g1n7
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: abort_transition_graph: 	process_graph_event:583 - Triggered transition abort (complete=1, node=bl460g1n7, tag=lrm_rsc_op, id=prmStonith7-1_last_failure_0, magic=4:99;20:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5) : Inactive graph
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: crm_timer_start: 	Started New Transition Timer (I_PE_CALC:2000ms), src=118
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: process_graph_event: 	Detected action (1.20) prmStonith7-1_monitor_0.0=unknown: arrived late
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: process_te_message: 	Processing (N)ACK lrm_invoke-lrmd-1382322054-3 from bl460g1n7
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: abort_transition_graph: 	process_graph_event:583 - Triggered transition abort (complete=1, node=bl460g1n7, tag=lrm_rsc_op, id=prmStonith7-2_last_failure_0, magic=4:99;21:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5) : Inactive graph
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: crm_timer_start: 	Started New Transition Timer (I_PE_CALC:2000ms), src=119
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: process_graph_event: 	Detected action (1.21) prmStonith7-2_monitor_0.0=unknown: arrived late
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: process_te_message: 	Processing (N)ACK lrm_invoke-lrmd-1382322054-4 from bl460g1n7
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: abort_transition_graph: 	process_graph_event:583 - Triggered transition abort (complete=1, node=bl460g1n7, tag=lrm_rsc_op, id=prmStonith8-1_last_failure_0, magic=4:99;22:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5) : Inactive graph
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: crm_timer_start: 	Started New Transition Timer (I_PE_CALC:2000ms), src=120
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: process_graph_event: 	Detected action (1.22) prmStonith8-1_monitor_0.0=unknown: arrived late
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: process_te_message: 	Processing (N)ACK lrm_invoke-lrmd-1382322054-5 from bl460g1n7
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: abort_transition_graph: 	process_graph_event:583 - Triggered transition abort (complete=1, node=bl460g1n7, tag=lrm_rsc_op, id=prmStonith8-2_last_failure_0, magic=4:99;23:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5) : Inactive graph
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: crm_timer_start: 	Started New Transition Timer (I_PE_CALC:2000ms), src=121
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: process_graph_event: 	Detected action (1.23) prmStonith8-2_monitor_0.0=unknown: arrived late
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: process_te_message: 	Processing (N)ACK lrm_invoke-lrmd-1382322054-6 from bl460g1n7
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: abort_transition_graph: 	process_graph_event:583 - Triggered transition abort (complete=1, node=bl460g1n7, tag=lrm_rsc_op, id=prmPing_last_failure_0, magic=4:99;24:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5) : Inactive graph
Oct 21 11:20:54 [7685] bl460g1n6       crmd:    debug: crm_timer_start: 	Started New Transition Timer (I_PE_CALC:2000ms), src=122
Oct 21 11:20:54 [7685] bl460g1n6       crmd:     info: process_graph_event: 	Detected action (1.24) prmPing_monitor_0.0=unknown: arrived late
Oct 21 11:20:55 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_JOIN_REQUEST: [ state=S_INTEGRATION cause=C_HA_MESSAGE origin=route_message ]
Oct 21 11:20:55 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	Processing req from bl460g1n8
Oct 21 11:20:55 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	join-4: Welcoming node bl460g1n8 (ref join_request-crmd-1382322055-7)
Oct 21 11:20:55 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	do_dc_join_filter_offer: Node bl460g1n8[3232261594] - join-4 phase 1 -> 2
Oct 21 11:20:55 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	2 nodes have been integrated into join-4
Oct 21 11:20:55 [7685] bl460g1n6       crmd:    debug: check_join_state: 	Invoked by do_dc_join_filter_offer in state: S_INTEGRATION
Oct 21 11:20:55 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	join-4: Still waiting on 1 outstanding offers
Oct 21 11:20:55 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_JOIN_REQUEST: [ state=S_INTEGRATION cause=C_HA_MESSAGE origin=route_message ]
Oct 21 11:20:55 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	Processing req from bl460g1n7
Oct 21 11:20:55 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	join-4: Welcoming node bl460g1n7 (ref join_request-crmd-1382322055-7)
Oct 21 11:20:55 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	do_dc_join_filter_offer: Node bl460g1n7[3232261593] - join-4 phase 1 -> 2
Oct 21 11:20:55 [7685] bl460g1n6       crmd:    debug: do_dc_join_filter_offer: 	3 nodes have been integrated into join-4
Oct 21 11:20:55 [7685] bl460g1n6       crmd:    debug: check_join_state: 	Invoked by do_dc_join_filter_offer in state: S_INTEGRATION
Oct 21 11:20:55 [7685] bl460g1n6       crmd:    debug: check_join_state: 	join-4: Integration of 3 peers complete: do_dc_join_filter_offer
Oct 21 11:20:55 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_INTEGRATED: [ state=S_INTEGRATION cause=C_FSA_INTERNAL origin=check_join_state ]
Oct 21 11:20:55 [7685] bl460g1n6       crmd:     info: do_state_transition: 	State transition S_INTEGRATION -> S_FINALIZE_JOIN [ input=I_INTEGRATED cause=C_FSA_INTERNAL origin=check_join_state ]
Oct 21 11:20:55 [7685] bl460g1n6       crmd:    debug: do_state_transition: 	All 3 cluster nodes responded to the join offer.
Oct 21 11:20:55 [7685] bl460g1n6       crmd:    debug: crm_timer_start: 	Started Finalization Timer (I_ELECTION:1800000ms), src=123
Oct 21 11:20:55 [7685] bl460g1n6       crmd:    debug: do_dc_join_finalize: 	Finializing join-4 for 3 clients
Oct 21 11:20:55 [7685] bl460g1n6       crmd:     info: crmd_join_phase_log: 	join-4: bl460g1n7=integrated
Oct 21 11:20:55 [7685] bl460g1n6       crmd:     info: crmd_join_phase_log: 	join-4: bl460g1n8=integrated
Oct 21 11:20:55 [7685] bl460g1n6       crmd:     info: crmd_join_phase_log: 	join-4: bl460g1n6=integrated
Oct 21 11:20:55 [7685] bl460g1n6       crmd:     info: do_dc_join_finalize: 	join-4: Syncing our CIB to the rest of the cluster
Oct 21 11:20:55 [7685] bl460g1n6       crmd:    debug: do_dc_join_finalize: 	Requested version   <generation_tuple epoch="14" num_updates="15" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592"/>
Oct 21 11:20:55 [7680] bl460g1n6        cib:    debug: sync_our_cib: 	Syncing CIB to all peers
Oct 21 11:20:56 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_sync operation for section 'all': OK (rc=0, origin=local/crmd/76, version=0.14.15)
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: check_join_state: 	Invoked by finalize_sync_callback in state: S_FINALIZE_JOIN
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: check_join_state: 	join-4: Still waiting on 3 integrated nodes
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: crmd_join_phase_log: 	join-4: bl460g1n7=integrated
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: crmd_join_phase_log: 	join-4: bl460g1n8=integrated
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: crmd_join_phase_log: 	join-4: bl460g1n6=integrated
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: finalize_sync_callback: 	Notifying 3 clients of join-4 results
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: finalize_join_for: 	join-4: ACK'ing join request from bl460g1n7
Oct 21 11:20:56 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	finalize_join_for: Node bl460g1n7[3232261593] - join-4 phase 2 -> 3
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: finalize_join_for: 	join-4: ACK'ing join request from bl460g1n8
Oct 21 11:20:56 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	finalize_join_for: Node bl460g1n8[3232261594] - join-4 phase 2 -> 3
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: finalize_join_for: 	join-4: ACK'ing join request from bl460g1n6
Oct 21 11:20:56 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	finalize_join_for: Node bl460g1n6[3232261592] - join-4 phase 2 -> 3
Oct 21 11:20:56 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section nodes: OK (rc=0, origin=local/crmd/77, version=0.14.15)
Oct 21 11:20:56 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section nodes: OK (rc=0, origin=local/crmd/78, version=0.14.15)
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: handle_request: 	Raising I_JOIN_RESULT: join-4
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_JOIN_RESULT: [ state=S_FINALIZE_JOIN cause=C_HA_MESSAGE origin=route_message ]
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: do_cl_join_finalize_respond: 	Confirming join join-4: join_ack_nack
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: create_operation_update: 	build_active_RAs: Updating resource prmPing after monitor op complete (interval=0)
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: create_operation_update: 	build_active_RAs: Updating resource prmStonith6-1 after monitor op complete (interval=0)
Oct 21 11:20:56 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section nodes: OK (rc=0, origin=local/crmd/79, version=0.14.15)
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: create_operation_update: 	build_active_RAs: Updating resource prmStonith7-2 after monitor op complete (interval=0)
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: create_operation_update: 	build_active_RAs: Updating resource prmVM1 after monitor op complete (interval=0)
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: create_operation_update: 	build_active_RAs: Updating resource prmVM2 after monitor op complete (interval=0)
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: create_operation_update: 	build_active_RAs: Updating resource prmVM3 after monitor op complete (interval=0)
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: create_operation_update: 	build_active_RAs: Updating resource prmStonith7-1 after monitor op complete (interval=0)
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: create_operation_update: 	build_active_RAs: Updating resource prmStonith6-2 after monitor op complete (interval=0)
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: create_operation_update: 	build_active_RAs: Updating resource prmStonith8-1 after monitor op complete (interval=0)
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: create_operation_update: 	build_active_RAs: Updating resource prmStonith8-2 after monitor op complete (interval=0)
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: do_cl_join_finalize_respond: 	join-4: Join complete.  Sending local LRM status to bl460g1n6
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: do_dc_join_ack: 	Ignoring op=join_ack_nack message from bl460g1n6
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_JOIN_RESULT: [ state=S_FINALIZE_JOIN cause=C_HA_MESSAGE origin=route_message ]
Oct 21 11:20:56 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	do_dc_join_ack: Node bl460g1n6[3232261592] - join-4 phase 3 -> 4
Oct 21 11:20:56 [7685] bl460g1n6       crmd:     info: do_dc_join_ack: 	join-4: Updating node state to member for bl460g1n6
Oct 21 11:20:56 [7685] bl460g1n6       crmd:     info: erase_status_tag: 	Deleting xpath: //node_state[@uname='bl460g1n6']/lrm
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: do_dc_join_ack: 	join-4: Registered callback for LRM update 81
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_JOIN_RESULT: [ state=S_FINALIZE_JOIN cause=C_HA_MESSAGE origin=route_message ]
Oct 21 11:20:56 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	do_dc_join_ack: Node bl460g1n7[3232261593] - join-4 phase 3 -> 4
Oct 21 11:20:56 [7685] bl460g1n6       crmd:     info: do_dc_join_ack: 	join-4: Updating node state to member for bl460g1n7
Oct 21 11:20:56 [7685] bl460g1n6       crmd:     info: erase_status_tag: 	Deleting xpath: //node_state[@uname='bl460g1n7']/lrm
Oct 21 11:20:56 [7680] bl460g1n6        cib:    debug: cib_process_xpath: 	Processing cib_delete op for //node_state[@uname='bl460g1n6']/lrm (/cib/status/node_state[2]/lrm)
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: do_dc_join_ack: 	join-4: Registered callback for LRM update 83
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_JOIN_RESULT: [ state=S_FINALIZE_JOIN cause=C_HA_MESSAGE origin=route_message ]
Oct 21 11:20:56 [7685] bl460g1n6       crmd:     info: crm_update_peer_join: 	do_dc_join_ack: Node bl460g1n8[3232261594] - join-4 phase 3 -> 4
Oct 21 11:20:56 [7685] bl460g1n6       crmd:     info: do_dc_join_ack: 	join-4: Updating node state to member for bl460g1n8
Oct 21 11:20:56 [7685] bl460g1n6       crmd:     info: erase_status_tag: 	Deleting xpath: //node_state[@uname='bl460g1n8']/lrm
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: do_dc_join_ack: 	join-4: Registered callback for LRM update 85
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.15
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.16 a4be001dc74929d834858c3bfc2a0d3f
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-  <cib num_updates="15">
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-    <status>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-      <node_state id="3232261592">
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--       <lrm id="3232261592">
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--         <lrm_resources>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--           <lrm_resource id="prmStonith6-1" type="external/ipmi" class="stonith">
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--             <lrm_rsc_op id="prmStonith6-1_last_0" operation_key="prmStonith6-1_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="7:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;7:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="17" rc-code="7" op-status="0" interval="0" last-run="1382322020" last-rc-change="1382322020" exec-time="33067" queue-time="0" op-d
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--           </lrm_resource>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--           <lrm_resource id="prmVM1" type="VirtualDomain" class="ocf" provider="heartbeat">
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--             <lrm_rsc_op id="prmVM1_last_failure_0" operation_key="prmVM1_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="4:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:0;4:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="5" rc-code="0" op-status="0" interval="0" last-run="1382322020" last-rc-change="1382322020" exec-time="33075" queue-time="0" op-digest="
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--           </lrm_resource>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--           <lrm_resource id="prmVM2" type="VirtualDomain" class="ocf" provider="heartbeat">
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--             <lrm_rsc_op id="prmVM2_last_failure_0" operation_key="prmVM2_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="5:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:0;5:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="9" rc-code="0" op-status="0" interval="0" last-run="1382322020" last-rc-change="1382322020" exec-time="33073" queue-time="0" op-digest="
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--           </lrm_resource>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--           <lrm_resource id="prmVM3" type="VirtualDomain" class="ocf" provider="heartbeat">
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--             <lrm_rsc_op id="prmVM3_last_failure_0" operation_key="prmVM3_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="6:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:0;6:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="13" rc-code="0" op-status="0" interval="0" last-run="1382322020" last-rc-change="1382322020" exec-time="33070" queue-time="1" op-digest=
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--           </lrm_resource>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--           <lrm_resource id="prmStonith6-2" type="external/ssh" class="stonith">
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--             <lrm_rsc_op id="prmStonith6-2_last_0" operation_key="prmStonith6-2_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="8:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;8:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="21" rc-code="7" op-status="0" interval="0" last-run="1382322053" last-rc-change="1382322053" exec-time="0" queue-time="0" op-diges
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--           </lrm_resource>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--           <lrm_resource id="prmStonith7-1" type="external/ipmi" class="stonith">
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--             <lrm_rsc_op id="prmStonith7-1_last_0" operation_key="prmStonith7-1_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="9:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;9:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="25" rc-code="7" op-status="0" interval="0" last-run="1382322053" last-rc-change="1382322053" exec-time="0" queue-time="0" op-diges
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--           </lrm_resource>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--           <lrm_resource id="prmStonith7-2" type="external/ssh" class="stonith">
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--             <lrm_rsc_op id="prmStonith7-2_last_0" operation_key="prmStonith7-2_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="10:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;10:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="29" rc-code="7" op-status="0" interval="0" last-run="1382322053" last-rc-change="1382322053" exec-time="0" queue-time="0" op-dig
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--           </lrm_resource>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--           <lrm_resource id="prmStonith8-1" type="external/ipmi" class="stonith">
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--             <lrm_rsc_op id="prmStonith8-1_last_0" operation_key="prmStonith8-1_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="11:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;11:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="33" rc-code="7" op-status="0" interval="0" last-run="1382322053" last-rc-change="1382322053" exec-time="0" queue-time="0" op-dig
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--           </lrm_resource>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--           <lrm_resource id="prmStonith8-2" type="external/ssh" class="stonith">
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--             <lrm_rsc_op id="prmStonith8-2_last_0" operation_key="prmStonith8-2_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="12:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;12:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="37" rc-code="7" op-status="0" interval="0" last-run="1382322053" last-rc-change="1382322053" exec-time="0" queue-time="0" op-dig
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--           </lrm_resource>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--           <lrm_resource id="prmPing" type="ping" class="ocf" provider="pacemaker">
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--             <lrm_rsc_op id="prmPing_last_0" operation_key="prmPing_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="13:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;13:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="42" rc-code="7" op-status="0" interval="0" last-run="1382322053" last-rc-change="1382322053" exec-time="24" queue-time="0" op-digest="bc586d
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--           </lrm_resource>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--         </lrm_resources>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--       </lrm>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-      </node_state>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-    </status>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-  </cib>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++ <cib epoch="14" num_updates="16" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592"/>
Oct 21 11:20:56 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_delete operation for section //node_state[@uname='bl460g1n6']/lrm: OK (rc=0, origin=local/crmd/80, version=0.14.16)
Oct 21 11:20:56 [7680] bl460g1n6        cib:    debug: cib_process_xpath: 	Processing cib_delete op for //node_state[@uname='bl460g1n7']/transient_attributes (/cib/status/node_state[3]/transient_attributes)
Oct 21 11:20:56 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_delete operation for section //node_state[@uname='bl460g1n7']/transient_attributes: OK (rc=0, origin=bl460g1n7/crmd/8, version=0.14.17)
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.16
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.17 8586cc7ec6eddecbb7e4116476291f26
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-  <cib num_updates="16">
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-    <status>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-      <node_state id="3232261593">
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--       <transient_attributes id="3232261593">
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--         <instance_attributes id="status-3232261593">
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--           <nvpair id="status-3232261593-probe_complete" name="probe_complete" value="true"/>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--         </instance_attributes>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--       </transient_attributes>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-      </node_state>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-    </status>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-  </cib>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++ <cib epoch="14" num_updates="17" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592"/>
Oct 21 11:20:56 [7680] bl460g1n6        cib:    debug: cib_process_xpath: 	Processing cib_delete op for //node_state[@uname='bl460g1n8']/transient_attributes (/cib/status/node_state[1]/transient_attributes)
Oct 21 11:20:56 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_delete operation for section //node_state[@uname='bl460g1n8']/transient_attributes: OK (rc=0, origin=bl460g1n8/crmd/8, version=0.14.18)
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.17
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.18 8c541b561897d0b25e717eed149d883b
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-  <cib num_updates="17">
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-    <status>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-      <node_state id="3232261594">
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--       <transient_attributes id="3232261594">
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--         <instance_attributes id="status-3232261594">
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--           <nvpair id="status-3232261594-probe_complete" name="probe_complete" value="true"/>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--         </instance_attributes>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--       </transient_attributes>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-      </node_state>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-    </status>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-  </cib>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++ <cib epoch="14" num_updates="18" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592"/>
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: erase_xpath_callback: 	Deletion of "//node_state[@uname='bl460g1n6']/lrm": OK (rc=0)
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.18
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.19 3a8d7d4d5f960171a3e39478c868e0d1
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="18"/>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="19" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261592" uname="bl460g1n6" in_ccm="true" crmd="online" crm-debug-origin="do_lrm_query_internal" join="member" expected="member">
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       <lrm id="3232261592">
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <lrm_resources>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <lrm_resource id="prmPing" type="ping" class="ocf" provider="pacemaker">
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmPing_last_0" operation_key="prmPing_monitor_0" operation="monitor" crm-debug-origin="build_active_RAs" crm_feature_set="3.0.7" transition-key="13:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;13:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="42" rc-code="7" op-status="0" interval="0" last-run="1382322053" last-rc-change="1382322053" exec-time="24" queue-time="0" op-digest="bc586d1a
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </lrm_resource>
Oct 21 11:20:56 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/crmd/81, version=0.14.19)
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <lrm_resource id="prmStonith6-1" type="external/ipmi" class="stonith">
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmStonith6-1_last_0" operation_key="prmStonith6-1_monitor_0" operation="monitor" crm-debug-origin="build_active_RAs" crm_feature_set="3.0.7" transition-key="7:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;7:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="17" rc-code="7" op-status="0" interval="0" last-run="1382322020" last-rc-change="1382322020" exec-time="33067" queue-time="0" op-dig
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </lrm_resource>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <lrm_resource id="prmStonith7-2" type="external/ssh" class="stonith">
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmStonith7-2_last_0" operation_key="prmStonith7-2_monitor_0" operation="monitor" crm-debug-origin="build_active_RAs" crm_feature_set="3.0.7" transition-key="10:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;10:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="29" rc-code="7" op-status="0" interval="0" last-run="1382322053" last-rc-change="1382322053" exec-time="0" queue-time="0" op-diges
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </lrm_resource>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <lrm_resource id="prmVM1" type="VirtualDomain" class="ocf" provider="heartbeat">
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmVM1_last_failure_0" operation_key="prmVM1_monitor_0" operation="monitor" crm-debug-origin="build_active_RAs" crm_feature_set="3.0.7" transition-key="4:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:0;4:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="5" rc-code="0" op-status="0" interval="0" last-run="1382322020" last-rc-change="1382322020" exec-time="33075" queue-time="0" op-digest="0d
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </lrm_resource>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <lrm_resource id="prmVM2" type="VirtualDomain" class="ocf" provider="heartbeat">
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmVM2_last_failure_0" operation_key="prmVM2_monitor_0" operation="monitor" crm-debug-origin="build_active_RAs" crm_feature_set="3.0.7" transition-key="5:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:0;5:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="9" rc-code="0" op-status="0" interval="0" last-run="1382322020" last-rc-change="1382322020" exec-time="33073" queue-time="0" op-digest="65
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </lrm_resource>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <lrm_resource id="prmVM3" type="VirtualDomain" class="ocf" provider="heartbeat">
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmVM3_last_failure_0" operation_key="prmVM3_monitor_0" operation="monitor" crm-debug-origin="build_active_RAs" crm_feature_set="3.0.7" transition-key="6:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:0;6:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="13" rc-code="0" op-status="0" interval="0" last-run="1382322020" last-rc-change="1382322020" exec-time="33070" queue-time="1" op-digest="b
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </lrm_resource>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <lrm_resource id="prmStonith7-1" type="external/ipmi" class="stonith">
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmStonith7-1_last_0" operation_key="prmStonith7-1_monitor_0" operation="monitor" crm-debug-origin="build_active_RAs" crm_feature_set="3.0.7" transition-key="9:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;9:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="25" rc-code="7" op-status="0" interval="0" last-run="1382322053" last-rc-change="1382322053" exec-time="0" queue-time="0" op-digest=
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </lrm_resource>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <lrm_resource id="prmStonith6-2" type="external/ssh" class="stonith">
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmStonith6-2_last_0" operation_key="prmStonith6-2_monitor_0" operation="monitor" crm-debug-origin="build_active_RAs" crm_feature_set="3.0.7" transition-key="8:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;8:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="21" rc-code="7" op-status="0" interval="0" last-run="1382322053" last-rc-change="1382322053" exec-time="0" queue-time="0" op-digest=
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </lrm_resource>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <lrm_resource id="prmStonith8-1" type="external/ipmi" class="stonith">
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmStonith8-1_last_0" operation_key="prmStonith8-1_monitor_0" operation="monitor" crm-debug-origin="build_active_RAs" crm_feature_set="3.0.7" transition-key="11:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;11:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="33" rc-code="7" op-status="0" interval="0" last-run="1382322053" last-rc-change="1382322053" exec-time="0" queue-time="0" op-diges
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </lrm_resource>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <lrm_resource id="prmStonith8-2" type="external/ssh" class="stonith">
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmStonith8-2_last_0" operation_key="prmStonith8-2_monitor_0" operation="monitor" crm-debug-origin="build_active_RAs" crm_feature_set="3.0.7" transition-key="12:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;12:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="37" rc-code="7" op-status="0" interval="0" last-run="1382322053" last-rc-change="1382322053" exec-time="0" queue-time="0" op-diges
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </lrm_resource>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         </lrm_resources>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       </lrm>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:56 [7680] bl460g1n6        cib:    debug: cib_process_xpath: 	Processing cib_delete op for //node_state[@uname='bl460g1n7']/lrm (/cib/status/node_state[3]/lrm)
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.19
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.20 119f29500402bf34a66ba4e55d770a06
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-  <cib num_updates="19">
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-    <status>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-      <node_state id="3232261593">
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--       <lrm id="3232261593">
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--         <lrm_resources/>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--       </lrm>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-      </node_state>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-    </status>
Oct 21 11:20:56 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_delete operation for section //node_state[@uname='bl460g1n7']/lrm: OK (rc=0, origin=local/crmd/82, version=0.14.20)
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-  </cib>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++ <cib epoch="14" num_updates="20" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592"/>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.20
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.21 525b6e996ff8be7075c5b08673a3d6e3
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="20"/>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="21" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261593" uname="bl460g1n7" in_ccm="true" crmd="online" crm-debug-origin="do_lrm_query_internal" join="member" expected="member">
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       <lrm id="3232261593">
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <lrm_resources/>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       </lrm>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:56 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/crmd/83, version=0.14.21)
Oct 21 11:20:56 [7680] bl460g1n6        cib:    debug: cib_process_xpath: 	Processing cib_delete op for //node_state[@uname='bl460g1n8']/lrm (/cib/status/node_state[1]/lrm)
Oct 21 11:20:56 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_delete operation for section //node_state[@uname='bl460g1n8']/lrm: OK (rc=0, origin=local/crmd/84, version=0.14.22)
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.21
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.22 72d59bd93cc77ca284171525727495cc
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-  <cib num_updates="21">
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-    <status>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-      <node_state id="3232261594">
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--       <lrm id="3232261594">
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--         <lrm_resources/>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--       </lrm>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-      </node_state>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-    </status>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-  </cib>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++ <cib epoch="14" num_updates="22" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592"/>
Oct 21 11:20:56 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/crmd/85, version=0.14.23)
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.22
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.23 bfe6964bdf044757335e50b213a3ddb1
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="22"/>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="23" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261594" uname="bl460g1n8" in_ccm="true" crmd="online" crm-debug-origin="do_lrm_query_internal" join="member" expected="member">
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       <lrm id="3232261594">
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <lrm_resources/>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       </lrm>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: join_update_complete_callback: 	Join update 81 complete
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: check_join_state: 	Invoked by join_update_complete_callback in state: S_FINALIZE_JOIN
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: check_join_state: 	join-4 complete: join_update_complete_callback
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: erase_xpath_callback: 	Deletion of "//node_state[@uname='bl460g1n7']/lrm": OK (rc=0)
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: join_update_complete_callback: 	Join update 83 complete
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: check_join_state: 	Invoked by join_update_complete_callback in state: S_FINALIZE_JOIN
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: check_join_state: 	join-4 complete: join_update_complete_callback
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: erase_xpath_callback: 	Deletion of "//node_state[@uname='bl460g1n8']/lrm": OK (rc=0)
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_FINALIZED: [ state=S_FINALIZE_JOIN cause=C_FSA_INTERNAL origin=check_join_state ]
Oct 21 11:20:56 [7685] bl460g1n6       crmd:     info: do_state_transition: 	State transition S_FINALIZE_JOIN -> S_POLICY_ENGINE [ input=I_FINALIZED cause=C_FSA_INTERNAL origin=check_join_state ]
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: do_state_transition: 	All 3 cluster nodes are eligible to run resources.
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: do_dc_join_final: 	Ensuring DC, quorum and node attributes are up-to-date
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: attrd_update_delegate: 	Sent update: (null)=(null) for localhost
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: crm_update_quorum: 	Updating quorum status to true (call=88)
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: do_te_invoke: 	Cancelling the transition: inactive
Oct 21 11:20:56 [7685] bl460g1n6       crmd:     info: abort_transition_graph: 	do_te_invoke:151 - Triggered transition abort (complete=1) : Peer Cancelled
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: crm_timer_start: 	Started New Transition Timer (I_PE_CALC:2000ms), src=134
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_FINALIZED: [ state=S_POLICY_ENGINE cause=C_FSA_INTERNAL origin=check_join_state ]
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: join_update_complete_callback: 	Join update 85 complete
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: check_join_state: 	Invoked by join_update_complete_callback in state: S_POLICY_ENGINE
Oct 21 11:20:56 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section nodes: OK (rc=0, origin=local/crmd/86, version=0.14.23)
Oct 21 11:20:56 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/crmd/87, version=0.14.24)
Oct 21 11:20:56 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.23 -> 0.14.24 (S_POLICY_ENGINE)
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.23
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.24 2ec1dea58af540a06168060270cf361a
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-  <cib num_updates="23">
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-    <status>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--     <node_state crm-debug-origin="do_lrm_query_internal" id="3232261592"/>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-    </status>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-  </cib>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="24" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++     <node_state id="3232261592" uname="bl460g1n6" in_ccm="true" crmd="online" crm-debug-origin="do_state_transition" join="member" expected="member"/>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:20:56 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:56 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section cib: OK (rc=0, origin=local/crmd/88, version=0.14.24)
Oct 21 11:20:58 [7685] bl460g1n6       crmd:     info: crm_timer_popped: 	New Transition Timer (I_PE_CALC) just popped (2000ms)
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_PE_CALC: [ state=S_POLICY_ENGINE cause=C_TIMER_POPPED origin=crm_timer_popped ]
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: do_pe_invoke: 	Query 89: Requesting the current CIB: S_POLICY_ENGINE
Oct 21 11:20:58 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crmd/89, version=0.14.24)
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: do_pe_invoke_callback: 	Invoking the PE: query=89, ref=pe_calc-dc-1382322058-76, seq=16, quorate=1
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: unpack_config: 	STONITH timeout: 60000
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: unpack_config: 	STONITH of failed nodes is enabled
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: unpack_config: 	Stop all active resources: false
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: unpack_config: 	Default stickiness: 0
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: unpack_config: 	On loss of CCM Quorum: Freeze resources
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: unpack_domains: 	Unpacking domains
Oct 21 11:20:58 [7684] bl460g1n6    pengine:     info: determine_online_status_fencing: 	Node bl460g1n8 is active
Oct 21 11:20:58 [7684] bl460g1n6    pengine:     info: determine_online_status: 	Node bl460g1n8 is online
Oct 21 11:20:58 [7684] bl460g1n6    pengine:     info: determine_online_status_fencing: 	Node bl460g1n6 is active
Oct 21 11:20:58 [7684] bl460g1n6    pengine:     info: determine_online_status: 	Node bl460g1n6 is online
Oct 21 11:20:58 [7684] bl460g1n6    pengine:     info: determine_online_status_fencing: 	Node bl460g1n7 is active
Oct 21 11:20:58 [7684] bl460g1n6    pengine:     info: determine_online_status: 	Node bl460g1n7 is online
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n6 to prmPing:0
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: determine_op_status: 	prmVM1_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:20:58 [7684] bl460g1n6    pengine:     info: determine_op_status: 	Operation monitor found resource prmVM1 active on bl460g1n6
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: determine_op_status: 	prmVM2_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:20:58 [7684] bl460g1n6    pengine:     info: determine_op_status: 	Operation monitor found resource prmVM2 active on bl460g1n6
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: determine_op_status: 	prmVM3_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:20:58 [7684] bl460g1n6    pengine:     info: determine_op_status: 	Operation monitor found resource prmVM3 active on bl460g1n6
Oct 21 11:20:58 [7684] bl460g1n6    pengine:     info: native_print: 	prmVM1	(ocf::heartbeat:VirtualDomain):	Started bl460g1n6 
Oct 21 11:20:58 [7684] bl460g1n6    pengine:     info: native_print: 	prmVM2	(ocf::heartbeat:VirtualDomain):	Started bl460g1n6 
Oct 21 11:20:58 [7684] bl460g1n6    pengine:     info: native_print: 	prmVM3	(ocf::heartbeat:VirtualDomain):	Started bl460g1n6 
Oct 21 11:20:58 [7684] bl460g1n6    pengine:     info: group_print: 	 Resource Group: grpStonith6
Oct 21 11:20:58 [7684] bl460g1n6    pengine:     info: native_print: 	     prmStonith6-1	(stonith:external/ipmi):	Stopped 
Oct 21 11:20:58 [7684] bl460g1n6    pengine:     info: native_print: 	     prmStonith6-2	(stonith:external/ssh):	Stopped 
Oct 21 11:20:58 [7684] bl460g1n6    pengine:     info: group_print: 	 Resource Group: grpStonith7
Oct 21 11:20:58 [7684] bl460g1n6    pengine:     info: native_print: 	     prmStonith7-1	(stonith:external/ipmi):	Stopped 
Oct 21 11:20:58 [7684] bl460g1n6    pengine:     info: native_print: 	     prmStonith7-2	(stonith:external/ssh):	Stopped 
Oct 21 11:20:58 [7684] bl460g1n6    pengine:     info: group_print: 	 Resource Group: grpStonith8
Oct 21 11:20:58 [7684] bl460g1n6    pengine:     info: native_print: 	     prmStonith8-1	(stonith:external/ipmi):	Stopped 
Oct 21 11:20:58 [7684] bl460g1n6    pengine:     info: native_print: 	     prmStonith8-2	(stonith:external/ssh):	Stopped 
Oct 21 11:20:58 [7684] bl460g1n6    pengine:     info: clone_print: 	 Clone Set: clnPing [prmPing]
Oct 21 11:20:58 [7684] bl460g1n6    pengine:     info: short_print: 	     Stopped: [ bl460g1n6 bl460g1n7 bl460g1n8 ]
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: group_rsc_location: 	Processing rsc_location lo8-rule for grpStonith8
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: group_rsc_location: 	Processing rsc_location lo7-rule for grpStonith7
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: group_rsc_location: 	Processing rsc_location lo6-rule for grpStonith6
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: common_apply_stickiness: 	Resource prmVM1: preferring current location (node=bl460g1n6, weight=1000000)
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: common_apply_stickiness: 	Resource prmVM2: preferring current location (node=bl460g1n6, weight=1000000)
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: common_apply_stickiness: 	Resource prmVM3: preferring current location (node=bl460g1n6, weight=1000000)
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: native_assign_node: 	Assigning bl460g1n6 to prmPing:0
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: native_assign_node: 	Assigning bl460g1n7 to prmPing:1
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: native_assign_node: 	Assigning bl460g1n8 to prmPing:2
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: clone_color: 	Allocated 3 clnPing instances of a possible 3
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: native_assign_node: 	Assigning bl460g1n6 to prmVM1
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: native_assign_node: 	Assigning bl460g1n6 to prmVM2
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: native_assign_node: 	Assigning bl460g1n6 to prmVM3
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: native_assign_node: 	Assigning bl460g1n7 to prmStonith6-1
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: native_assign_node: 	Assigning bl460g1n7 to prmStonith6-2
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: native_assign_node: 	Assigning bl460g1n8 to prmStonith7-1
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: native_assign_node: 	Assigning bl460g1n8 to prmStonith7-2
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: native_assign_node: 	Assigning bl460g1n7 to prmStonith8-1
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: native_assign_node: 	Assigning bl460g1n7 to prmStonith8-2
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmVM1 on bl460g1n7 (Started)
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmVM2 on bl460g1n7 (Started)
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmVM3 on bl460g1n7 (Started)
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmStonith6-1 on bl460g1n7 (Stopped)
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmStonith6-2 on bl460g1n7 (Stopped)
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmStonith7-1 on bl460g1n7 (Stopped)
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmStonith7-2 on bl460g1n7 (Stopped)
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmStonith8-1 on bl460g1n7 (Stopped)
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmStonith8-2 on bl460g1n7 (Stopped)
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmPing:1 on bl460g1n7 (Stopped)
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmVM1 on bl460g1n8 (Started)
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmVM2 on bl460g1n8 (Started)
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmVM3 on bl460g1n8 (Started)
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmStonith6-1 on bl460g1n8 (Stopped)
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmStonith6-2 on bl460g1n8 (Stopped)
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmStonith7-1 on bl460g1n8 (Stopped)
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmStonith7-2 on bl460g1n8 (Stopped)
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmStonith8-1 on bl460g1n8 (Stopped)
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmStonith8-2 on bl460g1n8 (Stopped)
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: native_create_probe: 	Probing prmPing:2 on bl460g1n8 (Stopped)
Oct 21 11:20:58 [7684] bl460g1n6    pengine:     info: RecurringOp: 	 Start recurring monitor (10s) for prmVM1 on bl460g1n6
Oct 21 11:20:58 [7684] bl460g1n6    pengine:     info: RecurringOp: 	 Start recurring monitor (10s) for prmVM2 on bl460g1n6
Oct 21 11:20:58 [7684] bl460g1n6    pengine:     info: RecurringOp: 	 Start recurring monitor (10s) for prmVM3 on bl460g1n6
Oct 21 11:20:58 [7684] bl460g1n6    pengine:     info: RecurringOp: 	 Start recurring monitor (360s) for prmStonith6-1 on bl460g1n7
Oct 21 11:20:58 [7684] bl460g1n6    pengine:     info: RecurringOp: 	 Start recurring monitor (10s) for prmStonith6-2 on bl460g1n7
Oct 21 11:20:58 [7684] bl460g1n6    pengine:     info: RecurringOp: 	 Start recurring monitor (360s) for prmStonith7-1 on bl460g1n8
Oct 21 11:20:58 [7684] bl460g1n6    pengine:     info: RecurringOp: 	 Start recurring monitor (10s) for prmStonith7-2 on bl460g1n8
Oct 21 11:20:58 [7684] bl460g1n6    pengine:     info: RecurringOp: 	 Start recurring monitor (360s) for prmStonith8-1 on bl460g1n7
Oct 21 11:20:58 [7684] bl460g1n6    pengine:     info: RecurringOp: 	 Start recurring monitor (10s) for prmStonith8-2 on bl460g1n7
Oct 21 11:20:58 [7684] bl460g1n6    pengine:     info: RecurringOp: 	 Start recurring monitor (10s) for prmPing:0 on bl460g1n6
Oct 21 11:20:58 [7684] bl460g1n6    pengine:     info: RecurringOp: 	 Start recurring monitor (10s) for prmPing:1 on bl460g1n7
Oct 21 11:20:58 [7684] bl460g1n6    pengine:     info: RecurringOp: 	 Start recurring monitor (10s) for prmPing:2 on bl460g1n8
Oct 21 11:20:58 [7684] bl460g1n6    pengine:     info: LogActions: 	Leave   prmVM1	(Started bl460g1n6)
Oct 21 11:20:58 [7684] bl460g1n6    pengine:     info: LogActions: 	Leave   prmVM2	(Started bl460g1n6)
Oct 21 11:20:58 [7684] bl460g1n6    pengine:     info: LogActions: 	Leave   prmVM3	(Started bl460g1n6)
Oct 21 11:20:58 [7684] bl460g1n6    pengine:   notice: LogActions: 	Start   prmStonith6-1	(bl460g1n7)
Oct 21 11:20:58 [7684] bl460g1n6    pengine:   notice: LogActions: 	Start   prmStonith6-2	(bl460g1n7)
Oct 21 11:20:58 [7684] bl460g1n6    pengine:   notice: LogActions: 	Start   prmStonith7-1	(bl460g1n8)
Oct 21 11:20:58 [7684] bl460g1n6    pengine:   notice: LogActions: 	Start   prmStonith7-2	(bl460g1n8)
Oct 21 11:20:58 [7684] bl460g1n6    pengine:   notice: LogActions: 	Start   prmStonith8-1	(bl460g1n7)
Oct 21 11:20:58 [7684] bl460g1n6    pengine:   notice: LogActions: 	Start   prmStonith8-2	(bl460g1n7)
Oct 21 11:20:58 [7684] bl460g1n6    pengine:   notice: LogActions: 	Start   prmPing:0	(bl460g1n6)
Oct 21 11:20:58 [7684] bl460g1n6    pengine:   notice: LogActions: 	Start   prmPing:1	(bl460g1n7)
Oct 21 11:20:58 [7684] bl460g1n6    pengine:   notice: LogActions: 	Start   prmPing:2	(bl460g1n8)
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: action2xml: 	Using anonymous clone name prmPing for prmPing:0 (aka. prmPing)
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: action2xml: 	Using anonymous clone name prmPing for prmPing:0 (aka. prmPing)
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: action2xml: 	Using anonymous clone name prmPing for prmPing:1 (aka. (null))
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: action2xml: 	Using anonymous clone name prmPing for prmPing:1 (aka. (null))
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: action2xml: 	Using anonymous clone name prmPing for prmPing:1 (aka. (null))
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: action2xml: 	Using anonymous clone name prmPing for prmPing:2 (aka. (null))
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: action2xml: 	Using anonymous clone name prmPing for prmPing:2 (aka. (null))
Oct 21 11:20:58 [7684] bl460g1n6    pengine:    debug: action2xml: 	Using anonymous clone name prmPing for prmPing:2 (aka. (null))
Oct 21 11:20:58 [7684] bl460g1n6    pengine:   notice: process_pe_message: 	Calculated Transition 2: /var/lib/pacemaker/pengine/pe-input-2.bz2
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_PE_SUCCESS: [ state=S_POLICY_ENGINE cause=C_IPC_MESSAGE origin=handle_response ]
Oct 21 11:20:58 [7685] bl460g1n6       crmd:     info: do_state_transition: 	State transition S_POLICY_ENGINE -> S_TRANSITION_ENGINE [ input=I_PE_SUCCESS cause=C_IPC_MESSAGE origin=handle_response ]
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: unpack_graph: 	Unpacked transition 2: 52 actions in 52 synapses
Oct 21 11:20:58 [7685] bl460g1n6       crmd:     info: do_te_invoke: 	Processing graph 2 (ref=pe_calc-dc-1382322058-76) derived from /var/lib/pacemaker/pengine/pe-input-2.bz2
Oct 21 11:20:58 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 28: monitor prmVM1_monitor_10000 on bl460g1n6 (local)
Oct 21 11:20:58 [7685] bl460g1n6       crmd:     info: do_lrm_rsc_op: 	Performing key=28:2:0:38db9a68-056c-4e65-8658-75f0c3cc91e5 op=prmVM1_monitor_10000
Oct 21 11:20:58 [7682] bl460g1n6       lrmd:    debug: process_lrmd_message: 	Processed lrmd_rsc_exec operation from 666c66a8-da92-4298-beea-12fd671d2b0d: rc=43, reply=1, notify=0, exit=4201864
Oct 21 11:20:58 [7682] bl460g1n6       lrmd:    debug: log_execute: 	executing - rsc:prmVM1 action:monitor call_id:43
Oct 21 11:20:58 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 16: monitor prmVM1_monitor_0 on bl460g1n8
Oct 21 11:20:58 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 5: monitor prmVM1_monitor_0 on bl460g1n7
Oct 21 11:20:58 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 31: monitor prmVM2_monitor_10000 on bl460g1n6 (local)
Oct 21 11:20:58 [7685] bl460g1n6       crmd:     info: do_lrm_rsc_op: 	Performing key=31:2:0:38db9a68-056c-4e65-8658-75f0c3cc91e5 op=prmVM2_monitor_10000
Oct 21 11:20:58 [7682] bl460g1n6       lrmd:    debug: process_lrmd_message: 	Processed lrmd_rsc_exec operation from 666c66a8-da92-4298-beea-12fd671d2b0d: rc=44, reply=1, notify=0, exit=4201864
Oct 21 11:20:58 [7682] bl460g1n6       lrmd:    debug: log_execute: 	executing - rsc:prmVM2 action:monitor call_id:44
Oct 21 11:20:58 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 17: monitor prmVM2_monitor_0 on bl460g1n8
Oct 21 11:20:58 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 6: monitor prmVM2_monitor_0 on bl460g1n7
Oct 21 11:20:58 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 34: monitor prmVM3_monitor_10000 on bl460g1n6 (local)
Oct 21 11:20:58 [7685] bl460g1n6       crmd:     info: do_lrm_rsc_op: 	Performing key=34:2:0:38db9a68-056c-4e65-8658-75f0c3cc91e5 op=prmVM3_monitor_10000
Oct 21 11:20:58 [7682] bl460g1n6       lrmd:    debug: process_lrmd_message: 	Processed lrmd_rsc_exec operation from 666c66a8-da92-4298-beea-12fd671d2b0d: rc=45, reply=1, notify=0, exit=4201864
Oct 21 11:20:58 [7682] bl460g1n6       lrmd:    debug: log_execute: 	executing - rsc:prmVM3 action:monitor call_id:45
Oct 21 11:20:58 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 18: monitor prmVM3_monitor_0 on bl460g1n8
Oct 21 11:20:58 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 7: monitor prmVM3_monitor_0 on bl460g1n7
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: te_pseudo_action: 	Pseudo action 39 fired and confirmed
Oct 21 11:20:58 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 19: monitor prmStonith6-1_monitor_0 on bl460g1n8
Oct 21 11:20:58 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 8: monitor prmStonith6-1_monitor_0 on bl460g1n7
Oct 21 11:20:58 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 20: monitor prmStonith6-2_monitor_0 on bl460g1n8
Oct 21 11:20:58 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 9: monitor prmStonith6-2_monitor_0 on bl460g1n7
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: te_pseudo_action: 	Pseudo action 47 fired and confirmed
Oct 21 11:20:58 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 21: monitor prmStonith7-1_monitor_0 on bl460g1n8
Oct 21 11:20:58 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 10: monitor prmStonith7-1_monitor_0 on bl460g1n7
Oct 21 11:20:58 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 22: monitor prmStonith7-2_monitor_0 on bl460g1n8
Oct 21 11:20:58 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 11: monitor prmStonith7-2_monitor_0 on bl460g1n7
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: te_pseudo_action: 	Pseudo action 55 fired and confirmed
Oct 21 11:20:58 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 23: monitor prmStonith8-1_monitor_0 on bl460g1n8
Oct 21 11:20:58 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 12: monitor prmStonith8-1_monitor_0 on bl460g1n7
Oct 21 11:20:58 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 24: monitor prmStonith8-2_monitor_0 on bl460g1n8
Oct 21 11:20:58 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 13: monitor prmStonith8-2_monitor_0 on bl460g1n7
Oct 21 11:20:58 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 14: monitor prmPing:1_monitor_0 on bl460g1n7
Oct 21 11:20:58 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 25: monitor prmPing:2_monitor_0 on bl460g1n8
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: te_pseudo_action: 	Pseudo action 65 fired and confirmed
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 2 (Complete=0, Pending=23, Fired=27, Skipped=0, Incomplete=25, Source=/var/lib/pacemaker/pengine/pe-input-2.bz2): In-progress
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 2 (Complete=4, Pending=23, Fired=0, Skipped=0, Incomplete=25, Source=/var/lib/pacemaker/pengine/pe-input-2.bz2): In-progress
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.24 -> 0.14.25 (S_TRANSITION_ENGINE)
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.24
Oct 21 11:20:58 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmStonith6-1_monitor_0 (8) confirmed on bl460g1n7 (rc=0)
Oct 21 11:20:58 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=bl460g1n7/crmd/9, version=0.14.25)
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.25 1a8b27f7d9027919e830f19958d35997
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 2 (Complete=5, Pending=22, Fired=0, Skipped=0, Incomplete=25, Source=/var/lib/pacemaker/pengine/pe-input-2.bz2): In-progress
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="24"/>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="25" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261593" uname="bl460g1n7" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261593">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <lrm_resource id="prmStonith6-1" type="external/ipmi" class="stonith">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmStonith6-1_last_0" operation_key="prmStonith6-1_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="8:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;8:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="42" rc-code="7" op-status="0" interval="0" last-run="1382322058" last-rc-change="1382322058" exec-time="0" queue-time="0" op-diges
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </lrm_resource>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.25 -> 0.14.26 (S_TRANSITION_ENGINE)
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.25
Oct 21 11:20:58 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=bl460g1n7/crmd/10, version=0.14.26)
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.26 b0d4961cc1588f4811a1f780034231fb
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="25"/>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="26" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261593" uname="bl460g1n7" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261593">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <lrm_resource id="prmStonith6-2" type="external/ssh" class="stonith">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmStonith6-2_last_0" operation_key="prmStonith6-2_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="9:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;9:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="46" rc-code="7" op-status="0" interval="0" last-run="1382322058" last-rc-change="1382322058" exec-time="0" queue-time="0" op-diges
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </lrm_resource>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:58 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmStonith6-2_monitor_0 (9) confirmed on bl460g1n7 (rc=0)
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 2 (Complete=6, Pending=21, Fired=0, Skipped=0, Incomplete=25, Source=/var/lib/pacemaker/pengine/pe-input-2.bz2): In-progress
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.26
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.27 7c18d30ec555d60cea96607380fba014
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.26 -> 0.14.27 (S_TRANSITION_ENGINE)
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="26"/>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="27" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261593" uname="bl460g1n7" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261593">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <lrm_resource id="prmStonith7-1" type="external/ipmi" class="stonith">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmStonith7-1_last_0" operation_key="prmStonith7-1_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="10:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;10:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="47" rc-code="7" op-status="0" interval="0" last-run="1382322058" last-rc-change="1382322058" exec-time="0" queue-time="0" op-dig
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </lrm_resource>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:58 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmStonith7-1_monitor_0 (10) confirmed on bl460g1n7 (rc=0)
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 2 (Complete=7, Pending=20, Fired=0, Skipped=0, Incomplete=25, Source=/var/lib/pacemaker/pengine/pe-input-2.bz2): In-progress
Oct 21 11:20:58 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=bl460g1n7/crmd/11, version=0.14.27)
Oct 21 11:20:58 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=bl460g1n7/crmd/12, version=0.14.28)
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.27 -> 0.14.28 (S_TRANSITION_ENGINE)
Oct 21 11:20:58 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmStonith7-2_monitor_0 (11) confirmed on bl460g1n7 (rc=0)
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 2 (Complete=8, Pending=19, Fired=0, Skipped=0, Incomplete=25, Source=/var/lib/pacemaker/pengine/pe-input-2.bz2): In-progress
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.27
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.28 598a3df2ba7b318372da2ad65da58fa9
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="27"/>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="28" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261593" uname="bl460g1n7" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261593">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <lrm_resource id="prmStonith7-2" type="external/ssh" class="stonith">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmStonith7-2_last_0" operation_key="prmStonith7-2_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="11:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;11:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="48" rc-code="7" op-status="0" interval="0" last-run="1382322058" last-rc-change="1382322058" exec-time="0" queue-time="0" op-dig
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </lrm_resource>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.28 -> 0.14.29 (S_TRANSITION_ENGINE)
Oct 21 11:20:58 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmStonith6-1_monitor_0 (19) confirmed on bl460g1n8 (rc=0)
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 2 (Complete=9, Pending=18, Fired=0, Skipped=0, Incomplete=25, Source=/var/lib/pacemaker/pengine/pe-input-2.bz2): In-progress
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.28
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.29 4a0b906f4cda334b2b3108ea5c5bd297
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="28"/>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="29" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261594" uname="bl460g1n8" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261594">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <lrm_resource id="prmStonith6-1" type="external/ipmi" class="stonith">
Oct 21 11:20:58 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=bl460g1n8/crmd/9, version=0.14.29)
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmStonith6-1_last_0" operation_key="prmStonith6-1_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="19:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;19:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="42" rc-code="7" op-status="0" interval="0" last-run="1382322058" last-rc-change="1382322058" exec-time="0" queue-time="0" op-dig
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </lrm_resource>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.29 -> 0.14.30 (S_TRANSITION_ENGINE)
Oct 21 11:20:58 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmStonith6-2_monitor_0 (20) confirmed on bl460g1n8 (rc=0)
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 2 (Complete=10, Pending=17, Fired=0, Skipped=0, Incomplete=25, Source=/var/lib/pacemaker/pengine/pe-input-2.bz2): In-progress
Oct 21 11:20:58 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=bl460g1n8/crmd/10, version=0.14.30)
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.29
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.30 221423a864297eaa111cfd3bd0190dff
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="29"/>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="30" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261594" uname="bl460g1n8" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261594">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <lrm_resource id="prmStonith6-2" type="external/ssh" class="stonith">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmStonith6-2_last_0" operation_key="prmStonith6-2_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="20:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;20:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="46" rc-code="7" op-status="0" interval="0" last-run="1382322058" last-rc-change="1382322058" exec-time="1" queue-time="0" op-dig
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </lrm_resource>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.30 -> 0.14.31 (S_TRANSITION_ENGINE)
Oct 21 11:20:58 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmStonith7-1_monitor_0 (21) confirmed on bl460g1n8 (rc=0)
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 2 (Complete=11, Pending=16, Fired=0, Skipped=0, Incomplete=25, Source=/var/lib/pacemaker/pengine/pe-input-2.bz2): In-progress
Oct 21 11:20:58 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=bl460g1n8/crmd/11, version=0.14.31)
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.30
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.31 -> 0.14.32 (S_TRANSITION_ENGINE)
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.31 c7d8c80f115f9530438a29a2402ee953
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="30"/>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="31" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261594" uname="bl460g1n8" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261594">
Oct 21 11:20:58 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmStonith7-2_monitor_0 (22) confirmed on bl460g1n8 (rc=0)
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <lrm_resource id="prmStonith7-1" type="external/ipmi" class="stonith">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmStonith7-1_last_0" operation_key="prmStonith7-1_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="21:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;21:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="47" rc-code="7" op-status="0" interval="0" last-run="1382322058" last-rc-change="1382322058" exec-time="0" queue-time="0" op-dig
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 2 (Complete=12, Pending=15, Fired=0, Skipped=0, Incomplete=25, Source=/var/lib/pacemaker/pengine/pe-input-2.bz2): In-progress
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </lrm_resource>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:58 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=bl460g1n8/crmd/12, version=0.14.32)
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.31
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.32 a3e0450edc7f11ec9130e8dd7d1bdf14
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="31"/>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="32" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261594" uname="bl460g1n8" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261594">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <lrm_resource id="prmStonith7-2" type="external/ssh" class="stonith">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmStonith7-2_last_0" operation_key="prmStonith7-2_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="22:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;22:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="48" rc-code="7" op-status="0" interval="0" last-run="1382322058" last-rc-change="1382322058" exec-time="0" queue-time="0" op-dig
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </lrm_resource>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.32 -> 0.14.33 (S_TRANSITION_ENGINE)
Oct 21 11:20:58 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmStonith8-1_monitor_0 (23) confirmed on bl460g1n8 (rc=0)
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 2 (Complete=13, Pending=14, Fired=0, Skipped=0, Incomplete=25, Source=/var/lib/pacemaker/pengine/pe-input-2.bz2): In-progress
Oct 21 11:20:58 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=bl460g1n8/crmd/13, version=0.14.33)
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.32
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.33 ed37e61d151456b540b11e4d10f8a27a
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="32"/>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="33" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261594" uname="bl460g1n8" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261594">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <lrm_resource id="prmStonith8-1" type="external/ipmi" class="stonith">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmStonith8-1_last_0" operation_key="prmStonith8-1_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="23:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;23:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="49" rc-code="7" op-status="0" interval="0" last-run="1382322058" last-rc-change="1382322058" exec-time="0" queue-time="0" op-dig
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </lrm_resource>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.33 -> 0.14.34 (S_TRANSITION_ENGINE)
Oct 21 11:20:58 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmStonith8-2_monitor_0 (24) confirmed on bl460g1n8 (rc=0)
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 2 (Complete=14, Pending=13, Fired=0, Skipped=0, Incomplete=25, Source=/var/lib/pacemaker/pengine/pe-input-2.bz2): In-progress
Oct 21 11:20:58 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=bl460g1n8/crmd/14, version=0.14.34)
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.33
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.34 f83d0633807f424e7ab6350ce0c609b6
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="33"/>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="34" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261594" uname="bl460g1n8" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261594">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <lrm_resource id="prmStonith8-2" type="external/ssh" class="stonith">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmStonith8-2_last_0" operation_key="prmStonith8-2_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="24:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;24:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="50" rc-code="7" op-status="0" interval="0" last-run="1382322058" last-rc-change="1382322058" exec-time="0" queue-time="0" op-dig
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </lrm_resource>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.34 -> 0.14.35 (S_TRANSITION_ENGINE)
Oct 21 11:20:58 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=bl460g1n7/crmd/13, version=0.14.35)
Oct 21 11:20:58 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmStonith8-1_monitor_0 (12) confirmed on bl460g1n7 (rc=0)
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 2 (Complete=15, Pending=12, Fired=0, Skipped=0, Incomplete=25, Source=/var/lib/pacemaker/pengine/pe-input-2.bz2): In-progress
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.34
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.35 769f939a8a63a4135afc40aa40cb778e
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.35 -> 0.14.36 (S_TRANSITION_ENGINE)
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="34"/>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="35" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261593" uname="bl460g1n7" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261593">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <lrm_resource id="prmStonith8-1" type="external/ipmi" class="stonith">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmStonith8-1_last_0" operation_key="prmStonith8-1_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="12:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;12:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="49" rc-code="7" op-status="0" interval="0" last-run="1382322058" last-rc-change="1382322058" exec-time="0" queue-time="0" op-dig
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </lrm_resource>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:20:58 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmStonith8-2_monitor_0 (13) confirmed on bl460g1n7 (rc=0)
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 2 (Complete=16, Pending=11, Fired=0, Skipped=0, Incomplete=25, Source=/var/lib/pacemaker/pengine/pe-input-2.bz2): In-progress
Oct 21 11:20:58 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=bl460g1n7/crmd/14, version=0.14.36)
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.35
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.36 fb5694d74282160c36567dfbbd6f6f6d
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="35"/>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="36" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261593" uname="bl460g1n7" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261593">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <lrm_resource id="prmStonith8-2" type="external/ssh" class="stonith">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmStonith8-2_last_0" operation_key="prmStonith8-2_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="13:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;13:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="50" rc-code="7" op-status="0" interval="0" last-run="1382322058" last-rc-change="1382322058" exec-time="0" queue-time="0" op-dig
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </lrm_resource>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.36 -> 0.14.37 (S_TRANSITION_ENGINE)
Oct 21 11:20:58 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmPing_monitor_0 (14) confirmed on bl460g1n7 (rc=0)
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 2 (Complete=17, Pending=10, Fired=0, Skipped=0, Incomplete=25, Source=/var/lib/pacemaker/pengine/pe-input-2.bz2): In-progress
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.36
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.37 300c47c26e1e3e34d25904a562bfb35c
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="36"/>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="37" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261593" uname="bl460g1n7" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261593">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <lrm_resource id="prmPing" type="ping" class="ocf" provider="pacemaker">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmPing_last_0" operation_key="prmPing_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="14:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;14:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="51" rc-code="7" op-status="0" interval="0" last-run="1382322058" last-rc-change="1382322058" exec-time="14" queue-time="0" op-digest="bc586d
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </lrm_resource>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:58 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=bl460g1n7/crmd/15, version=0.14.37)
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.37 -> 0.14.38 (S_TRANSITION_ENGINE)
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.37
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.38 719cb3c4c7b2fbb3f36a9294a77c9317
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="37"/>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="38" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261594" uname="bl460g1n8" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261594">
Oct 21 11:20:58 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmPing_monitor_0 (25) confirmed on bl460g1n8 (rc=0)
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <lrm_resource id="prmPing" type="ping" class="ocf" provider="pacemaker">
Oct 21 11:20:58 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=bl460g1n8/crmd/15, version=0.14.38)
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmPing_last_0" operation_key="prmPing_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="25:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;25:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="51" rc-code="7" op-status="0" interval="0" last-run="1382322058" last-rc-change="1382322058" exec-time="13" queue-time="0" op-digest="bc586d
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </lrm_resource>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 2 (Complete=18, Pending=9, Fired=0, Skipped=0, Incomplete=25, Source=/var/lib/pacemaker/pengine/pe-input-2.bz2): In-progress
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
VirtualDomain(prmVM2)[8008]:	2013/10/21_11:20:58 DEBUG: Virtual domain vm2 is currently running.
VirtualDomain(prmVM1)[8007]:	2013/10/21_11:20:58 DEBUG: Virtual domain vm1 is currently running.
VirtualDomain(prmVM3)[8009]:	2013/10/21_11:20:58 DEBUG: Virtual domain vm3 is currently running.
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.38
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.39 f558149812a3a24db4ede4607378db23
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.38 -> 0.14.39 (S_TRANSITION_ENGINE)
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="38"/>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="39" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261594" uname="bl460g1n8" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261594">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <lrm_resource id="prmVM1" type="VirtualDomain" class="ocf" provider="heartbeat">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmVM1_last_0" operation_key="prmVM1_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="16:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;16:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="36" rc-code="7" op-status="0" interval="0" last-run="1382322058" last-rc-change="1382322058" exec-time="165" queue-time="0" op-digest="0d45152
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </lrm_resource>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:58 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmVM1_monitor_0 (16) confirmed on bl460g1n8 (rc=0)
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 2 (Complete=19, Pending=8, Fired=0, Skipped=0, Incomplete=25, Source=/var/lib/pacemaker/pengine/pe-input-2.bz2): In-progress
Oct 21 11:20:58 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=bl460g1n8/crmd/16, version=0.14.39)
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.39 -> 0.14.40 (S_TRANSITION_ENGINE)
Oct 21 11:20:58 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmVM2_monitor_0 (17) confirmed on bl460g1n8 (rc=0)
Oct 21 11:20:58 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=bl460g1n8/crmd/17, version=0.14.40)
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 2 (Complete=20, Pending=7, Fired=0, Skipped=0, Incomplete=25, Source=/var/lib/pacemaker/pengine/pe-input-2.bz2): In-progress
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.39
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.40 0753084528059ab3fb3cb051111da97b
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="39"/>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="40" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261594" uname="bl460g1n8" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261594">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <lrm_resource id="prmVM2" type="VirtualDomain" class="ocf" provider="heartbeat">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmVM2_last_0" operation_key="prmVM2_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="17:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;17:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="38" rc-code="7" op-status="0" interval="0" last-run="1382322058" last-rc-change="1382322058" exec-time="166" queue-time="0" op-digest="655164b
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </lrm_resource>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.40 -> 0.14.41 (S_TRANSITION_ENGINE)
Oct 21 11:20:58 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmVM3_monitor_0 (18) confirmed on bl460g1n8 (rc=0)
Oct 21 11:20:58 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=bl460g1n8/crmd/18, version=0.14.41)
Oct 21 11:20:58 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 15: probe_complete probe_complete on bl460g1n8 - no waiting
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.40
Oct 21 11:20:58 [7685] bl460g1n6       crmd:     info: te_rsc_command: 	Action 15 confirmed - no wait
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.41 b78acb96eb5cb5159d0a07535306bb38
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="40"/>
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 2 (Complete=21, Pending=6, Fired=1, Skipped=0, Incomplete=24, Source=/var/lib/pacemaker/pengine/pe-input-2.bz2): In-progress
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="41" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261594" uname="bl460g1n8" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261594">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <lrm_resource id="prmVM3" type="VirtualDomain" class="ocf" provider="heartbeat">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmVM3_last_0" operation_key="prmVM3_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="18:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;18:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="40" rc-code="7" op-status="0" interval="0" last-run="1382322058" last-rc-change="1382322058" exec-time="164" queue-time="0" op-digest="b48a219
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </lrm_resource>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 2 (Complete=22, Pending=6, Fired=0, Skipped=0, Incomplete=24, Source=/var/lib/pacemaker/pengine/pe-input-2.bz2): In-progress
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.41 -> 0.14.42 (S_TRANSITION_ENGINE)
Oct 21 11:20:58 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmVM2_monitor_0 (6) confirmed on bl460g1n7 (rc=0)
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 2 (Complete=23, Pending=5, Fired=0, Skipped=0, Incomplete=24, Source=/var/lib/pacemaker/pengine/pe-input-2.bz2): In-progress
Oct 21 11:20:58 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=bl460g1n7/crmd/16, version=0.14.42)
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.41
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.42 a5b07488a4ef453f937c334579953d36
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="41"/>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="42" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261593" uname="bl460g1n7" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261593">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <lrm_resource id="prmVM2" type="VirtualDomain" class="ocf" provider="heartbeat">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmVM2_last_0" operation_key="prmVM2_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="6:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;6:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="38" rc-code="7" op-status="0" interval="0" last-run="1382322058" last-rc-change="1382322058" exec-time="169" queue-time="0" op-digest="655164bb0
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </lrm_resource>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.42 -> 0.14.43 (S_TRANSITION_ENGINE)
Oct 21 11:20:58 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmVM1_monitor_0 (5) confirmed on bl460g1n7 (rc=0)
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 2 (Complete=24, Pending=4, Fired=0, Skipped=0, Incomplete=24, Source=/var/lib/pacemaker/pengine/pe-input-2.bz2): In-progress
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.42
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.43 58bbda448880822728cc72d00b3f4523
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="42"/>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="43" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261593" uname="bl460g1n7" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261593">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:20:58 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=bl460g1n7/crmd/17, version=0.14.43)
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <lrm_resource id="prmVM1" type="VirtualDomain" class="ocf" provider="heartbeat">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmVM1_last_0" operation_key="prmVM1_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="5:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;5:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="36" rc-code="7" op-status="0" interval="0" last-run="1382322058" last-rc-change="1382322058" exec-time="180" queue-time="0" op-digest="0d451529e
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </lrm_resource>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:58 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=bl460g1n7/crmd/18, version=0.14.44)
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.43 -> 0.14.44 (S_TRANSITION_ENGINE)
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.43
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.44 7bff461dad999485b58eb6e7872083ea
Oct 21 11:20:58 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmVM3_monitor_0 (7) confirmed on bl460g1n7 (rc=0)
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="43"/>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="44" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261593" uname="bl460g1n7" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261593">
Oct 21 11:20:58 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 4: probe_complete probe_complete on bl460g1n7 - no waiting
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:20:58 [7685] bl460g1n6       crmd:     info: te_rsc_command: 	Action 4 confirmed - no wait
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <lrm_resource id="prmVM3" type="VirtualDomain" class="ocf" provider="heartbeat">
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: te_pseudo_action: 	Pseudo action 2 fired and confirmed
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmVM3_last_0" operation_key="prmVM3_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="7:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;7:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="40" rc-code="7" op-status="0" interval="0" last-run="1382322058" last-rc-change="1382322058" exec-time="174" queue-time="0" op-digest="b48a219e0
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           </lrm_resource>
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 2 (Complete=25, Pending=3, Fired=2, Skipped=0, Incomplete=22, Source=/var/lib/pacemaker/pengine/pe-input-2.bz2): In-progress
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:58 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 35: start prmStonith6-1_start_0 on bl460g1n7
Oct 21 11:20:58 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 43: start prmStonith7-1_start_0 on bl460g1n8
Oct 21 11:20:58 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 51: start prmStonith8-1_start_0 on bl460g1n7
Oct 21 11:20:58 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 59: start prmPing_start_0 on bl460g1n6 (local)
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: do_lrm_rsc_op: 	Stopped 0 recurring operations in preparation for prmPing_start_0
Oct 21 11:20:58 [7685] bl460g1n6       crmd:     info: do_lrm_rsc_op: 	Performing key=59:2:0:38db9a68-056c-4e65-8658-75f0c3cc91e5 op=prmPing_start_0
Oct 21 11:20:58 [7682] bl460g1n6       lrmd:    debug: process_lrmd_message: 	Processed lrmd_rsc_exec operation from 666c66a8-da92-4298-beea-12fd671d2b0d: rc=46, reply=1, notify=0, exit=4201864
Oct 21 11:20:58 [7682] bl460g1n6       lrmd:     info: log_execute: 	executing - rsc:prmPing action:start call_id:46
Oct 21 11:20:58 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 61: start prmPing:1_start_0 on bl460g1n7
Oct 21 11:20:58 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 63: start prmPing:2_start_0 on bl460g1n8
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 2 (Complete=27, Pending=9, Fired=6, Skipped=0, Incomplete=16, Source=/var/lib/pacemaker/pengine/pe-input-2.bz2): In-progress
Oct 21 11:20:58 [7680] bl460g1n6        cib:     info: crm_client_new: 	Connecting 0xc52380 for uid=0 gid=0 pid=8101 id=49e5cb57-216e-483d-8a31-a21a1fb73427
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: handle_new_connection: 	IPC credentials authenticated (7680-8101-14)
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_ipcs_shm_connect: 	connecting to client [8101]
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:58 [8101] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:58 [8101] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:58 [8101] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:58 [8101] bl460g1n6 crm_resource:    debug: cib_native_signon_raw: 	Connection to CIB successful
Oct 21 11:20:58 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crm_resource/2, version=0.14.44)
Oct 21 11:20:58 [8101] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH timeout: 60000
Oct 21 11:20:58 [8101] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH of failed nodes is enabled
Oct 21 11:20:58 [8101] bl460g1n6 crm_resource:    debug: unpack_config: 	Stop all active resources: false
Oct 21 11:20:58 [8101] bl460g1n6 crm_resource:    debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Oct 21 11:20:58 [8101] bl460g1n6 crm_resource:    debug: unpack_config: 	Default stickiness: 0
Oct 21 11:20:58 [8101] bl460g1n6 crm_resource:    debug: unpack_config: 	On loss of CCM Quorum: Freeze resources
Oct 21 11:20:58 [8101] bl460g1n6 crm_resource:    debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Oct 21 11:20:58 [7680] bl460g1n6        cib:     info: crm_client_new: 	Connecting 0xc070f0 for uid=0 gid=0 pid=8104 id=a0b0bcb3-6534-4a6b-b171-e8f0e3239af9
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: handle_new_connection: 	IPC credentials authenticated (7680-8104-15)
Oct 21 11:20:58 [8101] bl460g1n6 crm_resource:  warning: handle_startup_fencing: 	Blind faith: not fencing unseen nodes
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_ipcs_shm_connect: 	connecting to client [8104]
Oct 21 11:20:58 [8101] bl460g1n6 crm_resource:    debug: unpack_domains: 	Unpacking domains
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:58 [8101] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n8 is active
Oct 21 11:20:58 [8101] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n8 is online
Oct 21 11:20:58 [8101] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n6 is active
Oct 21 11:20:58 [8101] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n6 is online
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:58 [8101] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n7 is active
Oct 21 11:20:58 [8101] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n7 is online
Oct 21 11:20:58 [8101] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n8 to prmPing:0
Oct 21 11:20:58 [8104] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:58 [8101] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n6 to prmPing:0
Oct 21 11:20:58 [8104] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:58 [8104] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:58 [8101] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM1_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:20:58 [8101] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM1 active on bl460g1n6
Oct 21 11:20:58 [8101] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM2_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:20:58 [8101] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM2 active on bl460g1n6
Oct 21 11:20:58 [8101] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM3_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:20:58 [8101] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM3 active on bl460g1n6
Oct 21 11:20:58 [8101] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n7 to prmPing:0
Oct 21 11:20:58 [8101] bl460g1n6 crm_resource:    debug: dump_resource_attr: 	Looking up cpu in prmVM2
Oct 21 11:20:58 [8104] bl460g1n6 crm_resource:    debug: cib_native_signon_raw: 	Connection to CIB successful
Oct 21 11:20:58 [8101] bl460g1n6 crm_resource:    debug: cib_native_signoff: 	Signing out of the CIB Service
Oct 21 11:20:58 [8101] bl460g1n6 crm_resource:    debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Oct 21 11:20:58 [8101] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-request-7680-8101-14-header
Oct 21 11:20:58 [8101] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-response-7680-8101-14-header
Oct 21 11:20:58 [8101] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-event-7680-8101-14-header
Oct 21 11:20:58 [8101] bl460g1n6 crm_resource:     info: crm_xml_cleanup: 	Cleaning up memory from libxml2
Oct 21 11:20:58 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crm_resource/2, version=0.14.44)
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_ipcs_dispatch_connection_request: 	HUP conn (7680-8101-14)
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(7680-8101-14) state:2
Oct 21 11:20:58 [7680] bl460g1n6        cib:     info: crm_client_destroy: 	Destroying 0 events
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-7680-8101-14-header
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-7680-8101-14-header
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-7680-8101-14-header
Oct 21 11:20:58 [7680] bl460g1n6        cib:     info: crm_client_new: 	Connecting 0xc07650 for uid=0 gid=0 pid=8108 id=da3290c2-8a0e-415c-a856-75762ee66e39
Oct 21 11:20:58 [8104] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH timeout: 60000
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: handle_new_connection: 	IPC credentials authenticated (7680-8108-14)
Oct 21 11:20:58 [8104] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH of failed nodes is enabled
Oct 21 11:20:58 [8104] bl460g1n6 crm_resource:    debug: unpack_config: 	Stop all active resources: false
Oct 21 11:20:58 [8104] bl460g1n6 crm_resource:    debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Oct 21 11:20:58 [8104] bl460g1n6 crm_resource:    debug: unpack_config: 	Default stickiness: 0
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_ipcs_shm_connect: 	connecting to client [8108]
Oct 21 11:20:58 [8104] bl460g1n6 crm_resource:    debug: unpack_config: 	On loss of CCM Quorum: Freeze resources
Oct 21 11:20:58 [8104] bl460g1n6 crm_resource:    debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Oct 21 11:20:58 [8104] bl460g1n6 crm_resource:  warning: handle_startup_fencing: 	Blind faith: not fencing unseen nodes
Oct 21 11:20:58 [8104] bl460g1n6 crm_resource:    debug: unpack_domains: 	Unpacking domains
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:58 [8104] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n8 is active
Oct 21 11:20:58 [8104] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n8 is online
Oct 21 11:20:58 [8104] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n6 is active
Oct 21 11:20:58 [8104] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n6 is online
Oct 21 11:20:58 [8104] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n7 is active
Oct 21 11:20:58 [8104] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n7 is online
Oct 21 11:20:58 [8104] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n8 to prmPing:0
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:58 [8104] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n6 to prmPing:0
Oct 21 11:20:58 [8104] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM1_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:20:58 [8104] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM1 active on bl460g1n6
Oct 21 11:20:58 [8104] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM2_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:20:58 [8104] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM2 active on bl460g1n6
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:58 [8104] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM3_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:20:58 [8104] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM3 active on bl460g1n6
Oct 21 11:20:58 [8104] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n7 to prmPing:0
Oct 21 11:20:58 [8104] bl460g1n6 crm_resource:    debug: dump_resource_attr: 	Looking up cpu in prmVM1
Oct 21 11:20:58 [8108] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:58 [8108] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:58 [8108] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:58 [8104] bl460g1n6 crm_resource:    debug: cib_native_signoff: 	Signing out of the CIB Service
Oct 21 11:20:58 [8104] bl460g1n6 crm_resource:    debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_ipcs_dispatch_connection_request: 	HUP conn (7680-8104-15)
Oct 21 11:20:58 [8104] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-request-7680-8104-15-header
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(7680-8104-15) state:2
Oct 21 11:20:58 [8104] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-response-7680-8104-15-header
Oct 21 11:20:58 [7680] bl460g1n6        cib:     info: crm_client_destroy: 	Destroying 0 events
Oct 21 11:20:58 [8104] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-event-7680-8104-15-header
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-7680-8104-15-header
Oct 21 11:20:58 [8104] bl460g1n6 crm_resource:     info: crm_xml_cleanup: 	Cleaning up memory from libxml2
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-7680-8104-15-header
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-7680-8104-15-header
Oct 21 11:20:58 [8108] bl460g1n6 crm_resource:    debug: cib_native_signon_raw: 	Connection to CIB successful
Oct 21 11:20:58 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crm_resource/2, version=0.14.44)
Oct 21 11:20:58 [8108] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH timeout: 60000
Oct 21 11:20:58 [8108] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH of failed nodes is enabled
Oct 21 11:20:58 [8108] bl460g1n6 crm_resource:    debug: unpack_config: 	Stop all active resources: false
Oct 21 11:20:58 [8108] bl460g1n6 crm_resource:    debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Oct 21 11:20:58 [8108] bl460g1n6 crm_resource:    debug: unpack_config: 	Default stickiness: 0
Oct 21 11:20:58 [8108] bl460g1n6 crm_resource:    debug: unpack_config: 	On loss of CCM Quorum: Freeze resources
Oct 21 11:20:58 [8108] bl460g1n6 crm_resource:    debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Oct 21 11:20:58 [8108] bl460g1n6 crm_resource:  warning: handle_startup_fencing: 	Blind faith: not fencing unseen nodes
Oct 21 11:20:58 [8108] bl460g1n6 crm_resource:    debug: unpack_domains: 	Unpacking domains
Oct 21 11:20:58 [8108] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n8 is active
Oct 21 11:20:58 [8108] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n8 is online
Oct 21 11:20:58 [8108] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n6 is active
Oct 21 11:20:58 [8108] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n6 is online
Oct 21 11:20:58 [8108] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n7 is active
Oct 21 11:20:58 [8108] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n7 is online
Oct 21 11:20:58 [8108] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n8 to prmPing:0
Oct 21 11:20:58 [8108] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n6 to prmPing:0
Oct 21 11:20:58 [8108] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM1_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:20:58 [8108] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM1 active on bl460g1n6
Oct 21 11:20:58 [8108] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM2_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:20:58 [8108] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM2 active on bl460g1n6
Oct 21 11:20:58 [8108] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM3_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:20:58 [8108] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM3 active on bl460g1n6
Oct 21 11:20:58 [8108] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n7 to prmPing:0
Oct 21 11:20:58 [8108] bl460g1n6 crm_resource:    debug: dump_resource_attr: 	Looking up cpu in prmVM3
Oct 21 11:20:58 [8108] bl460g1n6 crm_resource:    debug: cib_native_signoff: 	Signing out of the CIB Service
Oct 21 11:20:58 [8108] bl460g1n6 crm_resource:    debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Oct 21 11:20:58 [8108] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-request-7680-8108-14-header
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_ipcs_dispatch_connection_request: 	HUP conn (7680-8108-14)
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(7680-8108-14) state:2
Oct 21 11:20:58 [8108] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-response-7680-8108-14-header
Oct 21 11:20:58 [8108] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-event-7680-8108-14-header
Oct 21 11:20:58 [8108] bl460g1n6 crm_resource:     info: crm_xml_cleanup: 	Cleaning up memory from libxml2
Oct 21 11:20:58 [7680] bl460g1n6        cib:     info: crm_client_destroy: 	Destroying 0 events
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-7680-8108-14-header
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-7680-8108-14-header
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-7680-8108-14-header
Oct 21 11:20:58 [7680] bl460g1n6        cib:     info: crm_client_new: 	Connecting 0xc52380 for uid=0 gid=0 pid=8134 id=6cfb264d-e486-446b-9dcc-058d12f07ea2
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: handle_new_connection: 	IPC credentials authenticated (7680-8134-14)
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_ipcs_shm_connect: 	connecting to client [8134]
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:58 [8134] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:58 [8134] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:58 [8134] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:58 [8134] bl460g1n6 crm_resource:    debug: cib_native_signon_raw: 	Connection to CIB successful
Oct 21 11:20:58 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crm_resource/2, version=0.14.44)
Oct 21 11:20:58 [8134] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH timeout: 60000
Oct 21 11:20:58 [8134] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH of failed nodes is enabled
Oct 21 11:20:58 [8134] bl460g1n6 crm_resource:    debug: unpack_config: 	Stop all active resources: false
Oct 21 11:20:58 [8134] bl460g1n6 crm_resource:    debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Oct 21 11:20:58 [8134] bl460g1n6 crm_resource:    debug: unpack_config: 	Default stickiness: 0
Oct 21 11:20:58 [8134] bl460g1n6 crm_resource:    debug: unpack_config: 	On loss of CCM Quorum: Freeze resources
Oct 21 11:20:58 [8134] bl460g1n6 crm_resource:    debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Oct 21 11:20:58 [8134] bl460g1n6 crm_resource:  warning: handle_startup_fencing: 	Blind faith: not fencing unseen nodes
Oct 21 11:20:58 [8134] bl460g1n6 crm_resource:    debug: unpack_domains: 	Unpacking domains
Oct 21 11:20:58 [8134] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n8 is active
Oct 21 11:20:58 [8134] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n8 is online
Oct 21 11:20:58 [8134] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n6 is active
Oct 21 11:20:58 [8134] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n6 is online
Oct 21 11:20:58 [8134] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n7 is active
Oct 21 11:20:58 [8134] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n7 is online
Oct 21 11:20:58 [8134] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n8 to prmPing:0
Oct 21 11:20:58 [8134] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n6 to prmPing:0
Oct 21 11:20:58 [8134] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM1_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:20:58 [8134] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM1 active on bl460g1n6
Oct 21 11:20:58 [8134] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM2_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:20:58 [8134] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM2 active on bl460g1n6
Oct 21 11:20:58 [8134] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM3_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:20:58 [8134] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM3 active on bl460g1n6
Oct 21 11:20:58 [8134] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n7 to prmPing:0
Oct 21 11:20:58 [8134] bl460g1n6 crm_resource:    debug: dump_resource_attr: 	Looking up hv_memory in prmVM2
Oct 21 11:20:58 [8134] bl460g1n6 crm_resource:    debug: cib_native_signoff: 	Signing out of the CIB Service
Oct 21 11:20:58 [8134] bl460g1n6 crm_resource:    debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Oct 21 11:20:58 [8134] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-request-7680-8134-14-header
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_ipcs_dispatch_connection_request: 	HUP conn (7680-8134-14)
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(7680-8134-14) state:2
Oct 21 11:20:58 [8134] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-response-7680-8134-14-header
Oct 21 11:20:58 [8134] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-event-7680-8134-14-header
Oct 21 11:20:58 [8134] bl460g1n6 crm_resource:     info: crm_xml_cleanup: 	Cleaning up memory from libxml2
Oct 21 11:20:58 [7680] bl460g1n6        cib:     info: crm_client_destroy: 	Destroying 0 events
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-7680-8134-14-header
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-7680-8134-14-header
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-7680-8134-14-header
Oct 21 11:20:58 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmVM2_monitor_10000:8008 - exited with rc=0
Oct 21 11:20:58 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmVM2_monitor_10000:8008:stderr [ -- empty -- ]
Oct 21 11:20:58 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmVM2_monitor_10000:8008:stdout [ -- empty -- ]
Oct 21 11:20:58 [7682] bl460g1n6       lrmd:    debug: log_finished: 	finished - rsc:prmVM2 action:monitor call_id:44 pid:8008 exit-code:0 exec-time:381ms queue-time:0ms
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: create_operation_update: 	do_update_resource: Updating resource prmVM2 after monitor op complete (interval=10000)
Oct 21 11:20:58 [7685] bl460g1n6       crmd:   notice: process_lrm_event: 	LRM operation prmVM2_monitor_10000 (call=44, rc=0, cib-update=90, confirmed=false) ok
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: update_history_cache: 	Updating history for 'prmVM2' with monitor op
Oct 21 11:20:58 [7680] bl460g1n6        cib:     info: crm_client_new: 	Connecting 0xc52380 for uid=0 gid=0 pid=8136 id=a097d97f-e6b4-4d46-b611-fff2be8848b0
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: handle_new_connection: 	IPC credentials authenticated (7680-8136-14)
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_ipcs_shm_connect: 	connecting to client [8136]
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:58 [8136] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:58 [8136] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:58 [8136] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.44
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.45 eed6e86546c5e71d5270bc6f6b1be8a6
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-  <cib num_updates="44">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-    <status>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--     <node_state crm-debug-origin="do_state_transition" id="3232261592"/>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-    </status>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-  </cib>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="45" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261592" uname="bl460g1n6" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261592">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+            <lrm_resource id="prmVM2" type="VirtualDomain" class="ocf" provider="heartbeat">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmVM2_monitor_10000" operation_key="prmVM2_monitor_10000" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="31:2:0:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:0;31:2:0:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="44" rc-code="0" op-status="0" interval="10000" last-rc-change="1382322058" exec-time="381" queue-time="0" op-digest="120b5eaffe2214
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+            </lrm_resource>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.44 -> 0.14.45 (S_TRANSITION_ENGINE)
Oct 21 11:20:58 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmVM2_monitor_10000 (31) confirmed on bl460g1n6 (rc=0)
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 2 (Complete=28, Pending=8, Fired=0, Skipped=0, Incomplete=16, Source=/var/lib/pacemaker/pengine/pe-input-2.bz2): In-progress
Oct 21 11:20:58 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/crmd/90, version=0.14.45)
Oct 21 11:20:58 [7680] bl460g1n6        cib:     info: crm_client_new: 	Connecting 0xbf9300 for uid=0 gid=0 pid=8138 id=fecab1ed-a7f9-42c6-8169-1ade3d33f909
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: handle_new_connection: 	IPC credentials authenticated (7680-8138-15)
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_ipcs_shm_connect: 	connecting to client [8138]
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:58 [8138] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:58 [8138] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:58 [8138] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:58 [8136] bl460g1n6 crm_resource:    debug: cib_native_signon_raw: 	Connection to CIB successful
Oct 21 11:20:58 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crm_resource/2, version=0.14.45)
Oct 21 11:20:58 [8138] bl460g1n6 crm_resource:    debug: cib_native_signon_raw: 	Connection to CIB successful
Oct 21 11:20:58 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crm_resource/2, version=0.14.45)
Oct 21 11:20:58 [8136] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH timeout: 60000
Oct 21 11:20:58 [8136] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH of failed nodes is enabled
Oct 21 11:20:58 [8136] bl460g1n6 crm_resource:    debug: unpack_config: 	Stop all active resources: false
Oct 21 11:20:58 [8136] bl460g1n6 crm_resource:    debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Oct 21 11:20:58 [8136] bl460g1n6 crm_resource:    debug: unpack_config: 	Default stickiness: 0
Oct 21 11:20:58 [8136] bl460g1n6 crm_resource:    debug: unpack_config: 	On loss of CCM Quorum: Freeze resources
Oct 21 11:20:58 [8136] bl460g1n6 crm_resource:    debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Oct 21 11:20:58 [8136] bl460g1n6 crm_resource:  warning: handle_startup_fencing: 	Blind faith: not fencing unseen nodes
Oct 21 11:20:58 [8136] bl460g1n6 crm_resource:    debug: unpack_domains: 	Unpacking domains
Oct 21 11:20:58 [8136] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n8 is active
Oct 21 11:20:58 [8136] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n8 is online
Oct 21 11:20:58 [8136] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n6 is active
Oct 21 11:20:58 [8136] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n6 is online
Oct 21 11:20:58 [8136] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n7 is active
Oct 21 11:20:58 [8136] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n7 is online
Oct 21 11:20:58 [8136] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n8 to prmPing:0
Oct 21 11:20:58 [8136] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n6 to prmPing:0
Oct 21 11:20:58 [8136] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM1_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:20:58 [8136] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM1 active on bl460g1n6
Oct 21 11:20:58 [8136] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM2_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:20:58 [8136] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM2 active on bl460g1n6
Oct 21 11:20:58 [8136] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM3_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:20:58 [8136] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM3 active on bl460g1n6
Oct 21 11:20:58 [8136] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n7 to prmPing:0
Oct 21 11:20:58 [8136] bl460g1n6 crm_resource:    debug: dump_resource_attr: 	Looking up hv_memory in prmVM1
Oct 21 11:20:58 [8136] bl460g1n6 crm_resource:    debug: cib_native_signoff: 	Signing out of the CIB Service
Oct 21 11:20:58 [8136] bl460g1n6 crm_resource:    debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Oct 21 11:20:58 [8136] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-request-7680-8136-14-header
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_ipcs_dispatch_connection_request: 	HUP conn (7680-8136-14)
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(7680-8136-14) state:2
Oct 21 11:20:58 [8136] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-response-7680-8136-14-header
Oct 21 11:20:58 [8136] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-event-7680-8136-14-header
Oct 21 11:20:58 [8136] bl460g1n6 crm_resource:     info: crm_xml_cleanup: 	Cleaning up memory from libxml2
Oct 21 11:20:58 [7680] bl460g1n6        cib:     info: crm_client_destroy: 	Destroying 0 events
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-7680-8136-14-header
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-7680-8136-14-header
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-7680-8136-14-header
Oct 21 11:20:58 [8138] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH timeout: 60000
Oct 21 11:20:58 [8138] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH of failed nodes is enabled
Oct 21 11:20:58 [8138] bl460g1n6 crm_resource:    debug: unpack_config: 	Stop all active resources: false
Oct 21 11:20:58 [8138] bl460g1n6 crm_resource:    debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Oct 21 11:20:58 [8138] bl460g1n6 crm_resource:    debug: unpack_config: 	Default stickiness: 0
Oct 21 11:20:58 [8138] bl460g1n6 crm_resource:    debug: unpack_config: 	On loss of CCM Quorum: Freeze resources
Oct 21 11:20:58 [8138] bl460g1n6 crm_resource:    debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Oct 21 11:20:58 [8138] bl460g1n6 crm_resource:  warning: handle_startup_fencing: 	Blind faith: not fencing unseen nodes
Oct 21 11:20:58 [8138] bl460g1n6 crm_resource:    debug: unpack_domains: 	Unpacking domains
Oct 21 11:20:58 [8138] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n8 is active
Oct 21 11:20:58 [8138] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n8 is online
Oct 21 11:20:58 [8138] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n6 is active
Oct 21 11:20:58 [8138] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n6 is online
Oct 21 11:20:58 [8138] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n7 is active
Oct 21 11:20:58 [8138] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n7 is online
Oct 21 11:20:58 [8138] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n8 to prmPing:0
Oct 21 11:20:58 [8138] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n6 to prmPing:0
Oct 21 11:20:58 [8138] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM1_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:20:58 [8138] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM1 active on bl460g1n6
Oct 21 11:20:58 [8138] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM2_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:20:58 [8138] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM2 active on bl460g1n6
Oct 21 11:20:58 [8138] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM3_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:20:58 [8138] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM3 active on bl460g1n6
Oct 21 11:20:58 [8138] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n7 to prmPing:0
Oct 21 11:20:58 [8138] bl460g1n6 crm_resource:    debug: dump_resource_attr: 	Looking up hv_memory in prmVM3
Oct 21 11:20:58 [8138] bl460g1n6 crm_resource:    debug: cib_native_signoff: 	Signing out of the CIB Service
Oct 21 11:20:58 [8138] bl460g1n6 crm_resource:    debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Oct 21 11:20:58 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmVM1_monitor_10000:8007 - exited with rc=0
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_ipcs_dispatch_connection_request: 	HUP conn (7680-8138-15)
Oct 21 11:20:58 [8138] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-request-7680-8138-15-header
Oct 21 11:20:58 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmVM1_monitor_10000:8007:stderr [ -- empty -- ]
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(7680-8138-15) state:2
Oct 21 11:20:58 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmVM1_monitor_10000:8007:stdout [ -- empty -- ]
Oct 21 11:20:58 [8138] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-response-7680-8138-15-header
Oct 21 11:20:58 [7682] bl460g1n6       lrmd:    debug: log_finished: 	finished - rsc:prmVM1 action:monitor call_id:43 pid:8007 exit-code:0 exec-time:416ms queue-time:0ms
Oct 21 11:20:58 [8138] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-event-7680-8138-15-header
Oct 21 11:20:58 [7680] bl460g1n6        cib:     info: crm_client_destroy: 	Destroying 0 events
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-7680-8138-15-header
Oct 21 11:20:58 [8138] bl460g1n6 crm_resource:     info: crm_xml_cleanup: 	Cleaning up memory from libxml2
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-7680-8138-15-header
Oct 21 11:20:58 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-7680-8138-15-header
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: create_operation_update: 	do_update_resource: Updating resource prmVM1 after monitor op complete (interval=10000)
Oct 21 11:20:58 [7685] bl460g1n6       crmd:   notice: process_lrm_event: 	LRM operation prmVM1_monitor_10000 (call=43, rc=0, cib-update=91, confirmed=false) ok
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: update_history_cache: 	Updating history for 'prmVM1' with monitor op
Oct 21 11:20:58 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmVM3_monitor_10000:8009 - exited with rc=0
Oct 21 11:20:58 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmVM3_monitor_10000:8009:stderr [ -- empty -- ]
Oct 21 11:20:58 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmVM3_monitor_10000:8009:stdout [ -- empty -- ]
Oct 21 11:20:58 [7682] bl460g1n6       lrmd:    debug: log_finished: 	finished - rsc:prmVM3 action:monitor call_id:45 pid:8009 exit-code:0 exec-time:415ms queue-time:0ms
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: create_operation_update: 	do_update_resource: Updating resource prmVM3 after monitor op complete (interval=10000)
Oct 21 11:20:58 [7685] bl460g1n6       crmd:   notice: process_lrm_event: 	LRM operation prmVM3_monitor_10000 (call=45, rc=0, cib-update=92, confirmed=false) ok
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: update_history_cache: 	Updating history for 'prmVM3' with monitor op
Oct 21 11:20:58 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/crmd/91, version=0.14.46)
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.45 -> 0.14.46 (S_TRANSITION_ENGINE)
Oct 21 11:20:58 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmVM1_monitor_10000 (28) confirmed on bl460g1n6 (rc=0)
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 2 (Complete=29, Pending=7, Fired=0, Skipped=0, Incomplete=16, Source=/var/lib/pacemaker/pengine/pe-input-2.bz2): In-progress
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.45
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.46 a337457ecddcd3f2ec94f44bba664534
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="45"/>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="46" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261592" uname="bl460g1n6" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261592">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+            <lrm_resource id="prmVM1" type="VirtualDomain" class="ocf" provider="heartbeat">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmVM1_monitor_10000" operation_key="prmVM1_monitor_10000" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="28:2:0:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:0;28:2:0:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="43" rc-code="0" op-status="0" interval="10000" last-rc-change="1382322058" exec-time="416" queue-time="0" op-digest="a1884217e7bc99
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+            </lrm_resource>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:58 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/crmd/92, version=0.14.47)
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.46 -> 0.14.47 (S_TRANSITION_ENGINE)
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.46
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.47 88e1680a2812c102641078f2db165a21
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="46"/>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="47" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261592" uname="bl460g1n6" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261592">
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+            <lrm_resource id="prmVM3" type="VirtualDomain" class="ocf" provider="heartbeat">
Oct 21 11:20:58 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmVM3_monitor_10000 (34) confirmed on bl460g1n6 (rc=0)
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmVM3_monitor_10000" operation_key="prmVM3_monitor_10000" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="34:2:0:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:0;34:2:0:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="45" rc-code="0" op-status="0" interval="10000" last-rc-change="1382322058" exec-time="415" queue-time="0" op-digest="68b4c7da73e899
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+            </lrm_resource>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:20:58 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 2 (Complete=30, Pending=6, Fired=0, Skipped=0, Incomplete=16, Source=/var/lib/pacemaker/pengine/pe-input-2.bz2): In-progress
Oct 21 11:20:58 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:59 [8143] bl460g1n6 attrd_updater:     info: attrd_update_delegate: 	Connecting to cluster... 5 retries remaining
Oct 21 11:20:59 [7683] bl460g1n6      attrd:     info: crm_client_new: 	Connecting 0x14e3150 for uid=0 gid=0 pid=8143 id=fbcdad55-6e3a-4e6a-a180-c2c9facba772
Oct 21 11:20:59 [7683] bl460g1n6      attrd:    debug: handle_new_connection: 	IPC credentials authenticated (7683-8143-10)
Oct 21 11:20:59 [7683] bl460g1n6      attrd:    debug: qb_ipcs_shm_connect: 	connecting to client [8143]
Oct 21 11:20:59 [7683] bl460g1n6      attrd:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:59 [7683] bl460g1n6      attrd:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:59 [7683] bl460g1n6      attrd:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:59 [8143] bl460g1n6 attrd_updater:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:59 [8143] bl460g1n6 attrd_updater:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:59 [8143] bl460g1n6 attrd_updater:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:20:59 [8143] bl460g1n6 attrd_updater:    debug: attrd_update_delegate: 	Sent update: default_ping_set=100 for localhost
Oct 21 11:20:59 [7683] bl460g1n6      attrd:     info: attrd_client_message: 	Broadcasting default_ping_set[bl460g1n6] = 100 (writer)
Oct 21 11:20:59 [8143] bl460g1n6 attrd_updater:     info: crm_xml_cleanup: 	Cleaning up memory from libxml2
Oct 21 11:20:59 [7683] bl460g1n6      attrd:    debug: qb_ipcs_dispatch_connection_request: 	HUP conn (7683-8143-10)
Oct 21 11:20:59 [7683] bl460g1n6      attrd:    debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(7683-8143-10) state:2
Oct 21 11:20:59 [7683] bl460g1n6      attrd:     info: crm_client_destroy: 	Destroying 0 events
Oct 21 11:20:59 [7683] bl460g1n6      attrd:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-attrd-response-7683-8143-10-header
Oct 21 11:20:59 [7683] bl460g1n6      attrd:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-attrd-event-7683-8143-10-header
Oct 21 11:20:59 [7683] bl460g1n6      attrd:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-attrd-request-7683-8143-10-header
Oct 21 11:20:59 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmPing_start_0:8102 - exited with rc=0
Oct 21 11:20:59 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmPing_start_0:8102:stderr [ -- empty -- ]
Oct 21 11:20:59 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmPing_start_0:8102:stdout [ -- empty -- ]
Oct 21 11:20:59 [7683] bl460g1n6      attrd:    debug: write_attributes: 	Skipping unchanged attribute shutdown
Oct 21 11:20:59 [7683] bl460g1n6      attrd:    debug: write_attribute: 	Update: bl460g1n6[default_ping_set]=100 (3232261592 3232261592 3232261592 bl460g1n6)
Oct 21 11:20:59 [7682] bl460g1n6       lrmd:     info: log_finished: 	finished - rsc:prmPing action:start call_id:46 pid:8102 exit-code:0 exec-time:1072ms queue-time:1ms
Oct 21 11:20:59 [7683] bl460g1n6      attrd:   notice: write_attribute: 	Sent update 10 with 1 changes for default_ping_set, id=<n/a>, set=(null)
Oct 21 11:20:59 [7683] bl460g1n6      attrd:    debug: write_attributes: 	Skipping unchanged attribute terminate
Oct 21 11:20:59 [7683] bl460g1n6      attrd:    debug: write_attributes: 	Skipping unchanged attribute probe_complete
Oct 21 11:20:59 [7685] bl460g1n6       crmd:    debug: create_operation_update: 	do_update_resource: Updating resource prmPing after start op complete (interval=0)
Oct 21 11:20:59 [7685] bl460g1n6       crmd:   notice: process_lrm_event: 	LRM operation prmPing_start_0 (call=46, rc=0, cib-update=93, confirmed=true) ok
Oct 21 11:20:59 [7685] bl460g1n6       crmd:    debug: update_history_cache: 	Updating history for 'prmPing' with start op
Oct 21 11:20:59 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/attrd/10, version=0.14.48)
Oct 21 11:20:59 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.47 -> 0.14.48 (S_TRANSITION_ENGINE)
Oct 21 11:20:59 [7685] bl460g1n6       crmd:     info: abort_transition_graph: 	te_update_diff:172 - Triggered transition abort (complete=0, node=bl460g1n6, tag=nvpair, id=status-3232261592-default_ping_set, name=default_ping_set, value=100, magic=NA, cib=0.14.48) : Transient attribute: update
Oct 21 11:20:59 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause   <nvpair id="status-3232261592-default_ping_set" name="default_ping_set" value="100" __crm_diff_marker__="added:top"/>
Oct 21 11:20:59 [7685] bl460g1n6       crmd:    debug: update_abort_priority: 	Abort priority upgraded from 0 to 1000000
Oct 21 11:20:59 [7685] bl460g1n6       crmd:    debug: update_abort_priority: 	Abort action done superceeded by restart
Oct 21 11:20:59 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 2 (Complete=30, Pending=6, Fired=0, Skipped=15, Incomplete=1, Source=/var/lib/pacemaker/pengine/pe-input-2.bz2): In-progress
Oct 21 11:20:59 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.47
Oct 21 11:20:59 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.48 00110705f4eb8155073b83a21445b54f
Oct 21 11:20:59 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="47"/>
Oct 21 11:20:59 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="48" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:59 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:20:59 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261592" uname="bl460g1n6" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:20:59 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <transient_attributes id="3232261592">
Oct 21 11:20:59 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <instance_attributes id="status-3232261592">
Oct 21 11:20:59 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <nvpair id="status-3232261592-default_ping_set" name="default_ping_set" value="100"/>
Oct 21 11:20:59 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </instance_attributes>
Oct 21 11:20:59 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </transient_attributes>
Oct 21 11:20:59 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:20:59 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:20:59 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:20:59 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/crmd/93, version=0.14.49)
Oct 21 11:20:59 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.48 -> 0.14.49 (S_TRANSITION_ENGINE)
Oct 21 11:20:59 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmPing_start_0 (59) confirmed on bl460g1n6 (rc=0)
Oct 21 11:20:59 [7683] bl460g1n6      attrd:     info: attrd_cib_callback: 	Update 10 for default_ping_set: OK (0)
Oct 21 11:20:59 [7683] bl460g1n6      attrd:   notice: attrd_cib_callback: 	Update 10 for default_ping_set[bl460g1n6]=100: OK (0)
Oct 21 11:20:59 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 2 (Complete=31, Pending=5, Fired=0, Skipped=15, Incomplete=1, Source=/var/lib/pacemaker/pengine/pe-input-2.bz2): In-progress
Oct 21 11:20:59 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.48
Oct 21 11:20:59 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.49 5359a8e48ee7f198f566b23886f802fe
Oct 21 11:20:59 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-  <cib num_updates="48">
Oct 21 11:20:59 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-    <status>
Oct 21 11:20:59 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-      <node_state id="3232261592">
Oct 21 11:20:59 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-        <lrm id="3232261592">
Oct 21 11:20:59 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-          <lrm_resources>
Oct 21 11:20:59 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-            <lrm_resource id="prmPing">
Oct 21 11:20:59 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--             <lrm_rsc_op operation_key="prmPing_monitor_0" operation="monitor" transition-key="13:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;13:1:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="42" rc-code="7" last-run="1382322053" last-rc-change="1382322053" exec-time="24" queue-time="0" id="prmPing_last_0"/>
Oct 21 11:20:59 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-            </lrm_resource>
Oct 21 11:20:59 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-          </lrm_resources>
Oct 21 11:20:59 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-        </lrm>
Oct 21 11:20:59 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-      </node_state>
Oct 21 11:20:59 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-    </status>
Oct 21 11:20:59 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-  </cib>
Oct 21 11:20:59 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="49" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:20:59 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:20:59 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261592" uname="bl460g1n6" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:20:59 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261592">
Oct 21 11:20:59 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:20:59 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+            <lrm_resource id="prmPing" type="ping" class="ocf" provider="pacemaker">
Oct 21 11:20:59 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmPing_last_0" operation_key="prmPing_start_0" operation="start" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="59:2:0:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:0;59:2:0:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="46" rc-code="0" op-status="0" interval="0" last-run="1382322058" last-rc-change="1382322058" exec-time="1072" queue-time="1" op-digest="bc586d1a
Oct 21 11:20:59 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+            </lrm_resource>
Oct 21 11:20:59 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:20:59 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:20:59 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:20:59 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:20:59 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:21:00 [7683] bl460g1n6      attrd:    debug: write_attributes: 	Skipping unchanged attribute shutdown
Oct 21 11:21:00 [7683] bl460g1n6      attrd:    debug: write_attribute: 	Update: bl460g1n6[default_ping_set]=100 (3232261592 3232261592 3232261592 bl460g1n6)
Oct 21 11:21:00 [7683] bl460g1n6      attrd:    debug: write_attribute: 	Update: bl460g1n8[default_ping_set]=100 (3232261594 3232261594 3232261594 bl460g1n8)
Oct 21 11:21:00 [7683] bl460g1n6      attrd:   notice: write_attribute: 	Sent update 11 with 2 changes for default_ping_set, id=<n/a>, set=(null)
Oct 21 11:21:00 [7683] bl460g1n6      attrd:    debug: write_attributes: 	Skipping unchanged attribute terminate
Oct 21 11:21:00 [7683] bl460g1n6      attrd:    debug: write_attributes: 	Skipping unchanged attribute probe_complete
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.49
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.50 e25e6318b862f1303dc69a868c1bc1a7
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="49"/>
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="50" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261594" uname="bl460g1n8" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       <transient_attributes id="3232261594">
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <instance_attributes id="status-3232261594">
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <nvpair id="status-3232261594-default_ping_set" name="default_ping_set" value="100"/>
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         </instance_attributes>
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       </transient_attributes>
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:21:00 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/attrd/11, version=0.14.50)
Oct 21 11:21:00 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.49 -> 0.14.50 (S_TRANSITION_ENGINE)
Oct 21 11:21:00 [7685] bl460g1n6       crmd:     info: abort_transition_graph: 	te_update_diff:172 - Triggered transition abort (complete=0, node=bl460g1n8, tag=nvpair, id=status-3232261594-default_ping_set, name=default_ping_set, value=100, magic=NA, cib=0.14.50) : Transient attribute: update
Oct 21 11:21:00 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause   <nvpair id="status-3232261594-default_ping_set" name="default_ping_set" value="100"/>
Oct 21 11:21:00 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 2 (Complete=31, Pending=5, Fired=0, Skipped=15, Incomplete=1, Source=/var/lib/pacemaker/pengine/pe-input-2.bz2): In-progress
Oct 21 11:21:00 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.50 -> 0.14.51 (S_TRANSITION_ENGINE)
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.50
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.51 ce4ad94d7d6fecb4b919df71db30c87a
Oct 21 11:21:00 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmPing_start_0 (63) confirmed on bl460g1n8 (rc=0)
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-  <cib num_updates="50">
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-    <status>
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-      <node_state id="3232261594">
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-        <lrm id="3232261594">
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-          <lrm_resources>
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-            <lrm_resource id="prmPing">
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--             <lrm_rsc_op operation_key="prmPing_monitor_0" operation="monitor" transition-key="25:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;25:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="51" rc-code="7" last-run="1382322058" last-rc-change="1382322058" exec-time="13" id="prmPing_last_0"/>
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-            </lrm_resource>
Oct 21 11:21:00 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 2 (Complete=32, Pending=4, Fired=0, Skipped=15, Incomplete=1, Source=/var/lib/pacemaker/pengine/pe-input-2.bz2): In-progress
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-          </lrm_resources>
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-        </lrm>
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-      </node_state>
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-    </status>
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-  </cib>
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="51" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261594" uname="bl460g1n8" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261594">
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+            <lrm_resource id="prmPing" type="ping" class="ocf" provider="pacemaker">
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmPing_last_0" operation_key="prmPing_start_0" operation="start" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="63:2:0:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:0;63:2:0:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="53" rc-code="0" op-status="0" interval="0" last-run="1382322059" last-rc-change="1382322059" exec-time="1035" queue-time="0" op-digest="bc586d1a
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+            </lrm_resource>
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:21:00 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=bl460g1n8/crmd/19, version=0.14.51)
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:21:00 [7683] bl460g1n6      attrd:     info: attrd_cib_callback: 	Update 11 for default_ping_set: OK (0)
Oct 21 11:21:00 [7683] bl460g1n6      attrd:   notice: attrd_cib_callback: 	Update 11 for default_ping_set[bl460g1n6]=100: OK (0)
Oct 21 11:21:00 [7683] bl460g1n6      attrd:   notice: attrd_cib_callback: 	Update 11 for default_ping_set[bl460g1n8]=100: OK (0)
Oct 21 11:21:00 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=bl460g1n8/crmd/20, version=0.14.52)
Oct 21 11:21:00 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.51 -> 0.14.52 (S_TRANSITION_ENGINE)
Oct 21 11:21:00 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmStonith7-1_start_0 (43) confirmed on bl460g1n8 (rc=0)
Oct 21 11:21:00 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 2 (Complete=33, Pending=3, Fired=0, Skipped=15, Incomplete=1, Source=/var/lib/pacemaker/pengine/pe-input-2.bz2): In-progress
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.51
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.52 48f76c81b65000db3c65803067cc027e
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-  <cib num_updates="51">
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-    <status>
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-      <node_state id="3232261594">
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-        <lrm id="3232261594">
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-          <lrm_resources>
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-            <lrm_resource id="prmStonith7-1">
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--             <lrm_rsc_op operation_key="prmStonith7-1_monitor_0" operation="monitor" transition-key="21:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;21:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="47" rc-code="7" exec-time="0" id="prmStonith7-1_last_0"/>
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-            </lrm_resource>
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-          </lrm_resources>
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-        </lrm>
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-      </node_state>
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-    </status>
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-  </cib>
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="52" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261594" uname="bl460g1n8" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261594">
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+            <lrm_resource id="prmStonith7-1" type="external/ipmi" class="stonith">
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmStonith7-1_last_0" operation_key="prmStonith7-1_start_0" operation="start" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="43:2:0:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:0;43:2:0:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="52" rc-code="0" op-status="0" interval="0" last-run="1382322058" last-rc-change="1382322058" exec-time="2100" queue-time="0" op-dige
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+            </lrm_resource>
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:21:00 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:21:01 [7683] bl460g1n6      attrd:    debug: write_attributes: 	Skipping unchanged attribute shutdown
Oct 21 11:21:01 [7683] bl460g1n6      attrd:    debug: write_attribute: 	Update: bl460g1n6[default_ping_set]=100 (3232261592 3232261592 3232261592 bl460g1n6)
Oct 21 11:21:01 [7683] bl460g1n6      attrd:    debug: write_attribute: 	Update: bl460g1n7[default_ping_set]=100 (3232261593 3232261593 3232261593 bl460g1n7)
Oct 21 11:21:01 [7683] bl460g1n6      attrd:    debug: write_attribute: 	Update: bl460g1n8[default_ping_set]=100 (3232261594 3232261594 3232261594 bl460g1n8)
Oct 21 11:21:01 [7683] bl460g1n6      attrd:   notice: write_attribute: 	Sent update 12 with 3 changes for default_ping_set, id=<n/a>, set=(null)
Oct 21 11:21:01 [7683] bl460g1n6      attrd:    debug: write_attributes: 	Skipping unchanged attribute terminate
Oct 21 11:21:01 [7683] bl460g1n6      attrd:    debug: write_attributes: 	Skipping unchanged attribute probe_complete
Oct 21 11:21:01 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.52 -> 0.14.53 (S_TRANSITION_ENGINE)
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.52
Oct 21 11:21:01 [7685] bl460g1n6       crmd:     info: abort_transition_graph: 	te_update_diff:172 - Triggered transition abort (complete=0, node=bl460g1n7, tag=nvpair, id=status-3232261593-default_ping_set, name=default_ping_set, value=100, magic=NA, cib=0.14.53) : Transient attribute: update
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.53 9ed4ca0f900a9950d2a02867d545ae77
Oct 21 11:21:01 [7685] bl460g1n6       crmd:    debug: abort_transition_graph: 	Cause   <nvpair id="status-3232261593-default_ping_set" name="default_ping_set" value="100"/>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="52"/>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="53" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261593" uname="bl460g1n7" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:21:01 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 2 (Complete=33, Pending=3, Fired=0, Skipped=15, Incomplete=1, Source=/var/lib/pacemaker/pengine/pe-input-2.bz2): In-progress
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       <transient_attributes id="3232261593">
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         <instance_attributes id="status-3232261593">
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++           <nvpair id="status-3232261593-default_ping_set" name="default_ping_set" value="100"/>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++         </instance_attributes>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++       </transient_attributes>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:21:01 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/attrd/12, version=0.14.53)
Oct 21 11:21:01 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=bl460g1n7/crmd/19, version=0.14.54)
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.53
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.54 870506540f16390f31cc3fe7fc931ba7
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-  <cib num_updates="53">
Oct 21 11:21:01 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.53 -> 0.14.54 (S_TRANSITION_ENGINE)
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-    <status>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-      <node_state id="3232261593">
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-        <lrm id="3232261593">
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-          <lrm_resources>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-            <lrm_resource id="prmPing">
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--             <lrm_rsc_op operation_key="prmPing_monitor_0" operation="monitor" transition-key="14:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;14:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="51" rc-code="7" last-run="1382322058" last-rc-change="1382322058" exec-time="14" id="prmPing_last_0"/>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-            </lrm_resource>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-          </lrm_resources>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-        </lrm>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-      </node_state>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-    </status>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-  </cib>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="54" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:21:01 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmPing_start_0 (61) confirmed on bl460g1n7 (rc=0)
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261593" uname="bl460g1n7" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261593">
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+            <lrm_resource id="prmPing" type="ping" class="ocf" provider="pacemaker">
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmPing_last_0" operation_key="prmPing_start_0" operation="start" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="61:2:0:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:0;61:2:0:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="54" rc-code="0" op-status="0" interval="0" last-run="1382322060" last-rc-change="1382322060" exec-time="1037" queue-time="0" op-digest="bc586d1a
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+            </lrm_resource>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:21:01 [7685] bl460g1n6       crmd:    debug: te_pseudo_action: 	Pseudo action 66 fired and confirmed
Oct 21 11:21:01 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 2 (Complete=34, Pending=2, Fired=1, Skipped=15, Incomplete=0, Source=/var/lib/pacemaker/pengine/pe-input-2.bz2): In-progress
Oct 21 11:21:01 [7683] bl460g1n6      attrd:     info: attrd_cib_callback: 	Update 12 for default_ping_set: OK (0)
Oct 21 11:21:01 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 2 (Complete=35, Pending=2, Fired=0, Skipped=15, Incomplete=0, Source=/var/lib/pacemaker/pengine/pe-input-2.bz2): In-progress
Oct 21 11:21:01 [7683] bl460g1n6      attrd:   notice: attrd_cib_callback: 	Update 12 for default_ping_set[bl460g1n6]=100: OK (0)
Oct 21 11:21:01 [7683] bl460g1n6      attrd:   notice: attrd_cib_callback: 	Update 12 for default_ping_set[bl460g1n7]=100: OK (0)
Oct 21 11:21:01 [7683] bl460g1n6      attrd:   notice: attrd_cib_callback: 	Update 12 for default_ping_set[bl460g1n8]=100: OK (0)
Oct 21 11:21:01 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=bl460g1n7/crmd/20, version=0.14.55)
Oct 21 11:21:01 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.54 -> 0.14.55 (S_TRANSITION_ENGINE)
Oct 21 11:21:01 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmStonith8-1_start_0 (51) confirmed on bl460g1n7 (rc=0)
Oct 21 11:21:01 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 2 (Complete=36, Pending=1, Fired=0, Skipped=15, Incomplete=0, Source=/var/lib/pacemaker/pengine/pe-input-2.bz2): In-progress
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.54
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.55 e53c70b2ec7d625873a7ec42ef73ee57
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-  <cib num_updates="54">
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-    <status>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-      <node_state id="3232261593">
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-        <lrm id="3232261593">
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-          <lrm_resources>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-            <lrm_resource id="prmStonith8-1">
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--             <lrm_rsc_op operation_key="prmStonith8-1_monitor_0" operation="monitor" transition-key="12:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;12:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="49" rc-code="7" last-run="1382322058" last-rc-change="1382322058" exec-time="0" id="prmStonith8-1_last_0"/>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-            </lrm_resource>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-          </lrm_resources>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-        </lrm>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-      </node_state>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-    </status>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-  </cib>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="55" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261593" uname="bl460g1n7" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261593">
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+            <lrm_resource id="prmStonith8-1" type="external/ipmi" class="stonith">
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmStonith8-1_last_0" operation_key="prmStonith8-1_start_0" operation="start" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="51:2:0:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:0;51:2:0:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="53" rc-code="0" op-status="0" interval="0" last-run="1382322059" last-rc-change="1382322059" exec-time="2103" queue-time="0" op-dige
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+            </lrm_resource>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:21:01 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.55 -> 0.14.56 (S_TRANSITION_ENGINE)
Oct 21 11:21:01 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmStonith6-1_start_0 (35) confirmed on bl460g1n7 (rc=0)
Oct 21 11:21:01 [7685] bl460g1n6       crmd:   notice: run_graph: 	Transition 2 (Complete=37, Pending=0, Fired=0, Skipped=15, Incomplete=0, Source=/var/lib/pacemaker/pengine/pe-input-2.bz2): Stopped
Oct 21 11:21:01 [7685] bl460g1n6       crmd:    debug: te_graph_trigger: 	Transition 2 is now complete
Oct 21 11:21:01 [7685] bl460g1n6       crmd:    debug: notify_crmd: 	Processing transition completion in state S_TRANSITION_ENGINE
Oct 21 11:21:01 [7685] bl460g1n6       crmd:    debug: crm_timer_start: 	Started New Transition Timer (I_PE_CALC:2000ms), src=169
Oct 21 11:21:01 [7685] bl460g1n6       crmd:    debug: notify_crmd: 	Transition 2 status: restart - Transient attribute: update
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.55
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.56 1d616f5d7dc8c8e7c4fc3ef4f5579c32
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-  <cib num_updates="55">
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-    <status>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-      <node_state id="3232261593">
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-        <lrm id="3232261593">
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-          <lrm_resources>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-            <lrm_resource id="prmStonith6-1">
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--             <lrm_rsc_op operation_key="prmStonith6-1_monitor_0" operation="monitor" transition-key="8:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;8:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="42" rc-code="7" exec-time="0" id="prmStonith6-1_last_0"/>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-            </lrm_resource>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-          </lrm_resources>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-        </lrm>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-      </node_state>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-    </status>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-  </cib>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="56" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:21:01 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=bl460g1n7/crmd/21, version=0.14.56)
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261593" uname="bl460g1n7" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261593">
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+            <lrm_resource id="prmStonith6-1" type="external/ipmi" class="stonith">
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmStonith6-1_last_0" operation_key="prmStonith6-1_start_0" operation="start" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="35:2:0:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:0;35:2:0:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="52" rc-code="0" op-status="0" interval="0" last-run="1382322058" last-rc-change="1382322058" exec-time="3107" queue-time="0" op-dige
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+            </lrm_resource>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:21:01 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:21:03 [7685] bl460g1n6       crmd:     info: crm_timer_popped: 	New Transition Timer (I_PE_CALC) just popped (2000ms)
Oct 21 11:21:03 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_PE_CALC: [ state=S_TRANSITION_ENGINE cause=C_TIMER_POPPED origin=crm_timer_popped ]
Oct 21 11:21:03 [7685] bl460g1n6       crmd:     info: do_state_transition: 	State transition S_TRANSITION_ENGINE -> S_POLICY_ENGINE [ input=I_PE_CALC cause=C_TIMER_POPPED origin=crm_timer_popped ]
Oct 21 11:21:03 [7685] bl460g1n6       crmd:     info: do_state_transition: 	Progressed to state S_POLICY_ENGINE after C_TIMER_POPPED
Oct 21 11:21:03 [7685] bl460g1n6       crmd:    debug: do_state_transition: 	All 3 cluster nodes are eligible to run resources.
Oct 21 11:21:03 [7685] bl460g1n6       crmd:    debug: do_pe_invoke: 	Query 94: Requesting the current CIB: S_POLICY_ENGINE
Oct 21 11:21:03 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crmd/94, version=0.14.56)
Oct 21 11:21:03 [7685] bl460g1n6       crmd:    debug: do_pe_invoke_callback: 	Invoking the PE: query=94, ref=pe_calc-dc-1382322063-108, seq=16, quorate=1
Oct 21 11:21:03 [7684] bl460g1n6    pengine:    debug: unpack_config: 	STONITH timeout: 60000
Oct 21 11:21:03 [7684] bl460g1n6    pengine:    debug: unpack_config: 	STONITH of failed nodes is enabled
Oct 21 11:21:03 [7684] bl460g1n6    pengine:    debug: unpack_config: 	Stop all active resources: false
Oct 21 11:21:03 [7684] bl460g1n6    pengine:    debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Oct 21 11:21:03 [7684] bl460g1n6    pengine:    debug: unpack_config: 	Default stickiness: 0
Oct 21 11:21:03 [7684] bl460g1n6    pengine:    debug: unpack_config: 	On loss of CCM Quorum: Freeze resources
Oct 21 11:21:03 [7684] bl460g1n6    pengine:    debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Oct 21 11:21:03 [7684] bl460g1n6    pengine:    debug: unpack_domains: 	Unpacking domains
Oct 21 11:21:03 [7684] bl460g1n6    pengine:     info: determine_online_status_fencing: 	Node bl460g1n8 is active
Oct 21 11:21:03 [7684] bl460g1n6    pengine:     info: determine_online_status: 	Node bl460g1n8 is online
Oct 21 11:21:03 [7684] bl460g1n6    pengine:     info: determine_online_status_fencing: 	Node bl460g1n6 is active
Oct 21 11:21:03 [7684] bl460g1n6    pengine:     info: determine_online_status: 	Node bl460g1n6 is online
Oct 21 11:21:03 [7684] bl460g1n6    pengine:     info: determine_online_status_fencing: 	Node bl460g1n7 is active
Oct 21 11:21:03 [7684] bl460g1n6    pengine:     info: determine_online_status: 	Node bl460g1n7 is online
Oct 21 11:21:03 [7684] bl460g1n6    pengine:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n8 to prmPing:0
Oct 21 11:21:03 [7684] bl460g1n6    pengine:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n6 to prmPing:1
Oct 21 11:21:03 [7684] bl460g1n6    pengine:    debug: determine_op_status: 	prmVM1_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:03 [7684] bl460g1n6    pengine:     info: determine_op_status: 	Operation monitor found resource prmVM1 active on bl460g1n6
Oct 21 11:21:03 [7684] bl460g1n6    pengine:    debug: determine_op_status: 	prmVM2_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:03 [7684] bl460g1n6    pengine:     info: determine_op_status: 	Operation monitor found resource prmVM2 active on bl460g1n6
Oct 21 11:21:03 [7684] bl460g1n6    pengine:    debug: determine_op_status: 	prmVM3_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:03 [7684] bl460g1n6    pengine:     info: determine_op_status: 	Operation monitor found resource prmVM3 active on bl460g1n6
Oct 21 11:21:03 [7684] bl460g1n6    pengine:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n7 to prmPing:2
Oct 21 11:21:03 [7684] bl460g1n6    pengine:     info: native_print: 	prmVM1	(ocf::heartbeat:VirtualDomain):	Started bl460g1n6 
Oct 21 11:21:03 [7684] bl460g1n6    pengine:     info: native_print: 	prmVM2	(ocf::heartbeat:VirtualDomain):	Started bl460g1n6 
Oct 21 11:21:03 [7684] bl460g1n6    pengine:     info: native_print: 	prmVM3	(ocf::heartbeat:VirtualDomain):	Started bl460g1n6 
Oct 21 11:21:03 [7684] bl460g1n6    pengine:     info: group_print: 	 Resource Group: grpStonith6
Oct 21 11:21:03 [7684] bl460g1n6    pengine:     info: native_print: 	     prmStonith6-1	(stonith:external/ipmi):	Started bl460g1n7 
Oct 21 11:21:03 [7684] bl460g1n6    pengine:     info: native_print: 	     prmStonith6-2	(stonith:external/ssh):	Stopped 
Oct 21 11:21:03 [7684] bl460g1n6    pengine:     info: group_print: 	 Resource Group: grpStonith7
Oct 21 11:21:03 [7684] bl460g1n6    pengine:     info: native_print: 	     prmStonith7-1	(stonith:external/ipmi):	Started bl460g1n8 
Oct 21 11:21:03 [7684] bl460g1n6    pengine:     info: native_print: 	     prmStonith7-2	(stonith:external/ssh):	Stopped 
Oct 21 11:21:03 [7684] bl460g1n6    pengine:     info: group_print: 	 Resource Group: grpStonith8
Oct 21 11:21:03 [7684] bl460g1n6    pengine:     info: native_print: 	     prmStonith8-1	(stonith:external/ipmi):	Started bl460g1n7 
Oct 21 11:21:03 [7684] bl460g1n6    pengine:     info: native_print: 	     prmStonith8-2	(stonith:external/ssh):	Stopped 
Oct 21 11:21:03 [7684] bl460g1n6    pengine:     info: clone_print: 	 Clone Set: clnPing [prmPing]
Oct 21 11:21:03 [7684] bl460g1n6    pengine:    debug: native_active: 	Resource prmPing:0 active on bl460g1n8
Oct 21 11:21:03 [7684] bl460g1n6    pengine:    debug: native_active: 	Resource prmPing:0 active on bl460g1n8
Oct 21 11:21:03 [7684] bl460g1n6    pengine:    debug: native_active: 	Resource prmPing:1 active on bl460g1n6
Oct 21 11:21:03 [7684] bl460g1n6    pengine:    debug: native_active: 	Resource prmPing:1 active on bl460g1n6
Oct 21 11:21:03 [7684] bl460g1n6    pengine:    debug: native_active: 	Resource prmPing:2 active on bl460g1n7
Oct 21 11:21:03 [7684] bl460g1n6    pengine:    debug: native_active: 	Resource prmPing:2 active on bl460g1n7
Oct 21 11:21:03 [7684] bl460g1n6    pengine:     info: short_print: 	     Started: [ bl460g1n6 bl460g1n7 bl460g1n8 ]
Oct 21 11:21:03 [7684] bl460g1n6    pengine:    debug: group_rsc_location: 	Processing rsc_location lo8-rule for grpStonith8
Oct 21 11:21:03 [7684] bl460g1n6    pengine:    debug: group_rsc_location: 	Processing rsc_location lo7-rule for grpStonith7
Oct 21 11:21:03 [7684] bl460g1n6    pengine:    debug: group_rsc_location: 	Processing rsc_location lo6-rule for grpStonith6
Oct 21 11:21:03 [7684] bl460g1n6    pengine:    debug: common_apply_stickiness: 	Resource prmVM1: preferring current location (node=bl460g1n6, weight=1000000)
Oct 21 11:21:03 [7684] bl460g1n6    pengine:    debug: common_apply_stickiness: 	Resource prmVM2: preferring current location (node=bl460g1n6, weight=1000000)
Oct 21 11:21:03 [7684] bl460g1n6    pengine:    debug: common_apply_stickiness: 	Resource prmVM3: preferring current location (node=bl460g1n6, weight=1000000)
Oct 21 11:21:03 [7684] bl460g1n6    pengine:    debug: common_apply_stickiness: 	Resource prmPing:1: preferring current location (node=bl460g1n6, weight=1000000)
Oct 21 11:21:03 [7684] bl460g1n6    pengine:    debug: common_apply_stickiness: 	Resource prmStonith6-1: preferring current location (node=bl460g1n7, weight=1000000)
Oct 21 11:21:03 [7684] bl460g1n6    pengine:    debug: common_apply_stickiness: 	Resource prmStonith8-1: preferring current location (node=bl460g1n7, weight=1000000)
Oct 21 11:21:03 [7684] bl460g1n6    pengine:    debug: common_apply_stickiness: 	Resource prmPing:2: preferring current location (node=bl460g1n7, weight=1000000)
Oct 21 11:21:03 [7684] bl460g1n6    pengine:    debug: common_apply_stickiness: 	Resource prmStonith7-1: preferring current location (node=bl460g1n8, weight=1000000)
Oct 21 11:21:03 [7684] bl460g1n6    pengine:    debug: common_apply_stickiness: 	Resource prmPing:0: preferring current location (node=bl460g1n8, weight=1000000)
Oct 21 11:21:03 [7684] bl460g1n6    pengine:    debug: native_assign_node: 	Assigning bl460g1n6 to prmPing:1
Oct 21 11:21:03 [7684] bl460g1n6    pengine:    debug: native_assign_node: 	Assigning bl460g1n7 to prmPing:2
Oct 21 11:21:03 [7684] bl460g1n6    pengine:    debug: native_assign_node: 	Assigning bl460g1n8 to prmPing:0
Oct 21 11:21:03 [7684] bl460g1n6    pengine:    debug: clone_color: 	Allocated 3 clnPing instances of a possible 3
Oct 21 11:21:03 [7684] bl460g1n6    pengine:    debug: native_assign_node: 	Assigning bl460g1n6 to prmVM1
Oct 21 11:21:03 [7684] bl460g1n6    pengine:    debug: native_assign_node: 	Assigning bl460g1n6 to prmVM2
Oct 21 11:21:03 [7684] bl460g1n6    pengine:    debug: native_assign_node: 	Assigning bl460g1n6 to prmVM3
Oct 21 11:21:03 [7684] bl460g1n6    pengine:    debug: native_assign_node: 	Assigning bl460g1n7 to prmStonith6-1
Oct 21 11:21:03 [7684] bl460g1n6    pengine:    debug: native_assign_node: 	Assigning bl460g1n7 to prmStonith6-2
Oct 21 11:21:03 [7684] bl460g1n6    pengine:    debug: native_assign_node: 	Assigning bl460g1n8 to prmStonith7-1
Oct 21 11:21:03 [7684] bl460g1n6    pengine:    debug: native_assign_node: 	Assigning bl460g1n8 to prmStonith7-2
Oct 21 11:21:03 [7684] bl460g1n6    pengine:    debug: native_assign_node: 	Assigning bl460g1n7 to prmStonith8-1
Oct 21 11:21:03 [7684] bl460g1n6    pengine:    debug: native_assign_node: 	Assigning bl460g1n7 to prmStonith8-2
Oct 21 11:21:03 [7684] bl460g1n6    pengine:     info: RecurringOp: 	 Start recurring monitor (360s) for prmStonith6-1 on bl460g1n7
Oct 21 11:21:03 [7684] bl460g1n6    pengine:     info: RecurringOp: 	 Start recurring monitor (10s) for prmStonith6-2 on bl460g1n7
Oct 21 11:21:03 [7684] bl460g1n6    pengine:     info: RecurringOp: 	 Start recurring monitor (360s) for prmStonith7-1 on bl460g1n8
Oct 21 11:21:03 [7684] bl460g1n6    pengine:     info: RecurringOp: 	 Start recurring monitor (10s) for prmStonith7-2 on bl460g1n8
Oct 21 11:21:03 [7684] bl460g1n6    pengine:     info: RecurringOp: 	 Start recurring monitor (360s) for prmStonith8-1 on bl460g1n7
Oct 21 11:21:03 [7684] bl460g1n6    pengine:     info: RecurringOp: 	 Start recurring monitor (10s) for prmStonith8-2 on bl460g1n7
Oct 21 11:21:03 [7684] bl460g1n6    pengine:     info: RecurringOp: 	 Start recurring monitor (10s) for prmPing:0 on bl460g1n8
Oct 21 11:21:03 [7684] bl460g1n6    pengine:     info: RecurringOp: 	 Start recurring monitor (10s) for prmPing:1 on bl460g1n6
Oct 21 11:21:03 [7684] bl460g1n6    pengine:     info: RecurringOp: 	 Start recurring monitor (10s) for prmPing:2 on bl460g1n7
Oct 21 11:21:03 [7684] bl460g1n6    pengine:     info: LogActions: 	Leave   prmVM1	(Started bl460g1n6)
Oct 21 11:21:03 [7684] bl460g1n6    pengine:     info: LogActions: 	Leave   prmVM2	(Started bl460g1n6)
Oct 21 11:21:03 [7684] bl460g1n6    pengine:     info: LogActions: 	Leave   prmVM3	(Started bl460g1n6)
Oct 21 11:21:03 [7684] bl460g1n6    pengine:     info: LogActions: 	Leave   prmStonith6-1	(Started bl460g1n7)
Oct 21 11:21:03 [7684] bl460g1n6    pengine:   notice: LogActions: 	Start   prmStonith6-2	(bl460g1n7)
Oct 21 11:21:03 [7684] bl460g1n6    pengine:     info: LogActions: 	Leave   prmStonith7-1	(Started bl460g1n8)
Oct 21 11:21:03 [7684] bl460g1n6    pengine:   notice: LogActions: 	Start   prmStonith7-2	(bl460g1n8)
Oct 21 11:21:03 [7684] bl460g1n6    pengine:     info: LogActions: 	Leave   prmStonith8-1	(Started bl460g1n7)
Oct 21 11:21:03 [7684] bl460g1n6    pengine:   notice: LogActions: 	Start   prmStonith8-2	(bl460g1n7)
Oct 21 11:21:03 [7684] bl460g1n6    pengine:     info: LogActions: 	Leave   prmPing:0	(Started bl460g1n8)
Oct 21 11:21:03 [7684] bl460g1n6    pengine:     info: LogActions: 	Leave   prmPing:1	(Started bl460g1n6)
Oct 21 11:21:03 [7684] bl460g1n6    pengine:     info: LogActions: 	Leave   prmPing:2	(Started bl460g1n7)
Oct 21 11:21:03 [7684] bl460g1n6    pengine:    debug: action2xml: 	Using anonymous clone name prmPing for prmPing:0 (aka. prmPing)
Oct 21 11:21:03 [7684] bl460g1n6    pengine:    debug: action2xml: 	Using anonymous clone name prmPing for prmPing:1 (aka. prmPing)
Oct 21 11:21:03 [7684] bl460g1n6    pengine:    debug: action2xml: 	Using anonymous clone name prmPing for prmPing:2 (aka. prmPing)
Oct 21 11:21:03 [7684] bl460g1n6    pengine:   notice: process_pe_message: 	Calculated Transition 3: /var/lib/pacemaker/pengine/pe-input-3.bz2
Oct 21 11:21:03 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_PE_SUCCESS: [ state=S_POLICY_ENGINE cause=C_IPC_MESSAGE origin=handle_response ]
Oct 21 11:21:03 [7685] bl460g1n6       crmd:     info: do_state_transition: 	State transition S_POLICY_ENGINE -> S_TRANSITION_ENGINE [ input=I_PE_SUCCESS cause=C_IPC_MESSAGE origin=handle_response ]
Oct 21 11:21:03 [7685] bl460g1n6       crmd:    debug: unpack_graph: 	Unpacked transition 3: 20 actions in 20 synapses
Oct 21 11:21:03 [7685] bl460g1n6       crmd:     info: do_te_invoke: 	Processing graph 3 (ref=pe_calc-dc-1382322063-108) derived from /var/lib/pacemaker/pengine/pe-input-3.bz2
Oct 21 11:21:03 [7685] bl460g1n6       crmd:    debug: te_pseudo_action: 	Pseudo action 20 fired and confirmed
Oct 21 11:21:03 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 17: monitor prmStonith6-1_monitor_360000 on bl460g1n7
Oct 21 11:21:03 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 18: start prmStonith6-2_start_0 on bl460g1n7
Oct 21 11:21:03 [7685] bl460g1n6       crmd:    debug: te_pseudo_action: 	Pseudo action 29 fired and confirmed
Oct 21 11:21:03 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 26: monitor prmStonith7-1_monitor_360000 on bl460g1n8
Oct 21 11:21:03 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 27: start prmStonith7-2_start_0 on bl460g1n8
Oct 21 11:21:03 [7685] bl460g1n6       crmd:    debug: te_pseudo_action: 	Pseudo action 38 fired and confirmed
Oct 21 11:21:03 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 35: monitor prmStonith8-1_monitor_360000 on bl460g1n7
Oct 21 11:21:03 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 36: start prmStonith8-2_start_0 on bl460g1n7
Oct 21 11:21:03 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 44: monitor prmPing_monitor_10000 on bl460g1n8
Oct 21 11:21:03 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 47: monitor prmPing_monitor_10000 on bl460g1n6 (local)
Oct 21 11:21:03 [7685] bl460g1n6       crmd:     info: do_lrm_rsc_op: 	Performing key=47:3:0:38db9a68-056c-4e65-8658-75f0c3cc91e5 op=prmPing_monitor_10000
Oct 21 11:21:03 [7682] bl460g1n6       lrmd:    debug: process_lrmd_message: 	Processed lrmd_rsc_exec operation from 666c66a8-da92-4298-beea-12fd671d2b0d: rc=47, reply=1, notify=0, exit=4201864
Oct 21 11:21:03 [7682] bl460g1n6       lrmd:    debug: log_execute: 	executing - rsc:prmPing action:monitor call_id:47
Oct 21 11:21:03 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 50: monitor prmPing_monitor_10000 on bl460g1n7
Oct 21 11:21:03 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 8: probe_complete probe_complete on bl460g1n8 - no waiting
Oct 21 11:21:03 [7685] bl460g1n6       crmd:     info: te_rsc_command: 	Action 8 confirmed - no wait
Oct 21 11:21:03 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 7: probe_complete probe_complete on bl460g1n7 - no waiting
Oct 21 11:21:03 [7685] bl460g1n6       crmd:     info: te_rsc_command: 	Action 7 confirmed - no wait
Oct 21 11:21:03 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 3 (Complete=0, Pending=9, Fired=14, Skipped=0, Incomplete=6, Source=/var/lib/pacemaker/pengine/pe-input-3.bz2): In-progress
Oct 21 11:21:03 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 3 (Complete=5, Pending=9, Fired=0, Skipped=0, Incomplete=6, Source=/var/lib/pacemaker/pengine/pe-input-3.bz2): In-progress
Oct 21 11:21:04 [8183] bl460g1n6 attrd_updater:     info: attrd_update_delegate: 	Connecting to cluster... 5 retries remaining
Oct 21 11:21:04 [7683] bl460g1n6      attrd:     info: crm_client_new: 	Connecting 0x14f9260 for uid=0 gid=0 pid=8183 id=f08e8248-29ef-4230-8f79-3ec0514cf24e
Oct 21 11:21:04 [7683] bl460g1n6      attrd:    debug: handle_new_connection: 	IPC credentials authenticated (7683-8183-10)
Oct 21 11:21:04 [7683] bl460g1n6      attrd:    debug: qb_ipcs_shm_connect: 	connecting to client [8183]
Oct 21 11:21:04 [7683] bl460g1n6      attrd:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:04 [7683] bl460g1n6      attrd:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:04 [7683] bl460g1n6      attrd:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:04 [8183] bl460g1n6 attrd_updater:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:04 [8183] bl460g1n6 attrd_updater:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:04 [8183] bl460g1n6 attrd_updater:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:04 [8183] bl460g1n6 attrd_updater:    debug: attrd_update_delegate: 	Sent update: default_ping_set=100 for localhost
Oct 21 11:21:04 [7683] bl460g1n6      attrd:     info: attrd_client_message: 	Broadcasting default_ping_set[bl460g1n6] = 100 (writer)
Oct 21 11:21:04 [8183] bl460g1n6 attrd_updater:     info: crm_xml_cleanup: 	Cleaning up memory from libxml2
Oct 21 11:21:04 [7683] bl460g1n6      attrd:    debug: qb_ipcs_dispatch_connection_request: 	HUP conn (7683-8183-10)
Oct 21 11:21:04 [7683] bl460g1n6      attrd:    debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(7683-8183-10) state:2
Oct 21 11:21:04 [7683] bl460g1n6      attrd:     info: crm_client_destroy: 	Destroying 0 events
Oct 21 11:21:04 [7683] bl460g1n6      attrd:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-attrd-response-7683-8183-10-header
Oct 21 11:21:04 [7683] bl460g1n6      attrd:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-attrd-event-7683-8183-10-header
Oct 21 11:21:04 [7683] bl460g1n6      attrd:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-attrd-request-7683-8183-10-header
Oct 21 11:21:04 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmPing_monitor_10000:8146 - exited with rc=0
Oct 21 11:21:04 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmPing_monitor_10000:8146:stderr [ -- empty -- ]
Oct 21 11:21:04 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmPing_monitor_10000:8146:stdout [ -- empty -- ]
Oct 21 11:21:04 [7682] bl460g1n6       lrmd:    debug: log_finished: 	finished - rsc:prmPing action:monitor call_id:47 pid:8146 exit-code:0 exec-time:1048ms queue-time:1ms
Oct 21 11:21:04 [7685] bl460g1n6       crmd:    debug: create_operation_update: 	do_update_resource: Updating resource prmPing after monitor op complete (interval=10000)
Oct 21 11:21:04 [7685] bl460g1n6       crmd:   notice: process_lrm_event: 	LRM operation prmPing_monitor_10000 (call=47, rc=0, cib-update=95, confirmed=false) ok
Oct 21 11:21:04 [7685] bl460g1n6       crmd:    debug: update_history_cache: 	Updating history for 'prmPing' with monitor op
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.56
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.57 a472b657d949e8bbac6e030887f089c3
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="56"/>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="57" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261592" uname="bl460g1n6" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261592">
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+            <lrm_resource id="prmPing" type="ping" class="ocf" provider="pacemaker">
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmPing_monitor_10000" operation_key="prmPing_monitor_10000" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="47:3:0:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:0;47:3:0:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="47" rc-code="0" op-status="0" interval="10000" last-rc-change="1382322063" exec-time="1048" queue-time="1" op-digest="fa6682493dd
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+            </lrm_resource>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:21:04 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/crmd/95, version=0.14.57)
Oct 21 11:21:04 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.56 -> 0.14.57 (S_TRANSITION_ENGINE)
Oct 21 11:21:04 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmPing_monitor_10000 (47) confirmed on bl460g1n6 (rc=0)
Oct 21 11:21:04 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 3 (Complete=6, Pending=8, Fired=0, Skipped=0, Incomplete=6, Source=/var/lib/pacemaker/pengine/pe-input-3.bz2): In-progress
Oct 21 11:21:04 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.57 -> 0.14.58 (S_TRANSITION_ENGINE)
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.57
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.58 67f2cb1587555165eefb76bc089c2bc8
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="57"/>
Oct 21 11:21:04 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmPing_monitor_10000 (44) confirmed on bl460g1n8 (rc=0)
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="58" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:21:04 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 3 (Complete=7, Pending=7, Fired=0, Skipped=0, Incomplete=6, Source=/var/lib/pacemaker/pengine/pe-input-3.bz2): In-progress
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261594" uname="bl460g1n8" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261594">
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+            <lrm_resource id="prmPing" type="ping" class="ocf" provider="pacemaker">
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmPing_monitor_10000" operation_key="prmPing_monitor_10000" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="44:3:0:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:0;44:3:0:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="56" rc-code="0" op-status="0" interval="10000" last-rc-change="1382322063" exec-time="1033" queue-time="0" op-digest="fa6682493dd
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+            </lrm_resource>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:21:04 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=bl460g1n8/crmd/21, version=0.14.58)
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.58
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.59 d782b9fd402ebd846a35eee57c810e1e
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-  <cib num_updates="58">
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-    <status>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-      <node_state id="3232261594">
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-        <lrm id="3232261594">
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-          <lrm_resources>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-            <lrm_resource id="prmStonith7-2">
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--             <lrm_rsc_op operation_key="prmStonith7-2_monitor_0" operation="monitor" transition-key="22:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;22:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="48" rc-code="7" last-run="1382322058" last-rc-change="1382322058" exec-time="0" id="prmStonith7-2_last_0"/>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-            </lrm_resource>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-          </lrm_resources>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-        </lrm>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-      </node_state>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-    </status>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-  </cib>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="59" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261594" uname="bl460g1n8" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261594">
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+            <lrm_resource id="prmStonith7-2" type="external/ssh" class="stonith">
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmStonith7-2_last_0" operation_key="prmStonith7-2_start_0" operation="start" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="27:3:0:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:0;27:3:0:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="55" rc-code="0" op-status="0" interval="0" last-run="1382322063" last-rc-change="1382322063" exec-time="1065" queue-time="0" op-dige
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+            </lrm_resource>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:21:04 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.58 -> 0.14.59 (S_TRANSITION_ENGINE)
Oct 21 11:21:04 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmStonith7-2_start_0 (27) confirmed on bl460g1n8 (rc=0)
Oct 21 11:21:04 [7685] bl460g1n6       crmd:    debug: te_pseudo_action: 	Pseudo action 30 fired and confirmed
Oct 21 11:21:04 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 28: monitor prmStonith7-2_monitor_10000 on bl460g1n8
Oct 21 11:21:04 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=bl460g1n8/crmd/22, version=0.14.59)
Oct 21 11:21:04 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 3 (Complete=8, Pending=7, Fired=2, Skipped=0, Incomplete=4, Source=/var/lib/pacemaker/pengine/pe-input-3.bz2): In-progress
Oct 21 11:21:04 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 3 (Complete=9, Pending=7, Fired=0, Skipped=0, Incomplete=4, Source=/var/lib/pacemaker/pengine/pe-input-3.bz2): In-progress
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.59
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.60 64b7850962709e92081b67b841bb945c
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="59"/>
Oct 21 11:21:04 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.59 -> 0.14.60 (S_TRANSITION_ENGINE)
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="60" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261594" uname="bl460g1n8" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261594">
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+            <lrm_resource id="prmStonith7-1" type="external/ipmi" class="stonith">
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmStonith7-1_monitor_360000" operation_key="prmStonith7-1_monitor_360000" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="26:3:0:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:0;26:3:0:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="54" rc-code="0" op-status="0" interval="360000" last-rc-change="1382322063" exec-time="1088" queue-time="0" op-dige
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+            </lrm_resource>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:21:04 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmStonith7-1_monitor_360000 (26) confirmed on bl460g1n8 (rc=0)
Oct 21 11:21:04 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 3 (Complete=10, Pending=6, Fired=0, Skipped=0, Incomplete=4, Source=/var/lib/pacemaker/pengine/pe-input-3.bz2): In-progress
Oct 21 11:21:04 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=bl460g1n8/crmd/23, version=0.14.60)
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.60
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.61 7a4109ff307a94b8c433b31d128eaa69
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-  <cib num_updates="60">
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-    <status>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-      <node_state id="3232261593">
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-        <lrm id="3232261593">
Oct 21 11:21:04 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.60 -> 0.14.61 (S_TRANSITION_ENGINE)
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-          <lrm_resources>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-            <lrm_resource id="prmStonith6-2">
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--             <lrm_rsc_op operation_key="prmStonith6-2_monitor_0" operation="monitor" transition-key="9:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;9:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="46" rc-code="7" last-run="1382322058" last-rc-change="1382322058" exec-time="0" id="prmStonith6-2_last_0"/>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-            </lrm_resource>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-          </lrm_resources>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-        </lrm>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-      </node_state>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-    </status>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-  </cib>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="61" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261593" uname="bl460g1n7" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261593">
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+            <lrm_resource id="prmStonith6-2" type="external/ssh" class="stonith">
Oct 21 11:21:04 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmStonith6-2_start_0 (18) confirmed on bl460g1n7 (rc=0)
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmStonith6-2_last_0" operation_key="prmStonith6-2_start_0" operation="start" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="18:3:0:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:0;18:3:0:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="56" rc-code="0" op-status="0" interval="0" last-run="1382322063" last-rc-change="1382322063" exec-time="1064" queue-time="0" op-dige
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+            </lrm_resource>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:21:04 [7685] bl460g1n6       crmd:    debug: te_pseudo_action: 	Pseudo action 21 fired and confirmed
Oct 21 11:21:04 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 19: monitor prmStonith6-2_monitor_10000 on bl460g1n7
Oct 21 11:21:04 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 3 (Complete=11, Pending=6, Fired=2, Skipped=0, Incomplete=2, Source=/var/lib/pacemaker/pengine/pe-input-3.bz2): In-progress
Oct 21 11:21:04 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 3 (Complete=12, Pending=6, Fired=0, Skipped=0, Incomplete=2, Source=/var/lib/pacemaker/pengine/pe-input-3.bz2): In-progress
Oct 21 11:21:04 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=bl460g1n7/crmd/22, version=0.14.61)
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.61
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.62 4cd6b76892e32ec2c68182f36c46f40e
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="61"/>
Oct 21 11:21:04 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.61 -> 0.14.62 (S_TRANSITION_ENGINE)
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="62" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261593" uname="bl460g1n7" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261593">
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+            <lrm_resource id="prmPing" type="ping" class="ocf" provider="pacemaker">
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmPing_monitor_10000" operation_key="prmPing_monitor_10000" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="50:3:0:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:0;50:3:0:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="59" rc-code="0" op-status="0" interval="10000" last-rc-change="1382322063" exec-time="1032" queue-time="0" op-digest="fa6682493dd
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+            </lrm_resource>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:21:04 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmPing_monitor_10000 (50) confirmed on bl460g1n7 (rc=0)
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:21:04 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 3 (Complete=13, Pending=5, Fired=0, Skipped=0, Incomplete=2, Source=/var/lib/pacemaker/pengine/pe-input-3.bz2): In-progress
Oct 21 11:21:04 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=bl460g1n7/crmd/23, version=0.14.62)
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.62
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.63 2adafa447f74514d67cd3ed7dd6db165
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="62"/>
Oct 21 11:21:04 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.62 -> 0.14.63 (S_TRANSITION_ENGINE)
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="63" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261593" uname="bl460g1n7" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261593">
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+            <lrm_resource id="prmStonith6-1" type="external/ipmi" class="stonith">
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmStonith6-1_monitor_360000" operation_key="prmStonith6-1_monitor_360000" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="17:3:0:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:0;17:3:0:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="55" rc-code="0" op-status="0" interval="360000" last-rc-change="1382322063" exec-time="1090" queue-time="9" op-dige
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+            </lrm_resource>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:21:04 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmStonith6-1_monitor_360000 (17) confirmed on bl460g1n7 (rc=0)
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:21:04 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 3 (Complete=14, Pending=4, Fired=0, Skipped=0, Incomplete=2, Source=/var/lib/pacemaker/pengine/pe-input-3.bz2): In-progress
Oct 21 11:21:04 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=bl460g1n7/crmd/24, version=0.14.63)
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.63
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.64 e902b0e398cc6db460412a0d58d5d467
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-  <cib num_updates="63">
Oct 21 11:21:04 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.63 -> 0.14.64 (S_TRANSITION_ENGINE)
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-    <status>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-      <node_state id="3232261593">
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-        <lrm id="3232261593">
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-          <lrm_resources>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-            <lrm_resource id="prmStonith8-2">
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	--             <lrm_rsc_op operation_key="prmStonith8-2_monitor_0" operation="monitor" transition-key="13:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:7;13:2:7:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="50" rc-code="7" last-run="1382322058" last-rc-change="1382322058" exec-time="0" id="prmStonith8-2_last_0"/>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-            </lrm_resource>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-          </lrm_resources>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-        </lrm>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-      </node_state>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-    </status>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-  </cib>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="64" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:21:04 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmStonith8-2_start_0 (36) confirmed on bl460g1n7 (rc=0)
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261593" uname="bl460g1n7" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261593">
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+            <lrm_resource id="prmStonith8-2" type="external/ssh" class="stonith">
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmStonith8-2_last_0" operation_key="prmStonith8-2_start_0" operation="start" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="36:3:0:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:0;36:3:0:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="58" rc-code="0" op-status="0" interval="0" last-run="1382322063" last-rc-change="1382322063" exec-time="1066" queue-time="0" op-dige
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+            </lrm_resource>
Oct 21 11:21:04 [7685] bl460g1n6       crmd:    debug: te_pseudo_action: 	Pseudo action 39 fired and confirmed
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:21:04 [7685] bl460g1n6       crmd:   notice: te_rsc_command: 	Initiating action 37: monitor prmStonith8-2_monitor_10000 on bl460g1n7
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:21:04 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 3 (Complete=15, Pending=4, Fired=2, Skipped=0, Incomplete=0, Source=/var/lib/pacemaker/pengine/pe-input-3.bz2): In-progress
Oct 21 11:21:04 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 3 (Complete=16, Pending=4, Fired=0, Skipped=0, Incomplete=0, Source=/var/lib/pacemaker/pengine/pe-input-3.bz2): In-progress
Oct 21 11:21:04 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=bl460g1n7/crmd/25, version=0.14.64)
Oct 21 11:21:04 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=bl460g1n7/crmd/26, version=0.14.65)
Oct 21 11:21:04 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.64 -> 0.14.65 (S_TRANSITION_ENGINE)
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.64
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.65 bc4ceee2a9c627616b85565ca7b53391
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="64"/>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="65" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261593" uname="bl460g1n7" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261593">
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+            <lrm_resource id="prmStonith8-1" type="external/ipmi" class="stonith">
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmStonith8-1_monitor_360000" operation_key="prmStonith8-1_monitor_360000" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="35:3:0:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:0;35:3:0:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="57" rc-code="0" op-status="0" interval="360000" last-rc-change="1382322063" exec-time="1112" queue-time="0" op-dige
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+            </lrm_resource>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:21:04 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmStonith8-1_monitor_360000 (35) confirmed on bl460g1n7 (rc=0)
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:21:04 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:21:04 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 3 (Complete=17, Pending=3, Fired=0, Skipped=0, Incomplete=0, Source=/var/lib/pacemaker/pengine/pe-input-3.bz2): In-progress
Oct 21 11:21:05 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.65
Oct 21 11:21:05 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.66 b97aa5f934add0fa92fe1a051c9ce6d6
Oct 21 11:21:05 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="65"/>
Oct 21 11:21:05 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="66" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:21:05 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:21:05 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261594" uname="bl460g1n8" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:21:05 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261594">
Oct 21 11:21:05 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:21:05 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+            <lrm_resource id="prmStonith7-2" type="external/ssh" class="stonith">
Oct 21 11:21:05 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmStonith7-2_monitor_10000" operation_key="prmStonith7-2_monitor_10000" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="28:3:0:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:0;28:3:0:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="57" rc-code="0" op-status="0" interval="10000" last-rc-change="1382322064" exec-time="1039" queue-time="0" op-digest=
Oct 21 11:21:05 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+            </lrm_resource>
Oct 21 11:21:05 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:21:05 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:21:05 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:21:05 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:21:05 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:21:05 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=bl460g1n8/crmd/24, version=0.14.66)
Oct 21 11:21:05 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.65 -> 0.14.66 (S_TRANSITION_ENGINE)
Oct 21 11:21:05 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmStonith7-2_monitor_10000 (28) confirmed on bl460g1n8 (rc=0)
Oct 21 11:21:05 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 3 (Complete=18, Pending=2, Fired=0, Skipped=0, Incomplete=0, Source=/var/lib/pacemaker/pengine/pe-input-3.bz2): In-progress
Oct 21 11:21:05 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.66 -> 0.14.67 (S_TRANSITION_ENGINE)
Oct 21 11:21:05 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmStonith6-2_monitor_10000 (19) confirmed on bl460g1n7 (rc=0)
Oct 21 11:21:05 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.66
Oct 21 11:21:05 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.67 eab786d818f9f71114710b4219162247
Oct 21 11:21:05 [7685] bl460g1n6       crmd:    debug: run_graph: 	Transition 3 (Complete=19, Pending=1, Fired=0, Skipped=0, Incomplete=0, Source=/var/lib/pacemaker/pengine/pe-input-3.bz2): In-progress
Oct 21 11:21:05 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="66"/>
Oct 21 11:21:05 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="67" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:21:05 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:21:05 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261593" uname="bl460g1n7" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:21:05 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261593">
Oct 21 11:21:05 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:21:05 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+            <lrm_resource id="prmStonith6-2" type="external/ssh" class="stonith">
Oct 21 11:21:05 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmStonith6-2_monitor_10000" operation_key="prmStonith6-2_monitor_10000" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="19:3:0:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:0;19:3:0:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="60" rc-code="0" op-status="0" interval="10000" last-rc-change="1382322064" exec-time="1042" queue-time="0" op-digest=
Oct 21 11:21:05 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+            </lrm_resource>
Oct 21 11:21:05 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:21:05 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:21:05 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:21:05 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:21:05 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:21:05 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=bl460g1n7/crmd/27, version=0.14.67)
Oct 21 11:21:05 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: --- 0.14.67
Oct 21 11:21:05 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	Diff: +++ 0.14.68 e1249ea2ef95bfe0cc8ff158d1e42de9
Oct 21 11:21:05 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	-- <cib num_updates="67"/>
Oct 21 11:21:05 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  <cib epoch="14" num_updates="68" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.7" cib-last-written="Mon Oct 21 11:20:21 2013" update-origin="bl460g1n7" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Oct 21 11:21:05 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    <status>
Oct 21 11:21:05 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      <node_state id="3232261593" uname="bl460g1n7" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Oct 21 11:21:05 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        <lrm id="3232261593">
Oct 21 11:21:05 [7685] bl460g1n6       crmd:    debug: te_update_diff: 	Processing diff (cib_modify): 0.14.67 -> 0.14.68 (S_TRANSITION_ENGINE)
Oct 21 11:21:05 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          <lrm_resources>
Oct 21 11:21:05 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+            <lrm_resource id="prmStonith8-2" type="external/ssh" class="stonith">
Oct 21 11:21:05 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	++             <lrm_rsc_op id="prmStonith8-2_monitor_10000" operation_key="prmStonith8-2_monitor_10000" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.7" transition-key="37:3:0:38db9a68-056c-4e65-8658-75f0c3cc91e5" transition-magic="0:0;37:3:0:38db9a68-056c-4e65-8658-75f0c3cc91e5" call-id="61" rc-code="0" op-status="0" interval="10000" last-rc-change="1382322064" exec-time="1041" queue-time="0" op-digest=
Oct 21 11:21:05 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+            </lrm_resource>
Oct 21 11:21:05 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+          </lrm_resources>
Oct 21 11:21:05 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+        </lrm>
Oct 21 11:21:05 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+      </node_state>
Oct 21 11:21:05 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+    </status>
Oct 21 11:21:05 [7681] bl460g1n6 stonith-ng:    debug: Config update: 	+  </cib>
Oct 21 11:21:05 [7685] bl460g1n6       crmd:     info: match_graph_event: 	Action prmStonith8-2_monitor_10000 (37) confirmed on bl460g1n7 (rc=0)
Oct 21 11:21:05 [7685] bl460g1n6       crmd:   notice: run_graph: 	Transition 3 (Complete=20, Pending=0, Fired=0, Skipped=0, Incomplete=0, Source=/var/lib/pacemaker/pengine/pe-input-3.bz2): Complete
Oct 21 11:21:05 [7685] bl460g1n6       crmd:    debug: te_graph_trigger: 	Transition 3 is now complete
Oct 21 11:21:05 [7685] bl460g1n6       crmd:    debug: notify_crmd: 	Processing transition completion in state S_TRANSITION_ENGINE
Oct 21 11:21:05 [7685] bl460g1n6       crmd:    debug: notify_crmd: 	Transition 3 status: done - <null>
Oct 21 11:21:05 [7685] bl460g1n6       crmd:    debug: s_crmd_fsa: 	Processing I_TE_SUCCESS: [ state=S_TRANSITION_ENGINE cause=C_FSA_INTERNAL origin=notify_crmd ]
Oct 21 11:21:05 [7685] bl460g1n6       crmd:     info: do_log: 	FSA: Input I_TE_SUCCESS from notify_crmd() received in state S_TRANSITION_ENGINE
Oct 21 11:21:05 [7685] bl460g1n6       crmd:   notice: do_state_transition: 	State transition S_TRANSITION_ENGINE -> S_IDLE [ input=I_TE_SUCCESS cause=C_FSA_INTERNAL origin=notify_crmd ]
Oct 21 11:21:05 [7685] bl460g1n6       crmd:    debug: do_state_transition: 	Starting PEngine Recheck Timer
Oct 21 11:21:05 [7685] bl460g1n6       crmd:    debug: crm_timer_start: 	Started PEngine Recheck Timer (I_PE_CALC:900000ms), src=184
Oct 21 11:21:05 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=bl460g1n7/crmd/28, version=0.14.68)
Oct 21 11:21:08 [7682] bl460g1n6       lrmd:    debug: recurring_action_timer: 	Scheduling another invokation of prmVM2_monitor_10000
Oct 21 11:21:08 [7682] bl460g1n6       lrmd:    debug: recurring_action_timer: 	Scheduling another invokation of prmVM1_monitor_10000
Oct 21 11:21:08 [7682] bl460g1n6       lrmd:    debug: recurring_action_timer: 	Scheduling another invokation of prmVM3_monitor_10000
VirtualDomain(prmVM2)[8185]:	2013/10/21_11:21:08 DEBUG: Virtual domain vm2 is currently running.
VirtualDomain(prmVM1)[8192]:	2013/10/21_11:21:08 DEBUG: Virtual domain vm1 is currently running.
VirtualDomain(prmVM3)[8194]:	2013/10/21_11:21:08 DEBUG: Virtual domain vm3 is currently running.
Oct 21 11:21:08 [7680] bl460g1n6        cib:     info: crm_client_new: 	Connecting 0xbde1f0 for uid=0 gid=0 pid=8278 id=84fab12f-c463-4189-ab24-7c2d86f5c413
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: handle_new_connection: 	IPC credentials authenticated (7680-8278-14)
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_ipcs_shm_connect: 	connecting to client [8278]
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:08 [8278] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:08 [8278] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:08 [8278] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:08 [8278] bl460g1n6 crm_resource:    debug: cib_native_signon_raw: 	Connection to CIB successful
Oct 21 11:21:08 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crm_resource/2, version=0.14.68)
Oct 21 11:21:08 [8278] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH timeout: 60000
Oct 21 11:21:08 [8278] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH of failed nodes is enabled
Oct 21 11:21:08 [8278] bl460g1n6 crm_resource:    debug: unpack_config: 	Stop all active resources: false
Oct 21 11:21:08 [8278] bl460g1n6 crm_resource:    debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Oct 21 11:21:08 [8278] bl460g1n6 crm_resource:    debug: unpack_config: 	Default stickiness: 0
Oct 21 11:21:08 [8278] bl460g1n6 crm_resource:    debug: unpack_config: 	On loss of CCM Quorum: Freeze resources
Oct 21 11:21:08 [8278] bl460g1n6 crm_resource:    debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Oct 21 11:21:08 [8278] bl460g1n6 crm_resource:  warning: handle_startup_fencing: 	Blind faith: not fencing unseen nodes
Oct 21 11:21:08 [8278] bl460g1n6 crm_resource:    debug: unpack_domains: 	Unpacking domains
Oct 21 11:21:08 [8278] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n8 is active
Oct 21 11:21:08 [8278] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n8 is online
Oct 21 11:21:08 [8278] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n6 is active
Oct 21 11:21:08 [8278] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n6 is online
Oct 21 11:21:08 [8278] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n7 is active
Oct 21 11:21:08 [8278] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n7 is online
Oct 21 11:21:08 [8278] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n8 to prmPing:0
Oct 21 11:21:08 [8278] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n6 to prmPing:1
Oct 21 11:21:08 [8278] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM1_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:08 [8278] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM1 active on bl460g1n6
Oct 21 11:21:08 [8278] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM2_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:08 [8278] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM2 active on bl460g1n6
Oct 21 11:21:08 [8278] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM3_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:08 [8278] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM3 active on bl460g1n6
Oct 21 11:21:08 [8278] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n7 to prmPing:2
Oct 21 11:21:08 [8278] bl460g1n6 crm_resource:    debug: dump_resource_attr: 	Looking up cpu in prmVM2
Oct 21 11:21:08 [8278] bl460g1n6 crm_resource:    debug: cib_native_signoff: 	Signing out of the CIB Service
Oct 21 11:21:08 [8278] bl460g1n6 crm_resource:    debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Oct 21 11:21:08 [8278] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-request-7680-8278-14-header
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_ipcs_dispatch_connection_request: 	HUP conn (7680-8278-14)
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(7680-8278-14) state:2
Oct 21 11:21:08 [8278] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-response-7680-8278-14-header
Oct 21 11:21:08 [8278] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-event-7680-8278-14-header
Oct 21 11:21:08 [7680] bl460g1n6        cib:     info: crm_client_destroy: 	Destroying 0 events
Oct 21 11:21:08 [8278] bl460g1n6 crm_resource:     info: crm_xml_cleanup: 	Cleaning up memory from libxml2
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-7680-8278-14-header
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-7680-8278-14-header
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-7680-8278-14-header
Oct 21 11:21:08 [7680] bl460g1n6        cib:     info: crm_client_new: 	Connecting 0xbde1f0 for uid=0 gid=0 pid=8281 id=ed857302-7a88-4ab0-8e23-8ea0eb4f45fa
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: handle_new_connection: 	IPC credentials authenticated (7680-8281-14)
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_ipcs_shm_connect: 	connecting to client [8281]
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:08 [8281] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:08 [8281] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:08 [8281] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:08 [8281] bl460g1n6 crm_resource:    debug: cib_native_signon_raw: 	Connection to CIB successful
Oct 21 11:21:08 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crm_resource/2, version=0.14.68)
Oct 21 11:21:08 [7680] bl460g1n6        cib:     info: crm_client_new: 	Connecting 0xc786d0 for uid=0 gid=0 pid=8286 id=f50cae61-433c-4b10-9ae9-66d9fa64c6af
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: handle_new_connection: 	IPC credentials authenticated (7680-8286-15)
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_ipcs_shm_connect: 	connecting to client [8286]
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:08 [8281] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH timeout: 60000
Oct 21 11:21:08 [8281] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH of failed nodes is enabled
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:08 [8281] bl460g1n6 crm_resource:    debug: unpack_config: 	Stop all active resources: false
Oct 21 11:21:08 [8281] bl460g1n6 crm_resource:    debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Oct 21 11:21:08 [8281] bl460g1n6 crm_resource:    debug: unpack_config: 	Default stickiness: 0
Oct 21 11:21:08 [8281] bl460g1n6 crm_resource:    debug: unpack_config: 	On loss of CCM Quorum: Freeze resources
Oct 21 11:21:08 [8281] bl460g1n6 crm_resource:    debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Oct 21 11:21:08 [8281] bl460g1n6 crm_resource:  warning: handle_startup_fencing: 	Blind faith: not fencing unseen nodes
Oct 21 11:21:08 [8281] bl460g1n6 crm_resource:    debug: unpack_domains: 	Unpacking domains
Oct 21 11:21:08 [8286] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:08 [8286] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:08 [8286] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:08 [8281] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n8 is active
Oct 21 11:21:08 [8281] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n8 is online
Oct 21 11:21:08 [8281] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n6 is active
Oct 21 11:21:08 [8281] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n6 is online
Oct 21 11:21:08 [8281] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n7 is active
Oct 21 11:21:08 [8281] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n7 is online
Oct 21 11:21:08 [8286] bl460g1n6 crm_resource:    debug: cib_native_signon_raw: 	Connection to CIB successful
Oct 21 11:21:08 [8281] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n8 to prmPing:0
Oct 21 11:21:08 [8281] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n6 to prmPing:1
Oct 21 11:21:08 [8281] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM1_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:08 [8281] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM1 active on bl460g1n6
Oct 21 11:21:08 [8281] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM2_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:08 [8281] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM2 active on bl460g1n6
Oct 21 11:21:08 [8281] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM3_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:08 [8281] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM3 active on bl460g1n6
Oct 21 11:21:08 [8281] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n7 to prmPing:2
Oct 21 11:21:08 [8281] bl460g1n6 crm_resource:    debug: dump_resource_attr: 	Looking up cpu in prmVM1
Oct 21 11:21:08 [8281] bl460g1n6 crm_resource:    debug: cib_native_signoff: 	Signing out of the CIB Service
Oct 21 11:21:08 [8281] bl460g1n6 crm_resource:    debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Oct 21 11:21:08 [8281] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-request-7680-8281-14-header
Oct 21 11:21:08 [8281] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-response-7680-8281-14-header
Oct 21 11:21:08 [8281] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-event-7680-8281-14-header
Oct 21 11:21:08 [8281] bl460g1n6 crm_resource:     info: crm_xml_cleanup: 	Cleaning up memory from libxml2
Oct 21 11:21:08 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crm_resource/2, version=0.14.68)
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_ipcs_dispatch_connection_request: 	HUP conn (7680-8281-14)
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(7680-8281-14) state:2
Oct 21 11:21:08 [7680] bl460g1n6        cib:     info: crm_client_destroy: 	Destroying 0 events
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-7680-8281-14-header
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-7680-8281-14-header
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-7680-8281-14-header
Oct 21 11:21:08 [8286] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH timeout: 60000
Oct 21 11:21:08 [8286] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH of failed nodes is enabled
Oct 21 11:21:08 [8286] bl460g1n6 crm_resource:    debug: unpack_config: 	Stop all active resources: false
Oct 21 11:21:08 [8286] bl460g1n6 crm_resource:    debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Oct 21 11:21:08 [8286] bl460g1n6 crm_resource:    debug: unpack_config: 	Default stickiness: 0
Oct 21 11:21:08 [8286] bl460g1n6 crm_resource:    debug: unpack_config: 	On loss of CCM Quorum: Freeze resources
Oct 21 11:21:08 [8286] bl460g1n6 crm_resource:    debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Oct 21 11:21:08 [8286] bl460g1n6 crm_resource:  warning: handle_startup_fencing: 	Blind faith: not fencing unseen nodes
Oct 21 11:21:08 [8286] bl460g1n6 crm_resource:    debug: unpack_domains: 	Unpacking domains
Oct 21 11:21:08 [8286] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n8 is active
Oct 21 11:21:08 [8286] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n8 is online
Oct 21 11:21:08 [8286] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n6 is active
Oct 21 11:21:08 [8286] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n6 is online
Oct 21 11:21:08 [8286] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n7 is active
Oct 21 11:21:08 [8286] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n7 is online
Oct 21 11:21:08 [8286] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n8 to prmPing:0
Oct 21 11:21:08 [8286] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n6 to prmPing:1
Oct 21 11:21:08 [8286] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM1_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:08 [8286] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM1 active on bl460g1n6
Oct 21 11:21:08 [8286] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM2_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:08 [8286] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM2 active on bl460g1n6
Oct 21 11:21:08 [8286] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM3_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:08 [8286] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM3 active on bl460g1n6
Oct 21 11:21:08 [8286] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n7 to prmPing:2
Oct 21 11:21:08 [8286] bl460g1n6 crm_resource:    debug: dump_resource_attr: 	Looking up cpu in prmVM3
Oct 21 11:21:08 [8286] bl460g1n6 crm_resource:    debug: cib_native_signoff: 	Signing out of the CIB Service
Oct 21 11:21:08 [8286] bl460g1n6 crm_resource:    debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_ipcs_dispatch_connection_request: 	HUP conn (7680-8286-15)
Oct 21 11:21:08 [8286] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-request-7680-8286-15-header
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(7680-8286-15) state:2
Oct 21 11:21:08 [8286] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-response-7680-8286-15-header
Oct 21 11:21:08 [8286] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-event-7680-8286-15-header
Oct 21 11:21:08 [7680] bl460g1n6        cib:     info: crm_client_destroy: 	Destroying 0 events
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-7680-8286-15-header
Oct 21 11:21:08 [8286] bl460g1n6 crm_resource:     info: crm_xml_cleanup: 	Cleaning up memory from libxml2
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-7680-8286-15-header
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-7680-8286-15-header
Oct 21 11:21:08 [7680] bl460g1n6        cib:     info: crm_client_new: 	Connecting 0xbde1f0 for uid=0 gid=0 pid=8296 id=d6215469-b29a-4596-b41b-aebf4b94f787
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: handle_new_connection: 	IPC credentials authenticated (7680-8296-14)
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_ipcs_shm_connect: 	connecting to client [8296]
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:08 [8296] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:08 [8296] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:08 [8296] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:08 [8296] bl460g1n6 crm_resource:    debug: cib_native_signon_raw: 	Connection to CIB successful
Oct 21 11:21:08 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crm_resource/2, version=0.14.68)
Oct 21 11:21:08 [8296] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH timeout: 60000
Oct 21 11:21:08 [8296] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH of failed nodes is enabled
Oct 21 11:21:08 [8296] bl460g1n6 crm_resource:    debug: unpack_config: 	Stop all active resources: false
Oct 21 11:21:08 [8296] bl460g1n6 crm_resource:    debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Oct 21 11:21:08 [8296] bl460g1n6 crm_resource:    debug: unpack_config: 	Default stickiness: 0
Oct 21 11:21:08 [8296] bl460g1n6 crm_resource:    debug: unpack_config: 	On loss of CCM Quorum: Freeze resources
Oct 21 11:21:08 [8296] bl460g1n6 crm_resource:    debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Oct 21 11:21:08 [8296] bl460g1n6 crm_resource:  warning: handle_startup_fencing: 	Blind faith: not fencing unseen nodes
Oct 21 11:21:08 [8296] bl460g1n6 crm_resource:    debug: unpack_domains: 	Unpacking domains
Oct 21 11:21:08 [8296] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n8 is active
Oct 21 11:21:08 [8296] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n8 is online
Oct 21 11:21:08 [8296] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n6 is active
Oct 21 11:21:08 [8296] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n6 is online
Oct 21 11:21:08 [8296] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n7 is active
Oct 21 11:21:08 [8296] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n7 is online
Oct 21 11:21:08 [8296] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n8 to prmPing:0
Oct 21 11:21:08 [8296] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n6 to prmPing:1
Oct 21 11:21:08 [8296] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM1_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:08 [8296] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM1 active on bl460g1n6
Oct 21 11:21:08 [8296] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM2_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:08 [8296] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM2 active on bl460g1n6
Oct 21 11:21:08 [8296] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM3_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:08 [8296] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM3 active on bl460g1n6
Oct 21 11:21:08 [8296] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n7 to prmPing:2
Oct 21 11:21:08 [8296] bl460g1n6 crm_resource:    debug: dump_resource_attr: 	Looking up hv_memory in prmVM2
Oct 21 11:21:08 [8296] bl460g1n6 crm_resource:    debug: cib_native_signoff: 	Signing out of the CIB Service
Oct 21 11:21:08 [8296] bl460g1n6 crm_resource:    debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Oct 21 11:21:08 [8296] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-request-7680-8296-14-header
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_ipcs_dispatch_connection_request: 	HUP conn (7680-8296-14)
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(7680-8296-14) state:2
Oct 21 11:21:08 [8296] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-response-7680-8296-14-header
Oct 21 11:21:08 [8296] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-event-7680-8296-14-header
Oct 21 11:21:08 [8296] bl460g1n6 crm_resource:     info: crm_xml_cleanup: 	Cleaning up memory from libxml2
Oct 21 11:21:08 [7680] bl460g1n6        cib:     info: crm_client_destroy: 	Destroying 0 events
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-7680-8296-14-header
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-7680-8296-14-header
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-7680-8296-14-header
Oct 21 11:21:08 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmVM2_monitor_10000:8185 - exited with rc=0
Oct 21 11:21:08 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmVM2_monitor_10000:8185:stderr [ -- empty -- ]
Oct 21 11:21:08 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmVM2_monitor_10000:8185:stdout [ -- empty -- ]
Oct 21 11:21:08 [7682] bl460g1n6       lrmd:    debug: log_finished: 	finished - rsc:prmVM2 action:monitor call_id:44 pid:8185 exit-code:0 exec-time:0ms queue-time:0ms
Oct 21 11:21:08 [7680] bl460g1n6        cib:     info: crm_client_new: 	Connecting 0xbde1f0 for uid=0 gid=0 pid=8299 id=32efd4cf-f703-42b0-bd66-8bf9a801d4f3
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: handle_new_connection: 	IPC credentials authenticated (7680-8299-14)
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_ipcs_shm_connect: 	connecting to client [8299]
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:08 [8299] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:08 [8299] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:08 [8299] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:08 [8299] bl460g1n6 crm_resource:    debug: cib_native_signon_raw: 	Connection to CIB successful
Oct 21 11:21:08 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crm_resource/2, version=0.14.68)
Oct 21 11:21:08 [8299] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH timeout: 60000
Oct 21 11:21:08 [8299] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH of failed nodes is enabled
Oct 21 11:21:08 [8299] bl460g1n6 crm_resource:    debug: unpack_config: 	Stop all active resources: false
Oct 21 11:21:08 [8299] bl460g1n6 crm_resource:    debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Oct 21 11:21:08 [8299] bl460g1n6 crm_resource:    debug: unpack_config: 	Default stickiness: 0
Oct 21 11:21:08 [8299] bl460g1n6 crm_resource:    debug: unpack_config: 	On loss of CCM Quorum: Freeze resources
Oct 21 11:21:08 [8299] bl460g1n6 crm_resource:    debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Oct 21 11:21:08 [8299] bl460g1n6 crm_resource:  warning: handle_startup_fencing: 	Blind faith: not fencing unseen nodes
Oct 21 11:21:08 [8299] bl460g1n6 crm_resource:    debug: unpack_domains: 	Unpacking domains
Oct 21 11:21:08 [8299] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n8 is active
Oct 21 11:21:08 [8299] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n8 is online
Oct 21 11:21:08 [8299] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n6 is active
Oct 21 11:21:08 [8299] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n6 is online
Oct 21 11:21:08 [8299] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n7 is active
Oct 21 11:21:08 [8299] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n7 is online
Oct 21 11:21:08 [8299] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n8 to prmPing:0
Oct 21 11:21:08 [8299] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n6 to prmPing:1
Oct 21 11:21:08 [8299] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM1_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:08 [8299] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM1 active on bl460g1n6
Oct 21 11:21:08 [8299] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM2_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:08 [8299] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM2 active on bl460g1n6
Oct 21 11:21:08 [7680] bl460g1n6        cib:     info: crm_client_new: 	Connecting 0xc786d0 for uid=0 gid=0 pid=8301 id=76e37dfc-4369-4a25-a322-caf38dbf3c27
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: handle_new_connection: 	IPC credentials authenticated (7680-8301-15)
Oct 21 11:21:08 [8299] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM3_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_ipcs_shm_connect: 	connecting to client [8301]
Oct 21 11:21:08 [8299] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM3 active on bl460g1n6
Oct 21 11:21:08 [8299] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n7 to prmPing:2
Oct 21 11:21:08 [8299] bl460g1n6 crm_resource:    debug: dump_resource_attr: 	Looking up hv_memory in prmVM1
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:08 [8299] bl460g1n6 crm_resource:    debug: cib_native_signoff: 	Signing out of the CIB Service
Oct 21 11:21:08 [8299] bl460g1n6 crm_resource:    debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Oct 21 11:21:08 [8299] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-request-7680-8299-14-header
Oct 21 11:21:08 [8299] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-response-7680-8299-14-header
Oct 21 11:21:08 [8299] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-event-7680-8299-14-header
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:08 [8299] bl460g1n6 crm_resource:     info: crm_xml_cleanup: 	Cleaning up memory from libxml2
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_ipcs_dispatch_connection_request: 	HUP conn (7680-8299-14)
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(7680-8299-14) state:2
Oct 21 11:21:08 [8301] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:08 [7680] bl460g1n6        cib:     info: crm_client_destroy: 	Destroying 0 events
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-7680-8299-14-header
Oct 21 11:21:08 [8301] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:08 [8301] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-7680-8299-14-header
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-7680-8299-14-header
Oct 21 11:21:08 [8301] bl460g1n6 crm_resource:    debug: cib_native_signon_raw: 	Connection to CIB successful
Oct 21 11:21:08 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crm_resource/2, version=0.14.68)
Oct 21 11:21:08 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmVM1_monitor_10000:8192 - exited with rc=0
Oct 21 11:21:08 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmVM1_monitor_10000:8192:stderr [ -- empty -- ]
Oct 21 11:21:08 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmVM1_monitor_10000:8192:stdout [ -- empty -- ]
Oct 21 11:21:08 [7682] bl460g1n6       lrmd:    debug: log_finished: 	finished - rsc:prmVM1 action:monitor call_id:43 pid:8192 exit-code:0 exec-time:0ms queue-time:0ms
Oct 21 11:21:08 [8301] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH timeout: 60000
Oct 21 11:21:08 [8301] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH of failed nodes is enabled
Oct 21 11:21:08 [8301] bl460g1n6 crm_resource:    debug: unpack_config: 	Stop all active resources: false
Oct 21 11:21:08 [8301] bl460g1n6 crm_resource:    debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Oct 21 11:21:08 [8301] bl460g1n6 crm_resource:    debug: unpack_config: 	Default stickiness: 0
Oct 21 11:21:08 [8301] bl460g1n6 crm_resource:    debug: unpack_config: 	On loss of CCM Quorum: Freeze resources
Oct 21 11:21:08 [8301] bl460g1n6 crm_resource:    debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Oct 21 11:21:08 [8301] bl460g1n6 crm_resource:  warning: handle_startup_fencing: 	Blind faith: not fencing unseen nodes
Oct 21 11:21:08 [8301] bl460g1n6 crm_resource:    debug: unpack_domains: 	Unpacking domains
Oct 21 11:21:08 [8301] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n8 is active
Oct 21 11:21:08 [8301] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n8 is online
Oct 21 11:21:08 [8301] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n6 is active
Oct 21 11:21:08 [8301] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n6 is online
Oct 21 11:21:08 [8301] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n7 is active
Oct 21 11:21:08 [8301] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n7 is online
Oct 21 11:21:08 [8301] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n8 to prmPing:0
Oct 21 11:21:08 [8301] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n6 to prmPing:1
Oct 21 11:21:08 [8301] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM1_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:08 [8301] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM1 active on bl460g1n6
Oct 21 11:21:08 [8301] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM2_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:08 [8301] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM2 active on bl460g1n6
Oct 21 11:21:08 [8301] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM3_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:08 [8301] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM3 active on bl460g1n6
Oct 21 11:21:08 [8301] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n7 to prmPing:2
Oct 21 11:21:08 [8301] bl460g1n6 crm_resource:    debug: dump_resource_attr: 	Looking up hv_memory in prmVM3
Oct 21 11:21:08 [8301] bl460g1n6 crm_resource:    debug: cib_native_signoff: 	Signing out of the CIB Service
Oct 21 11:21:08 [8301] bl460g1n6 crm_resource:    debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_ipcs_dispatch_connection_request: 	HUP conn (7680-8301-15)
Oct 21 11:21:08 [8301] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-request-7680-8301-15-header
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(7680-8301-15) state:2
Oct 21 11:21:08 [8301] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-response-7680-8301-15-header
Oct 21 11:21:08 [8301] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-event-7680-8301-15-header
Oct 21 11:21:08 [8301] bl460g1n6 crm_resource:     info: crm_xml_cleanup: 	Cleaning up memory from libxml2
Oct 21 11:21:08 [7680] bl460g1n6        cib:     info: crm_client_destroy: 	Destroying 0 events
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-7680-8301-15-header
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-7680-8301-15-header
Oct 21 11:21:08 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-7680-8301-15-header
Oct 21 11:21:08 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmVM3_monitor_10000:8194 - exited with rc=0
Oct 21 11:21:08 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmVM3_monitor_10000:8194:stderr [ -- empty -- ]
Oct 21 11:21:08 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmVM3_monitor_10000:8194:stdout [ -- empty -- ]
Oct 21 11:21:08 [7682] bl460g1n6       lrmd:    debug: log_finished: 	finished - rsc:prmVM3 action:monitor call_id:45 pid:8194 exit-code:0 exec-time:0ms queue-time:0ms
Oct 21 11:21:14 [7682] bl460g1n6       lrmd:    debug: recurring_action_timer: 	Scheduling another invokation of prmPing_monitor_10000
Oct 21 11:21:15 [8323] bl460g1n6 attrd_updater:     info: attrd_update_delegate: 	Connecting to cluster... 5 retries remaining
Oct 21 11:21:15 [7683] bl460g1n6      attrd:     info: crm_client_new: 	Connecting 0x14f9260 for uid=0 gid=0 pid=8323 id=1ea5402f-cbec-4946-a603-375953cd33d0
Oct 21 11:21:15 [7683] bl460g1n6      attrd:    debug: handle_new_connection: 	IPC credentials authenticated (7683-8323-10)
Oct 21 11:21:15 [7683] bl460g1n6      attrd:    debug: qb_ipcs_shm_connect: 	connecting to client [8323]
Oct 21 11:21:15 [7683] bl460g1n6      attrd:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:15 [7683] bl460g1n6      attrd:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:15 [7683] bl460g1n6      attrd:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:15 [8323] bl460g1n6 attrd_updater:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:15 [8323] bl460g1n6 attrd_updater:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:15 [8323] bl460g1n6 attrd_updater:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:15 [8323] bl460g1n6 attrd_updater:    debug: attrd_update_delegate: 	Sent update: default_ping_set=100 for localhost
Oct 21 11:21:15 [7683] bl460g1n6      attrd:     info: attrd_client_message: 	Broadcasting default_ping_set[bl460g1n6] = 100 (writer)
Oct 21 11:21:15 [8323] bl460g1n6 attrd_updater:     info: crm_xml_cleanup: 	Cleaning up memory from libxml2
Oct 21 11:21:15 [7683] bl460g1n6      attrd:    debug: qb_ipcs_dispatch_connection_request: 	HUP conn (7683-8323-10)
Oct 21 11:21:15 [7683] bl460g1n6      attrd:    debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(7683-8323-10) state:2
Oct 21 11:21:15 [7683] bl460g1n6      attrd:     info: crm_client_destroy: 	Destroying 0 events
Oct 21 11:21:15 [7683] bl460g1n6      attrd:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-attrd-response-7683-8323-10-header
Oct 21 11:21:15 [7683] bl460g1n6      attrd:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-attrd-event-7683-8323-10-header
Oct 21 11:21:15 [7683] bl460g1n6      attrd:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-attrd-request-7683-8323-10-header
Oct 21 11:21:15 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmPing_monitor_10000:8307 - exited with rc=0
Oct 21 11:21:15 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmPing_monitor_10000:8307:stderr [ -- empty -- ]
Oct 21 11:21:15 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmPing_monitor_10000:8307:stdout [ -- empty -- ]
Oct 21 11:21:15 [7682] bl460g1n6       lrmd:    debug: log_finished: 	finished - rsc:prmPing action:monitor call_id:47 pid:8307 exit-code:0 exec-time:0ms queue-time:0ms
Oct 21 11:21:18 [7682] bl460g1n6       lrmd:    debug: recurring_action_timer: 	Scheduling another invokation of prmVM2_monitor_10000
Oct 21 11:21:18 [7682] bl460g1n6       lrmd:    debug: recurring_action_timer: 	Scheduling another invokation of prmVM1_monitor_10000
Oct 21 11:21:18 [7682] bl460g1n6       lrmd:    debug: recurring_action_timer: 	Scheduling another invokation of prmVM3_monitor_10000
VirtualDomain(prmVM2)[8327]:	2013/10/21_11:21:18 DEBUG: Virtual domain vm2 is currently running.
VirtualDomain(prmVM1)[8334]:	2013/10/21_11:21:18 DEBUG: Virtual domain vm1 is currently running.
VirtualDomain(prmVM3)[8337]:	2013/10/21_11:21:18 DEBUG: Virtual domain vm3 is currently running.
Oct 21 11:21:18 [7680] bl460g1n6        cib:     info: crm_client_new: 	Connecting 0xbde1f0 for uid=0 gid=0 pid=8420 id=4ab5e28f-13a0-4ab5-9cdc-011989fa7c2b
Oct 21 11:21:18 [7680] bl460g1n6        cib:    debug: handle_new_connection: 	IPC credentials authenticated (7680-8420-14)
Oct 21 11:21:18 [7680] bl460g1n6        cib:    debug: qb_ipcs_shm_connect: 	connecting to client [8420]
Oct 21 11:21:18 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:18 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:18 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:18 [8420] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:18 [8420] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:18 [8420] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:18 [8420] bl460g1n6 crm_resource:    debug: cib_native_signon_raw: 	Connection to CIB successful
Oct 21 11:21:18 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crm_resource/2, version=0.14.68)
Oct 21 11:21:18 [8420] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH timeout: 60000
Oct 21 11:21:18 [8420] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH of failed nodes is enabled
Oct 21 11:21:18 [8420] bl460g1n6 crm_resource:    debug: unpack_config: 	Stop all active resources: false
Oct 21 11:21:18 [8420] bl460g1n6 crm_resource:    debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Oct 21 11:21:18 [8420] bl460g1n6 crm_resource:    debug: unpack_config: 	Default stickiness: 0
Oct 21 11:21:18 [8420] bl460g1n6 crm_resource:    debug: unpack_config: 	On loss of CCM Quorum: Freeze resources
Oct 21 11:21:18 [8420] bl460g1n6 crm_resource:    debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Oct 21 11:21:18 [8420] bl460g1n6 crm_resource:  warning: handle_startup_fencing: 	Blind faith: not fencing unseen nodes
Oct 21 11:21:18 [8420] bl460g1n6 crm_resource:    debug: unpack_domains: 	Unpacking domains
Oct 21 11:21:18 [8420] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n8 is active
Oct 21 11:21:18 [8420] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n8 is online
Oct 21 11:21:18 [8420] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n6 is active
Oct 21 11:21:18 [8420] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n6 is online
Oct 21 11:21:18 [8420] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n7 is active
Oct 21 11:21:18 [8420] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n7 is online
Oct 21 11:21:18 [8420] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n8 to prmPing:0
Oct 21 11:21:18 [8420] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n6 to prmPing:1
Oct 21 11:21:18 [8420] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM1_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:18 [8420] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM1 active on bl460g1n6
Oct 21 11:21:18 [8420] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM2_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:18 [8420] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM2 active on bl460g1n6
Oct 21 11:21:18 [8420] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM3_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:18 [8420] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM3 active on bl460g1n6
Oct 21 11:21:18 [8420] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n7 to prmPing:2
Oct 21 11:21:18 [8420] bl460g1n6 crm_resource:    debug: dump_resource_attr: 	Looking up cpu in prmVM2
Oct 21 11:21:18 [8420] bl460g1n6 crm_resource:    debug: cib_native_signoff: 	Signing out of the CIB Service
Oct 21 11:21:18 [8420] bl460g1n6 crm_resource:    debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Oct 21 11:21:18 [7680] bl460g1n6        cib:    debug: qb_ipcs_dispatch_connection_request: 	HUP conn (7680-8420-14)
Oct 21 11:21:18 [7680] bl460g1n6        cib:    debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(7680-8420-14) state:2
Oct 21 11:21:18 [7680] bl460g1n6        cib:     info: crm_client_destroy: 	Destroying 0 events
Oct 21 11:21:18 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-7680-8420-14-header
Oct 21 11:21:18 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-7680-8420-14-header
Oct 21 11:21:18 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-7680-8420-14-header
Oct 21 11:21:18 [8420] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-request-7680-8420-14-header
Oct 21 11:21:18 [8420] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-response-7680-8420-14-header
Oct 21 11:21:18 [8420] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-event-7680-8420-14-header
Oct 21 11:21:18 [8420] bl460g1n6 crm_resource:     info: crm_xml_cleanup: 	Cleaning up memory from libxml2
Oct 21 11:21:18 [7680] bl460g1n6        cib:     info: crm_client_new: 	Connecting 0xbde1f0 for uid=0 gid=0 pid=8425 id=f2606a0e-b577-43eb-add3-35ad5cc27014
Oct 21 11:21:18 [7680] bl460g1n6        cib:    debug: handle_new_connection: 	IPC credentials authenticated (7680-8425-14)
Oct 21 11:21:18 [7680] bl460g1n6        cib:    debug: qb_ipcs_shm_connect: 	connecting to client [8425]
Oct 21 11:21:18 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:18 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:18 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:18 [8425] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:18 [8425] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:18 [8425] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:18 [8425] bl460g1n6 crm_resource:    debug: cib_native_signon_raw: 	Connection to CIB successful
Oct 21 11:21:19 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crm_resource/2, version=0.14.68)
Oct 21 11:21:19 [7680] bl460g1n6        cib:     info: crm_client_new: 	Connecting 0xc786d0 for uid=0 gid=0 pid=8428 id=8e000ca2-f0fa-4d81-961a-c35b2a3c92d3
Oct 21 11:21:19 [7680] bl460g1n6        cib:    debug: handle_new_connection: 	IPC credentials authenticated (7680-8428-15)
Oct 21 11:21:19 [7680] bl460g1n6        cib:    debug: qb_ipcs_shm_connect: 	connecting to client [8428]
Oct 21 11:21:19 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:19 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:19 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:19 [8425] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH timeout: 60000
Oct 21 11:21:19 [8425] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH of failed nodes is enabled
Oct 21 11:21:19 [8428] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:19 [8425] bl460g1n6 crm_resource:    debug: unpack_config: 	Stop all active resources: false
Oct 21 11:21:19 [8425] bl460g1n6 crm_resource:    debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Oct 21 11:21:19 [8425] bl460g1n6 crm_resource:    debug: unpack_config: 	Default stickiness: 0
Oct 21 11:21:19 [8425] bl460g1n6 crm_resource:    debug: unpack_config: 	On loss of CCM Quorum: Freeze resources
Oct 21 11:21:19 [8425] bl460g1n6 crm_resource:    debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Oct 21 11:21:19 [8428] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:19 [8425] bl460g1n6 crm_resource:  warning: handle_startup_fencing: 	Blind faith: not fencing unseen nodes
Oct 21 11:21:19 [8428] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:19 [8425] bl460g1n6 crm_resource:    debug: unpack_domains: 	Unpacking domains
Oct 21 11:21:19 [8425] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n8 is active
Oct 21 11:21:19 [8425] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n8 is online
Oct 21 11:21:19 [8425] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n6 is active
Oct 21 11:21:19 [8425] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n6 is online
Oct 21 11:21:19 [8425] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n7 is active
Oct 21 11:21:19 [8425] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n7 is online
Oct 21 11:21:19 [8428] bl460g1n6 crm_resource:    debug: cib_native_signon_raw: 	Connection to CIB successful
Oct 21 11:21:19 [8425] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n8 to prmPing:0
Oct 21 11:21:19 [8425] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n6 to prmPing:1
Oct 21 11:21:19 [8425] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM1_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:19 [8425] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM1 active on bl460g1n6
Oct 21 11:21:19 [8425] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM2_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:19 [8425] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM2 active on bl460g1n6
Oct 21 11:21:19 [8425] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM3_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:19 [8425] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM3 active on bl460g1n6
Oct 21 11:21:19 [8425] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n7 to prmPing:2
Oct 21 11:21:19 [8425] bl460g1n6 crm_resource:    debug: dump_resource_attr: 	Looking up cpu in prmVM1
Oct 21 11:21:19 [8425] bl460g1n6 crm_resource:    debug: cib_native_signoff: 	Signing out of the CIB Service
Oct 21 11:21:19 [8425] bl460g1n6 crm_resource:    debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Oct 21 11:21:19 [8425] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-request-7680-8425-14-header
Oct 21 11:21:19 [8425] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-response-7680-8425-14-header
Oct 21 11:21:19 [8425] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-event-7680-8425-14-header
Oct 21 11:21:19 [8425] bl460g1n6 crm_resource:     info: crm_xml_cleanup: 	Cleaning up memory from libxml2
Oct 21 11:21:19 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crm_resource/2, version=0.14.68)
Oct 21 11:21:19 [7680] bl460g1n6        cib:    debug: qb_ipcs_dispatch_connection_request: 	HUP conn (7680-8425-14)
Oct 21 11:21:19 [7680] bl460g1n6        cib:    debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(7680-8425-14) state:2
Oct 21 11:21:19 [7680] bl460g1n6        cib:     info: crm_client_destroy: 	Destroying 0 events
Oct 21 11:21:19 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-7680-8425-14-header
Oct 21 11:21:19 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-7680-8425-14-header
Oct 21 11:21:19 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-7680-8425-14-header
Oct 21 11:21:19 [8428] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH timeout: 60000
Oct 21 11:21:19 [8428] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH of failed nodes is enabled
Oct 21 11:21:19 [8428] bl460g1n6 crm_resource:    debug: unpack_config: 	Stop all active resources: false
Oct 21 11:21:19 [8428] bl460g1n6 crm_resource:    debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Oct 21 11:21:19 [8428] bl460g1n6 crm_resource:    debug: unpack_config: 	Default stickiness: 0
Oct 21 11:21:19 [8428] bl460g1n6 crm_resource:    debug: unpack_config: 	On loss of CCM Quorum: Freeze resources
Oct 21 11:21:19 [8428] bl460g1n6 crm_resource:    debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Oct 21 11:21:19 [8428] bl460g1n6 crm_resource:  warning: handle_startup_fencing: 	Blind faith: not fencing unseen nodes
Oct 21 11:21:19 [8428] bl460g1n6 crm_resource:    debug: unpack_domains: 	Unpacking domains
Oct 21 11:21:19 [8428] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n8 is active
Oct 21 11:21:19 [8428] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n8 is online
Oct 21 11:21:19 [8428] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n6 is active
Oct 21 11:21:19 [8428] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n6 is online
Oct 21 11:21:19 [8428] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n7 is active
Oct 21 11:21:19 [8428] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n7 is online
Oct 21 11:21:19 [8428] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n8 to prmPing:0
Oct 21 11:21:19 [8428] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n6 to prmPing:1
Oct 21 11:21:19 [8428] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM1_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:19 [8428] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM1 active on bl460g1n6
Oct 21 11:21:19 [8428] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM2_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:19 [8428] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM2 active on bl460g1n6
Oct 21 11:21:19 [8428] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM3_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:19 [8428] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM3 active on bl460g1n6
Oct 21 11:21:19 [8428] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n7 to prmPing:2
Oct 21 11:21:19 [8428] bl460g1n6 crm_resource:    debug: dump_resource_attr: 	Looking up cpu in prmVM3
Oct 21 11:21:19 [8428] bl460g1n6 crm_resource:    debug: cib_native_signoff: 	Signing out of the CIB Service
Oct 21 11:21:19 [8428] bl460g1n6 crm_resource:    debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Oct 21 11:21:19 [8428] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-request-7680-8428-15-header
Oct 21 11:21:19 [7680] bl460g1n6        cib:    debug: qb_ipcs_dispatch_connection_request: 	HUP conn (7680-8428-15)
Oct 21 11:21:19 [7680] bl460g1n6        cib:    debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(7680-8428-15) state:2
Oct 21 11:21:19 [8428] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-response-7680-8428-15-header
Oct 21 11:21:19 [8428] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-event-7680-8428-15-header
Oct 21 11:21:19 [7680] bl460g1n6        cib:     info: crm_client_destroy: 	Destroying 0 events
Oct 21 11:21:19 [8428] bl460g1n6 crm_resource:     info: crm_xml_cleanup: 	Cleaning up memory from libxml2
Oct 21 11:21:19 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-7680-8428-15-header
Oct 21 11:21:19 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-7680-8428-15-header
Oct 21 11:21:19 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-7680-8428-15-header
Oct 21 11:21:19 [7680] bl460g1n6        cib:     info: crm_client_new: 	Connecting 0xbde1f0 for uid=0 gid=0 pid=8438 id=55650f56-bcac-4c55-bb62-14ab8cfa59df
Oct 21 11:21:19 [7680] bl460g1n6        cib:    debug: handle_new_connection: 	IPC credentials authenticated (7680-8438-14)
Oct 21 11:21:19 [7680] bl460g1n6        cib:    debug: qb_ipcs_shm_connect: 	connecting to client [8438]
Oct 21 11:21:19 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:19 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:19 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:19 [8438] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:19 [8438] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:19 [8438] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:19 [8438] bl460g1n6 crm_resource:    debug: cib_native_signon_raw: 	Connection to CIB successful
Oct 21 11:21:19 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crm_resource/2, version=0.14.68)
Oct 21 11:21:19 [8438] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH timeout: 60000
Oct 21 11:21:19 [8438] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH of failed nodes is enabled
Oct 21 11:21:19 [8438] bl460g1n6 crm_resource:    debug: unpack_config: 	Stop all active resources: false
Oct 21 11:21:19 [8438] bl460g1n6 crm_resource:    debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Oct 21 11:21:19 [8438] bl460g1n6 crm_resource:    debug: unpack_config: 	Default stickiness: 0
Oct 21 11:21:19 [8438] bl460g1n6 crm_resource:    debug: unpack_config: 	On loss of CCM Quorum: Freeze resources
Oct 21 11:21:19 [8438] bl460g1n6 crm_resource:    debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Oct 21 11:21:19 [8438] bl460g1n6 crm_resource:  warning: handle_startup_fencing: 	Blind faith: not fencing unseen nodes
Oct 21 11:21:19 [8438] bl460g1n6 crm_resource:    debug: unpack_domains: 	Unpacking domains
Oct 21 11:21:19 [8438] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n8 is active
Oct 21 11:21:19 [8438] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n8 is online
Oct 21 11:21:19 [8438] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n6 is active
Oct 21 11:21:19 [8438] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n6 is online
Oct 21 11:21:19 [8438] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n7 is active
Oct 21 11:21:19 [8438] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n7 is online
Oct 21 11:21:19 [8438] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n8 to prmPing:0
Oct 21 11:21:19 [8438] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n6 to prmPing:1
Oct 21 11:21:19 [8438] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM1_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:19 [8438] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM1 active on bl460g1n6
Oct 21 11:21:19 [8438] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM2_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:19 [8438] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM2 active on bl460g1n6
Oct 21 11:21:19 [8438] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM3_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:19 [8438] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM3 active on bl460g1n6
Oct 21 11:21:19 [8438] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n7 to prmPing:2
Oct 21 11:21:19 [8438] bl460g1n6 crm_resource:    debug: dump_resource_attr: 	Looking up hv_memory in prmVM2
Oct 21 11:21:19 [8438] bl460g1n6 crm_resource:    debug: cib_native_signoff: 	Signing out of the CIB Service
Oct 21 11:21:19 [8438] bl460g1n6 crm_resource:    debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Oct 21 11:21:19 [7680] bl460g1n6        cib:    debug: qb_ipcs_dispatch_connection_request: 	HUP conn (7680-8438-14)
Oct 21 11:21:19 [8438] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-request-7680-8438-14-header
Oct 21 11:21:19 [7680] bl460g1n6        cib:    debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(7680-8438-14) state:2
Oct 21 11:21:19 [8438] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-response-7680-8438-14-header
Oct 21 11:21:19 [8438] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-event-7680-8438-14-header
Oct 21 11:21:19 [7680] bl460g1n6        cib:     info: crm_client_destroy: 	Destroying 0 events
Oct 21 11:21:19 [8438] bl460g1n6 crm_resource:     info: crm_xml_cleanup: 	Cleaning up memory from libxml2
Oct 21 11:21:19 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-7680-8438-14-header
Oct 21 11:21:19 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-7680-8438-14-header
Oct 21 11:21:19 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-7680-8438-14-header
Oct 21 11:21:19 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmVM2_monitor_10000:8327 - exited with rc=0
Oct 21 11:21:19 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmVM2_monitor_10000:8327:stderr [ -- empty -- ]
Oct 21 11:21:19 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmVM2_monitor_10000:8327:stdout [ -- empty -- ]
Oct 21 11:21:19 [7682] bl460g1n6       lrmd:    debug: log_finished: 	finished - rsc:prmVM2 action:monitor call_id:44 pid:8327 exit-code:0 exec-time:0ms queue-time:0ms
Oct 21 11:21:19 [7680] bl460g1n6        cib:     info: crm_client_new: 	Connecting 0xbde1f0 for uid=0 gid=0 pid=8441 id=5d443c75-1372-4bd8-a734-78cb08343a15
Oct 21 11:21:19 [7680] bl460g1n6        cib:    debug: handle_new_connection: 	IPC credentials authenticated (7680-8441-14)
Oct 21 11:21:19 [7680] bl460g1n6        cib:    debug: qb_ipcs_shm_connect: 	connecting to client [8441]
Oct 21 11:21:19 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:19 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:19 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:19 [8441] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:19 [8441] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:19 [8441] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:19 [8441] bl460g1n6 crm_resource:    debug: cib_native_signon_raw: 	Connection to CIB successful
Oct 21 11:21:19 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crm_resource/2, version=0.14.68)
Oct 21 11:21:19 [8441] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH timeout: 60000
Oct 21 11:21:19 [8441] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH of failed nodes is enabled
Oct 21 11:21:19 [8441] bl460g1n6 crm_resource:    debug: unpack_config: 	Stop all active resources: false
Oct 21 11:21:19 [8441] bl460g1n6 crm_resource:    debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Oct 21 11:21:19 [8441] bl460g1n6 crm_resource:    debug: unpack_config: 	Default stickiness: 0
Oct 21 11:21:19 [8441] bl460g1n6 crm_resource:    debug: unpack_config: 	On loss of CCM Quorum: Freeze resources
Oct 21 11:21:19 [8441] bl460g1n6 crm_resource:    debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Oct 21 11:21:19 [8441] bl460g1n6 crm_resource:  warning: handle_startup_fencing: 	Blind faith: not fencing unseen nodes
Oct 21 11:21:19 [8441] bl460g1n6 crm_resource:    debug: unpack_domains: 	Unpacking domains
Oct 21 11:21:19 [8441] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n8 is active
Oct 21 11:21:19 [8441] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n8 is online
Oct 21 11:21:19 [8441] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n6 is active
Oct 21 11:21:19 [8441] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n6 is online
Oct 21 11:21:19 [8441] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n7 is active
Oct 21 11:21:19 [8441] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n7 is online
Oct 21 11:21:19 [8441] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n8 to prmPing:0
Oct 21 11:21:19 [8441] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n6 to prmPing:1
Oct 21 11:21:19 [8441] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM1_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:19 [8441] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM1 active on bl460g1n6
Oct 21 11:21:19 [7680] bl460g1n6        cib:     info: crm_client_new: 	Connecting 0xc786d0 for uid=0 gid=0 pid=8443 id=e1ef0a2d-7e36-4ae4-8139-d238e0f63529
Oct 21 11:21:19 [8441] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM2_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:19 [7680] bl460g1n6        cib:    debug: handle_new_connection: 	IPC credentials authenticated (7680-8443-15)
Oct 21 11:21:19 [8441] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM2 active on bl460g1n6
Oct 21 11:21:19 [7680] bl460g1n6        cib:    debug: qb_ipcs_shm_connect: 	connecting to client [8443]
Oct 21 11:21:19 [8441] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM3_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:19 [8441] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM3 active on bl460g1n6
Oct 21 11:21:19 [8441] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n7 to prmPing:2
Oct 21 11:21:19 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:19 [8441] bl460g1n6 crm_resource:    debug: dump_resource_attr: 	Looking up hv_memory in prmVM1
Oct 21 11:21:19 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:19 [8441] bl460g1n6 crm_resource:    debug: cib_native_signoff: 	Signing out of the CIB Service
Oct 21 11:21:19 [8441] bl460g1n6 crm_resource:    debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Oct 21 11:21:19 [8441] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-request-7680-8441-14-header
Oct 21 11:21:19 [8441] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-response-7680-8441-14-header
Oct 21 11:21:19 [8441] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-event-7680-8441-14-header
Oct 21 11:21:19 [8441] bl460g1n6 crm_resource:     info: crm_xml_cleanup: 	Cleaning up memory from libxml2
Oct 21 11:21:19 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:19 [7680] bl460g1n6        cib:    debug: qb_ipcs_dispatch_connection_request: 	HUP conn (7680-8441-14)
Oct 21 11:21:19 [7680] bl460g1n6        cib:    debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(7680-8441-14) state:2
Oct 21 11:21:19 [8443] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:19 [7680] bl460g1n6        cib:     info: crm_client_destroy: 	Destroying 0 events
Oct 21 11:21:19 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-7680-8441-14-header
Oct 21 11:21:19 [8443] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:19 [8443] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:19 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-7680-8441-14-header
Oct 21 11:21:19 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-7680-8441-14-header
Oct 21 11:21:19 [8443] bl460g1n6 crm_resource:    debug: cib_native_signon_raw: 	Connection to CIB successful
Oct 21 11:21:19 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crm_resource/2, version=0.14.68)
Oct 21 11:21:19 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmVM1_monitor_10000:8334 - exited with rc=0
Oct 21 11:21:19 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmVM1_monitor_10000:8334:stderr [ -- empty -- ]
Oct 21 11:21:19 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmVM1_monitor_10000:8334:stdout [ -- empty -- ]
Oct 21 11:21:19 [7682] bl460g1n6       lrmd:    debug: log_finished: 	finished - rsc:prmVM1 action:monitor call_id:43 pid:8334 exit-code:0 exec-time:0ms queue-time:0ms
Oct 21 11:21:19 [8443] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH timeout: 60000
Oct 21 11:21:19 [8443] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH of failed nodes is enabled
Oct 21 11:21:19 [8443] bl460g1n6 crm_resource:    debug: unpack_config: 	Stop all active resources: false
Oct 21 11:21:19 [8443] bl460g1n6 crm_resource:    debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Oct 21 11:21:19 [8443] bl460g1n6 crm_resource:    debug: unpack_config: 	Default stickiness: 0
Oct 21 11:21:19 [8443] bl460g1n6 crm_resource:    debug: unpack_config: 	On loss of CCM Quorum: Freeze resources
Oct 21 11:21:19 [8443] bl460g1n6 crm_resource:    debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Oct 21 11:21:19 [8443] bl460g1n6 crm_resource:  warning: handle_startup_fencing: 	Blind faith: not fencing unseen nodes
Oct 21 11:21:19 [8443] bl460g1n6 crm_resource:    debug: unpack_domains: 	Unpacking domains
Oct 21 11:21:19 [8443] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n8 is active
Oct 21 11:21:19 [8443] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n8 is online
Oct 21 11:21:19 [8443] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n6 is active
Oct 21 11:21:19 [8443] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n6 is online
Oct 21 11:21:19 [8443] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n7 is active
Oct 21 11:21:19 [8443] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n7 is online
Oct 21 11:21:19 [8443] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n8 to prmPing:0
Oct 21 11:21:19 [8443] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n6 to prmPing:1
Oct 21 11:21:19 [8443] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM1_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:19 [8443] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM1 active on bl460g1n6
Oct 21 11:21:19 [8443] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM2_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:19 [8443] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM2 active on bl460g1n6
Oct 21 11:21:19 [8443] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM3_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:19 [8443] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM3 active on bl460g1n6
Oct 21 11:21:19 [8443] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n7 to prmPing:2
Oct 21 11:21:19 [8443] bl460g1n6 crm_resource:    debug: dump_resource_attr: 	Looking up hv_memory in prmVM3
Oct 21 11:21:19 [8443] bl460g1n6 crm_resource:    debug: cib_native_signoff: 	Signing out of the CIB Service
Oct 21 11:21:19 [8443] bl460g1n6 crm_resource:    debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Oct 21 11:21:19 [8443] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-request-7680-8443-15-header
Oct 21 11:21:19 [7680] bl460g1n6        cib:    debug: qb_ipcs_dispatch_connection_request: 	HUP conn (7680-8443-15)
Oct 21 11:21:19 [7680] bl460g1n6        cib:    debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(7680-8443-15) state:2
Oct 21 11:21:19 [8443] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-response-7680-8443-15-header
Oct 21 11:21:19 [8443] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-event-7680-8443-15-header
Oct 21 11:21:19 [8443] bl460g1n6 crm_resource:     info: crm_xml_cleanup: 	Cleaning up memory from libxml2
Oct 21 11:21:19 [7680] bl460g1n6        cib:     info: crm_client_destroy: 	Destroying 0 events
Oct 21 11:21:19 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-7680-8443-15-header
Oct 21 11:21:19 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-7680-8443-15-header
Oct 21 11:21:19 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-7680-8443-15-header
Oct 21 11:21:19 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmVM3_monitor_10000:8337 - exited with rc=0
Oct 21 11:21:19 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmVM3_monitor_10000:8337:stderr [ -- empty -- ]
Oct 21 11:21:19 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmVM3_monitor_10000:8337:stdout [ -- empty -- ]
Oct 21 11:21:19 [7682] bl460g1n6       lrmd:    debug: log_finished: 	finished - rsc:prmVM3 action:monitor call_id:45 pid:8337 exit-code:0 exec-time:0ms queue-time:0ms
Oct 21 11:21:24 [7663] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [8498]
Oct 21 11:21:24 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:21:24 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:21:24 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:21:24 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Oct 21 11:21:24 [7663] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7fb1490da8f0
Oct 21 11:21:24 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (7666-8498-34)
Oct 21 11:21:24 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(7666-8498-34) state:2
Oct 21 11:21:24 [7663] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Oct 21 11:21:24 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Oct 21 11:21:24 [7663] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7fb1490da8f0
Oct 21 11:21:24 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Oct 21 11:21:24 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-7666-8498-34-header
Oct 21 11:21:24 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-7666-8498-34-header
Oct 21 11:21:24 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-7666-8498-34-header
Oct 21 11:21:25 [7682] bl460g1n6       lrmd:    debug: recurring_action_timer: 	Scheduling another invokation of prmPing_monitor_10000
Oct 21 11:21:26 [8554] bl460g1n6 attrd_updater:     info: attrd_update_delegate: 	Connecting to cluster... 5 retries remaining
Oct 21 11:21:26 [7683] bl460g1n6      attrd:     info: crm_client_new: 	Connecting 0x14f9260 for uid=0 gid=0 pid=8554 id=36f9e788-857b-4183-90f2-285b5c5de022
Oct 21 11:21:26 [7683] bl460g1n6      attrd:    debug: handle_new_connection: 	IPC credentials authenticated (7683-8554-10)
Oct 21 11:21:26 [7683] bl460g1n6      attrd:    debug: qb_ipcs_shm_connect: 	connecting to client [8554]
Oct 21 11:21:26 [7683] bl460g1n6      attrd:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:26 [7683] bl460g1n6      attrd:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:26 [7683] bl460g1n6      attrd:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:26 [8554] bl460g1n6 attrd_updater:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:26 [8554] bl460g1n6 attrd_updater:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:26 [8554] bl460g1n6 attrd_updater:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:26 [7683] bl460g1n6      attrd:     info: attrd_client_message: 	Broadcasting default_ping_set[bl460g1n6] = 100 (writer)
Oct 21 11:21:26 [8554] bl460g1n6 attrd_updater:    debug: attrd_update_delegate: 	Sent update: default_ping_set=100 for localhost
Oct 21 11:21:26 [8554] bl460g1n6 attrd_updater:     info: crm_xml_cleanup: 	Cleaning up memory from libxml2
Oct 21 11:21:26 [7683] bl460g1n6      attrd:    debug: qb_ipcs_dispatch_connection_request: 	HUP conn (7683-8554-10)
Oct 21 11:21:26 [7683] bl460g1n6      attrd:    debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(7683-8554-10) state:2
Oct 21 11:21:26 [7683] bl460g1n6      attrd:     info: crm_client_destroy: 	Destroying 0 events
Oct 21 11:21:26 [7683] bl460g1n6      attrd:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-attrd-response-7683-8554-10-header
Oct 21 11:21:26 [7683] bl460g1n6      attrd:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-attrd-event-7683-8554-10-header
Oct 21 11:21:26 [7683] bl460g1n6      attrd:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-attrd-request-7683-8554-10-header
Oct 21 11:21:26 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmPing_monitor_10000:8537 - exited with rc=0
Oct 21 11:21:26 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmPing_monitor_10000:8537:stderr [ -- empty -- ]
Oct 21 11:21:26 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmPing_monitor_10000:8537:stdout [ -- empty -- ]
Oct 21 11:21:26 [7682] bl460g1n6       lrmd:    debug: log_finished: 	finished - rsc:prmPing action:monitor call_id:47 pid:8537 exit-code:0 exec-time:0ms queue-time:0ms
Oct 21 11:21:29 [7682] bl460g1n6       lrmd:    debug: recurring_action_timer: 	Scheduling another invokation of prmVM2_monitor_10000
Oct 21 11:21:29 [7682] bl460g1n6       lrmd:    debug: recurring_action_timer: 	Scheduling another invokation of prmVM1_monitor_10000
Oct 21 11:21:29 [7682] bl460g1n6       lrmd:    debug: recurring_action_timer: 	Scheduling another invokation of prmVM3_monitor_10000
VirtualDomain(prmVM2)[9252]:	2013/10/21_11:21:29 DEBUG: Virtual domain vm2 is currently running.
VirtualDomain(prmVM1)[9259]:	2013/10/21_11:21:29 DEBUG: Virtual domain vm1 is currently running.
VirtualDomain(prmVM3)[9262]:	2013/10/21_11:21:29 DEBUG: Virtual domain vm3 is currently running.
Oct 21 11:21:29 [7680] bl460g1n6        cib:     info: crm_client_new: 	Connecting 0xbde1f0 for uid=0 gid=0 pid=9345 id=a7a789aa-6ecd-46a3-9e2f-58eb66c80b89
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: handle_new_connection: 	IPC credentials authenticated (7680-9345-14)
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_ipcs_shm_connect: 	connecting to client [9345]
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:29 [9345] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:29 [9345] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:29 [9345] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:29 [9345] bl460g1n6 crm_resource:    debug: cib_native_signon_raw: 	Connection to CIB successful
Oct 21 11:21:29 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crm_resource/2, version=0.14.68)
Oct 21 11:21:29 [9345] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH timeout: 60000
Oct 21 11:21:29 [9345] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH of failed nodes is enabled
Oct 21 11:21:29 [9345] bl460g1n6 crm_resource:    debug: unpack_config: 	Stop all active resources: false
Oct 21 11:21:29 [9345] bl460g1n6 crm_resource:    debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Oct 21 11:21:29 [9345] bl460g1n6 crm_resource:    debug: unpack_config: 	Default stickiness: 0
Oct 21 11:21:29 [9345] bl460g1n6 crm_resource:    debug: unpack_config: 	On loss of CCM Quorum: Freeze resources
Oct 21 11:21:29 [9345] bl460g1n6 crm_resource:    debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Oct 21 11:21:29 [9345] bl460g1n6 crm_resource:  warning: handle_startup_fencing: 	Blind faith: not fencing unseen nodes
Oct 21 11:21:29 [9345] bl460g1n6 crm_resource:    debug: unpack_domains: 	Unpacking domains
Oct 21 11:21:29 [9345] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n8 is active
Oct 21 11:21:29 [9345] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n8 is online
Oct 21 11:21:29 [9345] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n6 is active
Oct 21 11:21:29 [9345] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n6 is online
Oct 21 11:21:29 [9345] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n7 is active
Oct 21 11:21:29 [9345] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n7 is online
Oct 21 11:21:29 [9345] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n8 to prmPing:0
Oct 21 11:21:29 [9345] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n6 to prmPing:1
Oct 21 11:21:29 [9345] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM1_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:29 [9345] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM1 active on bl460g1n6
Oct 21 11:21:29 [9345] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM2_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:29 [9345] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM2 active on bl460g1n6
Oct 21 11:21:29 [9345] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM3_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:29 [9345] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM3 active on bl460g1n6
Oct 21 11:21:29 [9345] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n7 to prmPing:2
Oct 21 11:21:29 [9345] bl460g1n6 crm_resource:    debug: dump_resource_attr: 	Looking up cpu in prmVM2
Oct 21 11:21:29 [9345] bl460g1n6 crm_resource:    debug: cib_native_signoff: 	Signing out of the CIB Service
Oct 21 11:21:29 [9345] bl460g1n6 crm_resource:    debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Oct 21 11:21:29 [9345] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-request-7680-9345-14-header
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_ipcs_dispatch_connection_request: 	HUP conn (7680-9345-14)
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(7680-9345-14) state:2
Oct 21 11:21:29 [9345] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-response-7680-9345-14-header
Oct 21 11:21:29 [9345] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-event-7680-9345-14-header
Oct 21 11:21:29 [9345] bl460g1n6 crm_resource:     info: crm_xml_cleanup: 	Cleaning up memory from libxml2
Oct 21 11:21:29 [7680] bl460g1n6        cib:     info: crm_client_destroy: 	Destroying 0 events
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-7680-9345-14-header
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-7680-9345-14-header
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-7680-9345-14-header
Oct 21 11:21:29 [7680] bl460g1n6        cib:     info: crm_client_new: 	Connecting 0xbde1f0 for uid=0 gid=0 pid=9348 id=d853ddf8-ae86-4c7f-8076-70687d699fce
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: handle_new_connection: 	IPC credentials authenticated (7680-9348-14)
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_ipcs_shm_connect: 	connecting to client [9348]
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:29 [9348] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:29 [9348] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:29 [9348] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:29 [9348] bl460g1n6 crm_resource:    debug: cib_native_signon_raw: 	Connection to CIB successful
Oct 21 11:21:29 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crm_resource/2, version=0.14.68)
Oct 21 11:21:29 [9348] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH timeout: 60000
Oct 21 11:21:29 [9348] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH of failed nodes is enabled
Oct 21 11:21:29 [9348] bl460g1n6 crm_resource:    debug: unpack_config: 	Stop all active resources: false
Oct 21 11:21:29 [9348] bl460g1n6 crm_resource:    debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Oct 21 11:21:29 [9348] bl460g1n6 crm_resource:    debug: unpack_config: 	Default stickiness: 0
Oct 21 11:21:29 [9348] bl460g1n6 crm_resource:    debug: unpack_config: 	On loss of CCM Quorum: Freeze resources
Oct 21 11:21:29 [9348] bl460g1n6 crm_resource:    debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Oct 21 11:21:29 [9348] bl460g1n6 crm_resource:  warning: handle_startup_fencing: 	Blind faith: not fencing unseen nodes
Oct 21 11:21:29 [9348] bl460g1n6 crm_resource:    debug: unpack_domains: 	Unpacking domains
Oct 21 11:21:29 [9348] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n8 is active
Oct 21 11:21:29 [9348] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n8 is online
Oct 21 11:21:29 [9348] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n6 is active
Oct 21 11:21:29 [9348] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n6 is online
Oct 21 11:21:29 [9348] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n7 is active
Oct 21 11:21:29 [9348] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n7 is online
Oct 21 11:21:29 [9348] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n8 to prmPing:0
Oct 21 11:21:29 [9348] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n6 to prmPing:1
Oct 21 11:21:29 [9348] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM1_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:29 [9348] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM1 active on bl460g1n6
Oct 21 11:21:29 [9348] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM2_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:29 [9348] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM2 active on bl460g1n6
Oct 21 11:21:29 [9348] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM3_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:29 [9348] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM3 active on bl460g1n6
Oct 21 11:21:29 [9348] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n7 to prmPing:2
Oct 21 11:21:29 [9348] bl460g1n6 crm_resource:    debug: dump_resource_attr: 	Looking up cpu in prmVM1
Oct 21 11:21:29 [9348] bl460g1n6 crm_resource:    debug: cib_native_signoff: 	Signing out of the CIB Service
Oct 21 11:21:29 [9348] bl460g1n6 crm_resource:    debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_ipcs_dispatch_connection_request: 	HUP conn (7680-9348-14)
Oct 21 11:21:29 [9348] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-request-7680-9348-14-header
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(7680-9348-14) state:2
Oct 21 11:21:29 [9348] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-response-7680-9348-14-header
Oct 21 11:21:29 [9348] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-event-7680-9348-14-header
Oct 21 11:21:29 [9348] bl460g1n6 crm_resource:     info: crm_xml_cleanup: 	Cleaning up memory from libxml2
Oct 21 11:21:29 [7680] bl460g1n6        cib:     info: crm_client_destroy: 	Destroying 0 events
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-7680-9348-14-header
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-7680-9348-14-header
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-7680-9348-14-header
Oct 21 11:21:29 [7680] bl460g1n6        cib:     info: crm_client_new: 	Connecting 0xbde1f0 for uid=0 gid=0 pid=9353 id=81a84fe0-ece1-4850-a3c0-11440bdc6206
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: handle_new_connection: 	IPC credentials authenticated (7680-9353-14)
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_ipcs_shm_connect: 	connecting to client [9353]
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:29 [9353] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:29 [9353] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:29 [9353] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:29 [9353] bl460g1n6 crm_resource:    debug: cib_native_signon_raw: 	Connection to CIB successful
Oct 21 11:21:29 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crm_resource/2, version=0.14.68)
Oct 21 11:21:29 [9353] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH timeout: 60000
Oct 21 11:21:29 [9353] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH of failed nodes is enabled
Oct 21 11:21:29 [9353] bl460g1n6 crm_resource:    debug: unpack_config: 	Stop all active resources: false
Oct 21 11:21:29 [9353] bl460g1n6 crm_resource:    debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Oct 21 11:21:29 [9353] bl460g1n6 crm_resource:    debug: unpack_config: 	Default stickiness: 0
Oct 21 11:21:29 [9353] bl460g1n6 crm_resource:    debug: unpack_config: 	On loss of CCM Quorum: Freeze resources
Oct 21 11:21:29 [9353] bl460g1n6 crm_resource:    debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Oct 21 11:21:29 [9353] bl460g1n6 crm_resource:  warning: handle_startup_fencing: 	Blind faith: not fencing unseen nodes
Oct 21 11:21:29 [9353] bl460g1n6 crm_resource:    debug: unpack_domains: 	Unpacking domains
Oct 21 11:21:29 [9353] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n8 is active
Oct 21 11:21:29 [9353] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n8 is online
Oct 21 11:21:29 [9353] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n6 is active
Oct 21 11:21:29 [9353] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n6 is online
Oct 21 11:21:29 [9353] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n7 is active
Oct 21 11:21:29 [9353] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n7 is online
Oct 21 11:21:29 [9353] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n8 to prmPing:0
Oct 21 11:21:29 [9353] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n6 to prmPing:1
Oct 21 11:21:29 [9353] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM1_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:29 [9353] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM1 active on bl460g1n6
Oct 21 11:21:29 [9353] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM2_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:29 [9353] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM2 active on bl460g1n6
Oct 21 11:21:29 [9353] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM3_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:29 [9353] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM3 active on bl460g1n6
Oct 21 11:21:29 [9353] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n7 to prmPing:2
Oct 21 11:21:29 [9353] bl460g1n6 crm_resource:    debug: dump_resource_attr: 	Looking up cpu in prmVM3
Oct 21 11:21:29 [9353] bl460g1n6 crm_resource:    debug: cib_native_signoff: 	Signing out of the CIB Service
Oct 21 11:21:29 [9353] bl460g1n6 crm_resource:    debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Oct 21 11:21:29 [9353] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-request-7680-9353-14-header
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_ipcs_dispatch_connection_request: 	HUP conn (7680-9353-14)
Oct 21 11:21:29 [9353] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-response-7680-9353-14-header
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(7680-9353-14) state:2
Oct 21 11:21:29 [9353] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-event-7680-9353-14-header
Oct 21 11:21:29 [9353] bl460g1n6 crm_resource:     info: crm_xml_cleanup: 	Cleaning up memory from libxml2
Oct 21 11:21:29 [7680] bl460g1n6        cib:     info: crm_client_destroy: 	Destroying 0 events
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-7680-9353-14-header
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-7680-9353-14-header
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-7680-9353-14-header
Oct 21 11:21:29 [7680] bl460g1n6        cib:     info: crm_client_new: 	Connecting 0xbde1f0 for uid=0 gid=0 pid=9379 id=310c2e90-de5c-42d1-ac36-c0ac9747e5cd
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: handle_new_connection: 	IPC credentials authenticated (7680-9379-14)
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_ipcs_shm_connect: 	connecting to client [9379]
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:29 [9379] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:29 [9379] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:29 [9379] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:29 [9379] bl460g1n6 crm_resource:    debug: cib_native_signon_raw: 	Connection to CIB successful
Oct 21 11:21:29 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crm_resource/2, version=0.14.68)
Oct 21 11:21:29 [9379] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH timeout: 60000
Oct 21 11:21:29 [9379] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH of failed nodes is enabled
Oct 21 11:21:29 [9379] bl460g1n6 crm_resource:    debug: unpack_config: 	Stop all active resources: false
Oct 21 11:21:29 [9379] bl460g1n6 crm_resource:    debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Oct 21 11:21:29 [9379] bl460g1n6 crm_resource:    debug: unpack_config: 	Default stickiness: 0
Oct 21 11:21:29 [9379] bl460g1n6 crm_resource:    debug: unpack_config: 	On loss of CCM Quorum: Freeze resources
Oct 21 11:21:29 [9379] bl460g1n6 crm_resource:    debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Oct 21 11:21:29 [9379] bl460g1n6 crm_resource:  warning: handle_startup_fencing: 	Blind faith: not fencing unseen nodes
Oct 21 11:21:29 [9379] bl460g1n6 crm_resource:    debug: unpack_domains: 	Unpacking domains
Oct 21 11:21:29 [9379] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n8 is active
Oct 21 11:21:29 [9379] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n8 is online
Oct 21 11:21:29 [9379] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n6 is active
Oct 21 11:21:29 [9379] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n6 is online
Oct 21 11:21:29 [9379] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n7 is active
Oct 21 11:21:29 [9379] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n7 is online
Oct 21 11:21:29 [9379] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n8 to prmPing:0
Oct 21 11:21:29 [9379] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n6 to prmPing:1
Oct 21 11:21:29 [9379] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM1_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:29 [9379] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM1 active on bl460g1n6
Oct 21 11:21:29 [9379] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM2_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:29 [9379] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM2 active on bl460g1n6
Oct 21 11:21:29 [9379] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM3_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:29 [9379] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM3 active on bl460g1n6
Oct 21 11:21:29 [9379] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n7 to prmPing:2
Oct 21 11:21:29 [9379] bl460g1n6 crm_resource:    debug: dump_resource_attr: 	Looking up hv_memory in prmVM2
Oct 21 11:21:29 [9379] bl460g1n6 crm_resource:    debug: cib_native_signoff: 	Signing out of the CIB Service
Oct 21 11:21:29 [9379] bl460g1n6 crm_resource:    debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_ipcs_dispatch_connection_request: 	HUP conn (7680-9379-14)
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(7680-9379-14) state:2
Oct 21 11:21:29 [9379] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-request-7680-9379-14-header
Oct 21 11:21:29 [7680] bl460g1n6        cib:     info: crm_client_destroy: 	Destroying 0 events
Oct 21 11:21:29 [9379] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-response-7680-9379-14-header
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-7680-9379-14-header
Oct 21 11:21:29 [9379] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-event-7680-9379-14-header
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-7680-9379-14-header
Oct 21 11:21:29 [9379] bl460g1n6 crm_resource:     info: crm_xml_cleanup: 	Cleaning up memory from libxml2
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-7680-9379-14-header
Oct 21 11:21:29 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmVM2_monitor_10000:9252 - exited with rc=0
Oct 21 11:21:29 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmVM2_monitor_10000:9252:stderr [ -- empty -- ]
Oct 21 11:21:29 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmVM2_monitor_10000:9252:stdout [ -- empty -- ]
Oct 21 11:21:29 [7682] bl460g1n6       lrmd:    debug: log_finished: 	finished - rsc:prmVM2 action:monitor call_id:44 pid:9252 exit-code:0 exec-time:0ms queue-time:0ms
Oct 21 11:21:29 [7680] bl460g1n6        cib:     info: crm_client_new: 	Connecting 0xbde1f0 for uid=0 gid=0 pid=9400 id=e27c64a3-e2fc-4756-b874-db765916d0c4
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: handle_new_connection: 	IPC credentials authenticated (7680-9400-14)
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_ipcs_shm_connect: 	connecting to client [9400]
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:29 [9400] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:29 [9400] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:29 [9400] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:29 [9400] bl460g1n6 crm_resource:    debug: cib_native_signon_raw: 	Connection to CIB successful
Oct 21 11:21:29 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crm_resource/2, version=0.14.68)
Oct 21 11:21:29 [7680] bl460g1n6        cib:     info: crm_client_new: 	Connecting 0xc786d0 for uid=0 gid=0 pid=9402 id=7ddda0af-fe57-4cf5-bd68-8e72d9873240
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: handle_new_connection: 	IPC credentials authenticated (7680-9402-15)
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_ipcs_shm_connect: 	connecting to client [9402]
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:29 [9402] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:29 [9402] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:29 [9402] bl460g1n6 crm_resource:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:29 [9400] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH timeout: 60000
Oct 21 11:21:29 [9400] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH of failed nodes is enabled
Oct 21 11:21:29 [9400] bl460g1n6 crm_resource:    debug: unpack_config: 	Stop all active resources: false
Oct 21 11:21:29 [9400] bl460g1n6 crm_resource:    debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Oct 21 11:21:29 [9400] bl460g1n6 crm_resource:    debug: unpack_config: 	Default stickiness: 0
Oct 21 11:21:29 [9400] bl460g1n6 crm_resource:    debug: unpack_config: 	On loss of CCM Quorum: Freeze resources
Oct 21 11:21:29 [9400] bl460g1n6 crm_resource:    debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Oct 21 11:21:29 [9400] bl460g1n6 crm_resource:  warning: handle_startup_fencing: 	Blind faith: not fencing unseen nodes
Oct 21 11:21:29 [9400] bl460g1n6 crm_resource:    debug: unpack_domains: 	Unpacking domains
Oct 21 11:21:29 [9400] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n8 is active
Oct 21 11:21:29 [9402] bl460g1n6 crm_resource:    debug: cib_native_signon_raw: 	Connection to CIB successful
Oct 21 11:21:29 [9400] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n8 is online
Oct 21 11:21:29 [9400] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n6 is active
Oct 21 11:21:29 [9400] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n6 is online
Oct 21 11:21:29 [9400] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n7 is active
Oct 21 11:21:29 [9400] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n7 is online
Oct 21 11:21:29 [9400] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n8 to prmPing:0
Oct 21 11:21:29 [9400] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n6 to prmPing:1
Oct 21 11:21:29 [9400] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM1_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:29 [9400] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM1 active on bl460g1n6
Oct 21 11:21:29 [9400] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM2_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:29 [9400] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM2 active on bl460g1n6
Oct 21 11:21:29 [9400] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM3_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:29 [9400] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM3 active on bl460g1n6
Oct 21 11:21:29 [9400] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n7 to prmPing:2
Oct 21 11:21:29 [9400] bl460g1n6 crm_resource:    debug: dump_resource_attr: 	Looking up hv_memory in prmVM1
Oct 21 11:21:29 [9400] bl460g1n6 crm_resource:    debug: cib_native_signoff: 	Signing out of the CIB Service
Oct 21 11:21:29 [9400] bl460g1n6 crm_resource:    debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Oct 21 11:21:29 [9400] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-request-7680-9400-14-header
Oct 21 11:21:29 [9400] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-response-7680-9400-14-header
Oct 21 11:21:29 [9400] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-event-7680-9400-14-header
Oct 21 11:21:29 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crm_resource/2, version=0.14.68)
Oct 21 11:21:29 [9400] bl460g1n6 crm_resource:     info: crm_xml_cleanup: 	Cleaning up memory from libxml2
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_ipcs_dispatch_connection_request: 	HUP conn (7680-9400-14)
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(7680-9400-14) state:2
Oct 21 11:21:29 [7680] bl460g1n6        cib:     info: crm_client_destroy: 	Destroying 0 events
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-7680-9400-14-header
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-7680-9400-14-header
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-7680-9400-14-header
Oct 21 11:21:29 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmVM1_monitor_10000:9259 - exited with rc=0
Oct 21 11:21:29 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmVM1_monitor_10000:9259:stderr [ -- empty -- ]
Oct 21 11:21:29 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmVM1_monitor_10000:9259:stdout [ -- empty -- ]
Oct 21 11:21:29 [7682] bl460g1n6       lrmd:    debug: log_finished: 	finished - rsc:prmVM1 action:monitor call_id:43 pid:9259 exit-code:0 exec-time:0ms queue-time:0ms
Oct 21 11:21:29 [9402] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH timeout: 60000
Oct 21 11:21:29 [9402] bl460g1n6 crm_resource:    debug: unpack_config: 	STONITH of failed nodes is enabled
Oct 21 11:21:29 [9402] bl460g1n6 crm_resource:    debug: unpack_config: 	Stop all active resources: false
Oct 21 11:21:29 [9402] bl460g1n6 crm_resource:    debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Oct 21 11:21:29 [9402] bl460g1n6 crm_resource:    debug: unpack_config: 	Default stickiness: 0
Oct 21 11:21:29 [9402] bl460g1n6 crm_resource:    debug: unpack_config: 	On loss of CCM Quorum: Freeze resources
Oct 21 11:21:29 [9402] bl460g1n6 crm_resource:    debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Oct 21 11:21:29 [9402] bl460g1n6 crm_resource:  warning: handle_startup_fencing: 	Blind faith: not fencing unseen nodes
Oct 21 11:21:29 [9402] bl460g1n6 crm_resource:    debug: unpack_domains: 	Unpacking domains
Oct 21 11:21:29 [9402] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n8 is active
Oct 21 11:21:29 [9402] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n8 is online
Oct 21 11:21:29 [9402] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n6 is active
Oct 21 11:21:29 [9402] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n6 is online
Oct 21 11:21:29 [9402] bl460g1n6 crm_resource:     info: determine_online_status_fencing: 	Node bl460g1n7 is active
Oct 21 11:21:29 [9402] bl460g1n6 crm_resource:     info: determine_online_status: 	Node bl460g1n7 is online
Oct 21 11:21:29 [9402] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n8 to prmPing:0
Oct 21 11:21:29 [9402] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n6 to prmPing:1
Oct 21 11:21:29 [9402] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM1_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:29 [9402] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM1 active on bl460g1n6
Oct 21 11:21:29 [9402] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM2_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:29 [9402] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM2 active on bl460g1n6
Oct 21 11:21:29 [9402] bl460g1n6 crm_resource:    debug: determine_op_status: 	prmVM3_monitor_0 on bl460g1n6 returned 'ok' (0) instead of the expected value: 'not running' (7)
Oct 21 11:21:29 [9402] bl460g1n6 crm_resource:     info: determine_op_status: 	Operation monitor found resource prmVM3 active on bl460g1n6
Oct 21 11:21:29 [9402] bl460g1n6 crm_resource:    debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n7 to prmPing:2
Oct 21 11:21:29 [9402] bl460g1n6 crm_resource:    debug: dump_resource_attr: 	Looking up hv_memory in prmVM3
Oct 21 11:21:29 [9402] bl460g1n6 crm_resource:    debug: cib_native_signoff: 	Signing out of the CIB Service
Oct 21 11:21:29 [9402] bl460g1n6 crm_resource:    debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_ipcs_dispatch_connection_request: 	HUP conn (7680-9402-15)
Oct 21 11:21:29 [9402] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-request-7680-9402-15-header
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(7680-9402-15) state:2
Oct 21 11:21:29 [9402] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-response-7680-9402-15-header
Oct 21 11:21:29 [9402] bl460g1n6 crm_resource:    debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-event-7680-9402-15-header
Oct 21 11:21:29 [7680] bl460g1n6        cib:     info: crm_client_destroy: 	Destroying 0 events
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-7680-9402-15-header
Oct 21 11:21:29 [9402] bl460g1n6 crm_resource:     info: crm_xml_cleanup: 	Cleaning up memory from libxml2
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-7680-9402-15-header
Oct 21 11:21:29 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-7680-9402-15-header
Oct 21 11:21:29 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmVM3_monitor_10000:9262 - exited with rc=0
Oct 21 11:21:29 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmVM3_monitor_10000:9262:stderr [ -- empty -- ]
Oct 21 11:21:29 [7682] bl460g1n6       lrmd:    debug: operation_finished: 	prmVM3_monitor_10000:9262:stdout [ -- empty -- ]
Oct 21 11:21:29 [7682] bl460g1n6       lrmd:    debug: log_finished: 	finished - rsc:prmVM3 action:monitor call_id:45 pid:9262 exit-code:0 exec-time:0ms queue-time:0ms
Oct 21 11:21:30 [7680] bl460g1n6        cib:     info: crm_client_new: 	Connecting 0xbde1f0 for uid=0 gid=0 pid=9623 id=9eb0bd0b-b3fb-4e25-9ad0-6b269633c254
Oct 21 11:21:30 [7680] bl460g1n6        cib:    debug: handle_new_connection: 	IPC credentials authenticated (7680-9623-14)
Oct 21 11:21:30 [7680] bl460g1n6        cib:    debug: qb_ipcs_shm_connect: 	connecting to client [9623]
Oct 21 11:21:30 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:30 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:30 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:30 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crm_mon/2, version=0.14.68)
Oct 21 11:21:30 [7680] bl460g1n6        cib:    debug: qb_ipcs_dispatch_connection_request: 	HUP conn (7680-9623-14)
Oct 21 11:21:30 [7680] bl460g1n6        cib:    debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(7680-9623-14) state:2
Oct 21 11:21:30 [7680] bl460g1n6        cib:     info: crm_client_destroy: 	Destroying 0 events
Oct 21 11:21:30 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_ro-response-7680-9623-14-header
Oct 21 11:21:30 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_ro-event-7680-9623-14-header
Oct 21 11:21:30 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_ro-request-7680-9623-14-header
Oct 21 11:21:30 [7680] bl460g1n6        cib:     info: crm_client_new: 	Connecting 0xbde1f0 for uid=0 gid=0 pid=9625 id=d38f7941-ec36-42e6-955f-62091d695378
Oct 21 11:21:30 [7680] bl460g1n6        cib:    debug: handle_new_connection: 	IPC credentials authenticated (7680-9625-14)
Oct 21 11:21:30 [7680] bl460g1n6        cib:    debug: qb_ipcs_shm_connect: 	connecting to client [9625]
Oct 21 11:21:30 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:30 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:30 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:30 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/cibadmin/2, version=0.14.68)
Oct 21 11:21:30 [7680] bl460g1n6        cib:    debug: qb_ipcs_dispatch_connection_request: 	HUP conn (7680-9625-14)
Oct 21 11:21:30 [7680] bl460g1n6        cib:    debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(7680-9625-14) state:2
Oct 21 11:21:30 [7680] bl460g1n6        cib:     info: crm_client_destroy: 	Destroying 0 events
Oct 21 11:21:30 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-7680-9625-14-header
Oct 21 11:21:30 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-7680-9625-14-header
Oct 21 11:21:30 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-7680-9625-14-header
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [9700]
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7fb1490da8f0
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (7666-9700-34)
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(7666-9700-34) state:2
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7fb1490da8f0
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-7666-9700-34-header
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-7666-9700-34-header
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-7666-9700-34-header
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [9702]
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7fb1490da8f0
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_write_to_file:808  writing total of: 8392724
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (7666-9702-34)
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(7666-9702-34) state:2
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7fb1490da8f0
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-7666-9702-34-header
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-7666-9702-34-header
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-7666-9702-34-header
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [9708]
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7fb1490da8f0
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (7666-9708-34)
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(7666-9708-34) state:2
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7fb1490da8f0
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-7666-9708-34-header
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-7666-9708-34-header
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-7666-9708-34-header
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [9709]
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7fb1490da8f0
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [9709]
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QUORUM] vsf_quorum.c:quorum_lib_init_fn:316 lib_init_fn: conn=0x7fb1490d70c0
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QUORUM] vsf_quorum.c:message_handler_req_lib_quorum_gettype:471 got quorum_type request on 0x7fb1490d70c0
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [9709]
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [9709]
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QUORUM] vsf_quorum.c:message_handler_req_lib_quorum_getquorate:395 got quorate request on 0x7fb1490d70c0
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QUORUM] vsf_quorum.c:message_handler_req_lib_quorum_trackstart:412 got trackstart request on 0x7fb1490d70c0
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QUORUM] vsf_quorum.c:message_handler_req_lib_quorum_trackstart:420 sending initial status to 0x7fb1490d70c0
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QUORUM] vsf_quorum.c:send_library_notification:359 sending quorum notification to 0x7fb1490d70c0, length = 60
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QUORUM] vsf_quorum.c:message_handler_req_lib_quorum_trackstop:448 got trackstop request on 0x7fb1490d70c0
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:message_handler_req_lib_votequorum_getinfo:2041 got getinfo request on 0x7fb1494dc2f0 for node 3232261592
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:message_handler_req_lib_votequorum_getinfo:2133 getinfo response error: 1
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:message_handler_req_lib_votequorum_getinfo:2041 got getinfo request on 0x7fb1494dc2f0 for node 3232261592
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:message_handler_req_lib_votequorum_getinfo:2133 getinfo response error: 1
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:message_handler_req_lib_votequorum_getinfo:2041 got getinfo request on 0x7fb1494dc2f0 for node 3232261593
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:message_handler_req_lib_votequorum_getinfo:2133 getinfo response error: 1
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:message_handler_req_lib_votequorum_getinfo:2041 got getinfo request on 0x7fb1494dc2f0 for node 3232261594
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:message_handler_req_lib_votequorum_getinfo:2133 getinfo response error: 1
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (7666-9709-34)
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(7666-9709-34) state:2
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7fb1490da8f0
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-7666-9709-34-header
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-7666-9709-34-header
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-7666-9709-34-header
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (7666-9709-35)
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(7666-9709-35) state:2
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QUORUM] vsf_quorum.c:quorum_lib_exit_fn:328 lib_exit_fn: conn=0x7fb1490d70c0
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-quorum-response-7666-9709-35-header
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-quorum-event-7666-9709-35-header
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-quorum-request-7666-9709-35-header
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (7666-9709-36)
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(7666-9709-36) state:2
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cfg-response-7666-9709-36-header
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cfg-event-7666-9709-36-header
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cfg-request-7666-9709-36-header
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (7666-9709-37)
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(7666-9709-37) state:2
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-votequorum-response-7666-9709-37-header
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-votequorum-event-7666-9709-37-header
Oct 21 11:21:31 [7663] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-votequorum-request-7666-9709-37-header
Oct 21 11:21:31 [7680] bl460g1n6        cib:     info: crm_client_new: 	Connecting 0xbde1f0 for uid=0 gid=0 pid=9711 id=a08a479f-faca-4878-9238-cacd5fa9bfa5
Oct 21 11:21:31 [7680] bl460g1n6        cib:    debug: handle_new_connection: 	IPC credentials authenticated (7680-9711-14)
Oct 21 11:21:31 [7680] bl460g1n6        cib:    debug: qb_ipcs_shm_connect: 	connecting to client [9711]
Oct 21 11:21:31 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:31 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:31 [7680] bl460g1n6        cib:    debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Oct 21 11:21:31 [7680] bl460g1n6        cib:     info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crm_mon/2, version=0.14.68)
Oct 21 11:21:31 [7680] bl460g1n6        cib:    debug: qb_ipcs_dispatch_connection_request: 	HUP conn (7680-9711-14)
Oct 21 11:21:31 [7680] bl460g1n6        cib:    debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(7680-9711-14) state:2
Oct 21 11:21:31 [7680] bl460g1n6        cib:     info: crm_client_destroy: 	Destroying 0 events
Oct 21 11:21:31 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_ro-response-7680-9711-14-header
Oct 21 11:21:31 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_ro-event-7680-9711-14-header
Oct 21 11:21:31 [7680] bl460g1n6        cib:    debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_ro-request-7680-9711-14-header
