Jan 15 15:38:05 [30773] bl460g1n6 corosync notice  [MAIN  ] main.c:main:1176 Corosync Cluster Engine ('2.3.3'): started and ready to provide service.
Jan 15 15:38:05 [30773] bl460g1n6 corosync info    [MAIN  ] main.c:main:1177 Corosync built-in features: watchdog upstart snmp pie relro bindnow
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totempg.c:totempg_waiting_trans_ack_cb:285 waiting_trans_ack changed to 1
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:totemsrp_initialize:901 Token Timeout (1000 ms) retransmit timeout (238 ms)
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:totemsrp_initialize:904 token hold (180 ms) retransmits before loss (4 retrans)
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:totemsrp_initialize:911 join (50 ms) send_join (0 ms) consensus (1200 ms) merge (200 ms)
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:totemsrp_initialize:914 downcheck (1000 ms) fail to recv const (2500 msgs)
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:totemsrp_initialize:916 seqno unchanged const (30 rotations) Maximum network MTU 1401
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:totemsrp_initialize:920 window size per rotation (50 messages) maximum messages per rotation (17 messages)
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:totemsrp_initialize:924 missed count const (5 messages)
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:totemsrp_initialize:927 send threads (0 threads)
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:totemsrp_initialize:930 RRP token expired timeout (238 ms)
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:totemsrp_initialize:933 RRP token problem counter (10000 ms)
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:totemsrp_initialize:936 RRP threshold (10 problem count)
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:totemsrp_initialize:939 RRP multicast threshold (100 problem count)
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:totemsrp_initialize:942 RRP automatic recovery check timeout (1000 ms)
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:totemsrp_initialize:944 RRP mode set to active.
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:totemsrp_initialize:947 heartbeat_failures_allowed (0)
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:totemsrp_initialize:949 max_network_delay (50 ms)
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:totemsrp_initialize:972 HeartBeat is Disabled. To enable set heartbeat_failures_allowed > 0
Jan 15 15:38:05 [30773] bl460g1n6 corosync notice  [TOTEM ] totemnet.c:totemnet_instance_initialize:242 Initializing transport (UDP/IP Multicast).
Jan 15 15:38:05 [30773] bl460g1n6 corosync notice  [TOTEM ] totemcrypto.c:init_nss:579 Initializing transmit/receive security (NSS) crypto: aes256 hash: sha1
Jan 15 15:38:05 [30773] bl460g1n6 corosync notice  [TOTEM ] totemnet.c:totemnet_instance_initialize:242 Initializing transport (UDP/IP Multicast).
Jan 15 15:38:05 [30773] bl460g1n6 corosync notice  [TOTEM ] totemcrypto.c:init_nss:579 Initializing transmit/receive security (NSS) crypto: aes256 hash: sha1
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemudp.c:totemudp_build_sockets_ip:905 Receive multicast socket recv buffer size (320000 bytes).
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemudp.c:totemudp_build_sockets_ip:911 Transmit multicast socket send buffer size (320000 bytes).
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemudp.c:totemudp_build_sockets_ip:917 Local receive multicast loop socket recv buffer size (320000 bytes).
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemudp.c:totemudp_build_sockets_ip:923 Local transmit multicast loop socket send buffer size (320000 bytes).
Jan 15 15:38:05 [30773] bl460g1n6 corosync notice  [TOTEM ] totemudp.c:timer_function_netif_check_timeout:670 The network interface [192.168.101.216] is now up.
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:main_iface_change_fn:4669 Created or loaded sequence id 0.192.168.101.216 for this ring.
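
The totem parameter dump above (token 1000 ms, consensus 1200 ms, join 50 ms, 4 retransmits before loss, RRP mode active, UDP multicast transport, aes256/sha1 crypto, two rings on 192.168.101.216 and 192.168.102.216) maps directly onto the totem section of corosync.conf. A minimal sketch that would yield these values is shown below; it is reconstructed from the log, not copied from the node, and since token/consensus/join/retransmit match the corosync 2.x defaults the real file may simply omit them. The bindnetaddr and mcastaddr values are illustrative placeholders.

    totem {
        version: 2
        token: 1000
        token_retransmits_before_loss_const: 4
        consensus: 1200
        join: 50
        rrp_mode: active
        transport: udp
        crypto_cipher: aes256
        crypto_hash: sha1
        interface {
            ringnumber: 0
            bindnetaddr: 192.168.101.0    # placeholder network for ring 0
            mcastaddr: 239.255.1.1        # placeholder, not shown in the log
            mcastport: 5405
        }
        interface {
            ringnumber: 1
            bindnetaddr: 192.168.102.0    # placeholder network for ring 1
            mcastaddr: 239.255.2.1        # placeholder, not shown in the log
            mcastport: 5405
        }
    }
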
Jan 15 15:38:05 [30773] bl460g1n6 corosync notice  [SERV  ] service.c:corosync_service_link_and_init:174 Service engine loaded: corosync configuration map access [0]
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_service_init:865 Initializing IPC on cmap [0]
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_get_ipc_type:811 No configured qb.ipc_type. Using native ipc
Jan 15 15:38:05 [30773] bl460g1n6 corosync info    [QB    ] ipc_setup.c:qb_ipcs_us_publish:377 server name: cmap
Jan 15 15:38:05 [30773] bl460g1n6 corosync notice  [SERV  ] service.c:corosync_service_link_and_init:174 Service engine loaded: corosync configuration service [1]
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_service_init:865 Initializing IPC on cfg [1]
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_get_ipc_type:811 No configured qb.ipc_type. Using native ipc
Jan 15 15:38:05 [30773] bl460g1n6 corosync info    [QB    ] ipc_setup.c:qb_ipcs_us_publish:377 server name: cfg
Jan 15 15:38:05 [30773] bl460g1n6 corosync notice  [SERV  ] service.c:corosync_service_link_and_init:174 Service engine loaded: corosync cluster closed process group service v1.01 [2]
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_service_init:865 Initializing IPC on cpg [2]
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_get_ipc_type:811 No configured qb.ipc_type. Using native ipc
Jan 15 15:38:05 [30773] bl460g1n6 corosync info    [QB    ] ipc_setup.c:qb_ipcs_us_publish:377 server name: cpg
Jan 15 15:38:05 [30773] bl460g1n6 corosync notice  [SERV  ] service.c:corosync_service_link_and_init:174 Service engine loaded: corosync profile loading service [4]
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_service_init:851 NOT Initializing IPC on pload [4]
Jan 15 15:38:05 [30773] bl460g1n6 corosync info    [WD    ] wd.c:setup_watchdog:651 Watchdog is now been tickled by corosync.
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [WD    ] wd.c:setup_watchdog:652 Software Watchdog
Jan 15 15:38:05 [30773] bl460g1n6 corosync info    [WD    ] wd.c:wd_scan_resources:580 no resources configured.
Jan 15 15:38:05 [30773] bl460g1n6 corosync notice  [SERV  ] service.c:corosync_service_link_and_init:174 Service engine loaded: corosync watchdog service [7]
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_service_init:851 NOT Initializing IPC on wd [7]
Jan 15 15:38:05 [30773] bl460g1n6 corosync notice  [QUORUM] vsf_quorum.c:quorum_exec_init_fn:274 Using quorum provider corosync_votequorum
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:votequorum_readconfig:1046 Reading configuration (runtime: 0)
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:votequorum_read_nodelist_configuration:965 No nodelist defined or our node is not in the nodelist
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:votequorum_readconfig:1241 ev_tracking=0, ev_tracking_barrier = 0: expected_votes = 2
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:recalculate_quorum:925 total_votes=1, expected_votes=2
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:calculate_quorum:744 node 3232261592 state=1, votes=1, expected=2
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:decode_flags:594 flags: quorate: No Leaving: No WFA Status: No First: Yes Qdevice: No QdeviceAlive: No QdeviceCastVote: No QdeviceMasterWins: No
Jan 15 15:38:05 [30773] bl460g1n6 corosync notice  [SERV  ] service.c:corosync_service_link_and_init:174 Service engine loaded: corosync vote quorum service v1.0 [5]
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_service_init:865 Initializing IPC on votequorum [5]
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_get_ipc_type:811 No configured qb.ipc_type. Using native ipc
Jan 15 15:38:05 [30773] bl460g1n6 corosync info    [QB    ] ipc_setup.c:qb_ipcs_us_publish:377 server name: votequorum
Jan 15 15:38:05 [30773] bl460g1n6 corosync notice  [SERV  ] service.c:corosync_service_link_and_init:174 Service engine loaded: corosync cluster quorum service v0.1 [3]
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_service_init:865 Initializing IPC on quorum [3]
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_get_ipc_type:811 No configured qb.ipc_type. Using native ipc
Jan 15 15:38:05 [30773] bl460g1n6 corosync info    [QB    ] ipc_setup.c:qb_ipcs_us_publish:377 server name: quorum
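
The votequorum lines above report expected_votes = 2, total_votes = 1 and "No nodelist defined or our node is not in the nodelist", i.e. quorum here is driven by an explicit expected_votes setting rather than a node list. A corresponding quorum section would look roughly like the sketch below; this is an assumption reconstructed from the log, not the node's actual file, and two_node / wait_for_all are apparently not set since the WFA flag decodes to "No".

    quorum {
        provider: corosync_votequorum
        expected_votes: 2
    }
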
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemudp.c:totemudp_build_sockets_ip:905 Receive multicast socket recv buffer size (320000 bytes).
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemudp.c:totemudp_build_sockets_ip:911 Transmit multicast socket send buffer size (320000 bytes).
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemudp.c:totemudp_build_sockets_ip:917 Local receive multicast loop socket recv buffer size (320000 bytes).
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemudp.c:totemudp_build_sockets_ip:923 Local transmit multicast loop socket send buffer size (320000 bytes).
Jan 15 15:38:05 [30773] bl460g1n6 corosync notice  [TOTEM ] totemudp.c:timer_function_netif_check_timeout:670 The network interface [192.168.102.216] is now up.
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:memb_state_gather_enter:2087 entering GATHER state from 15(interface change).
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:memb_state_commit_token_create:3138 Creating commit token because I am the rep.
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:old_ring_state_save:1550 Saving state aru 0 high seq received 0
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:memb_ring_id_set_and_store:3383 Storing new sequence id for ring 4
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:memb_state_commit_enter:2135 entering COMMIT state.
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:message_handler_memb_commit_token:4516 got commit token
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:memb_state_recovery_enter:2172 entering RECOVERY state.
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:memb_state_recovery_enter:2218 position [0] member 192.168.101.216:
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:memb_state_recovery_enter:2222 previous ring seq 0 rep 192.168.101.216
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:memb_state_recovery_enter:2228 aru 0 high delivered 0 received flag 1
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:memb_state_recovery_enter:2326 Did not need to originate any messages in recovery.
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:message_handler_memb_commit_token:4516 got commit token
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:message_handler_memb_commit_token:4569 Sending initial ORF token
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:message_handler_memb_commit_token:4516 got commit token
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:message_handler_memb_commit_token:4569 Sending initial ORF token
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:message_handler_memb_commit_token:4516 got commit token
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:message_handler_memb_commit_token:4569 Sending initial ORF token
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:message_handler_memb_commit_token:4516 got commit token
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:message_handler_memb_commit_token:4569 Sending initial ORF token
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:message_handler_memb_commit_token:4516 got commit token
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:message_handler_memb_commit_token:4569 Sending initial ORF token
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:message_handler_orf_token:3799 token retrans flag is 0 my set retrans flag0 retrans queue empty 1 count 0, aru 0
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:message_handler_orf_token:3810 install seq 0 aru 0 high seq received 0
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:message_handler_orf_token:3799 token retrans flag is 0 my set retrans flag0 retrans queue empty 1 count 1, aru 0
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:message_handler_orf_token:3810 install seq 0 aru 0 high seq received 0
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:message_handler_orf_token:3799 token retrans flag is 0 my set retrans flag0 retrans queue empty 1 count 2, aru 0
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:message_handler_orf_token:3810 install seq 0 aru 0 high seq received 0
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:message_handler_orf_token:3799 token retrans flag is 0 my set retrans flag0 retrans queue empty 1 count 3, aru 0
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:message_handler_orf_token:3810 install seq 0 aru 0 high seq received 0
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:message_handler_orf_token:3829 retrans flag count 4 token aru 0 install seq 0 aru 0 0
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:old_ring_state_reset:1566 Resetting old ring state
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:deliver_messages_from_recovery_to_regular:1772 recovery to regular 1-0
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totempg.c:totempg_waiting_trans_ack_cb:285 waiting_trans_ack changed to 1
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [MAIN  ] main.c:member_object_joined:336 Member joined: r(0) ip(192.168.101.216) r(1) ip(192.168.102.216) 
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:memb_state_operational_enter:2010 entering OPERATIONAL state.
Jan 15 15:38:05 [30773] bl460g1n6 corosync notice  [TOTEM ] totemsrp.c:memb_state_operational_enter:2016 A new membership (192.168.101.216:4) was formed. Members joined: -1062705704
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:message_handler_req_exec_votequorum_nodeinfo:1703 got nodeinfo message from cluster node 3232261592
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:message_handler_req_exec_votequorum_nodeinfo:1708 nodeinfo message[3232261592]: votes: 1, expected: 2 flags: 8
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:decode_flags:594 flags: quorate: No Leaving: No WFA Status: No First: Yes Qdevice: No QdeviceAlive: No QdeviceCastVote: No QdeviceMasterWins: No
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:recalculate_quorum:925 total_votes=1, expected_votes=2
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:calculate_quorum:744 node 3232261592 state=1, votes=1, expected=2
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [SYNC  ] sync.c:sync_barrier_handler:232 Committing synchronization for corosync configuration map access
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_sync_activate:386 Single node sync -> no action
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [CPG   ] cpg.c:downlist_log:776 comparing: sender r(0) ip(192.168.101.216) r(1) ip(192.168.102.216) ; members(old:0 left:0)
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [CPG   ] cpg.c:downlist_log:776 chosen downlist: sender r(0) ip(192.168.101.216) r(1) ip(192.168.102.216) ; members(old:0 left:0)
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [SYNC  ] sync.c:sync_barrier_handler:232 Committing synchronization for corosync cluster closed process group service v1.01
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:decode_flags:594 flags: quorate: No Leaving: No WFA Status: No First: Yes Qdevice: No QdeviceAlive: No QdeviceCastVote: No QdeviceMasterWins: No
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:message_handler_req_exec_votequorum_nodeinfo:1703 got nodeinfo message from cluster node 3232261592
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:message_handler_req_exec_votequorum_nodeinfo:1708 nodeinfo message[3232261592]: votes: 1, expected: 2 flags: 8
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:decode_flags:594 flags: quorate: No Leaving: No WFA Status: No First: Yes Qdevice: No QdeviceAlive: No QdeviceCastVote: No QdeviceMasterWins: No
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:recalculate_quorum:925 total_votes=1, expected_votes=2
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:calculate_quorum:744 node 3232261592 state=1, votes=1, expected=2
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:message_handler_req_exec_votequorum_nodeinfo:1703 got nodeinfo message from cluster node 3232261592
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:message_handler_req_exec_votequorum_nodeinfo:1708 nodeinfo message[0]: votes: 0, expected: 0 flags: 0
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [SYNC  ] sync.c:sync_barrier_handler:232 Committing synchronization for corosync vote quorum service v1.0
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:recalculate_quorum:925 total_votes=1, expected_votes=2
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:calculate_quorum:744 node 3232261592 state=1, votes=1, expected=2
Jan 15 15:38:05 [30773] bl460g1n6 corosync notice  [QUORUM] vsf_quorum.c:log_view_list:132 Members[1]: -1062705704
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [QUORUM] vsf_quorum.c:send_library_notification:359 sending quorum notification to (nil), length = 52
Jan 15 15:38:05 [30773] bl460g1n6 corosync notice  [MAIN  ] main.c:corosync_sync_completed:279 Completed service synchronization, ready to provide service.
Jan 15 15:38:05 [30773] bl460g1n6 corosync debug   [TOTEM ] totempg.c:totempg_waiting_trans_ack_cb:285 waiting_trans_ack changed to 0
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [QB    ] ipc_setup.c:handle_new_connection:484 IPC credentials authenticated (30775-30782-25)
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [30782]
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (30775-30782-25)
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(30775-30782-25) state:2
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cfg-response-30775-30782-25-header
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cfg-event-30775-30782-25-header
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cfg-request-30775-30782-25-header
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:memb_state_gather_enter:2087 entering GATHER state from 11(merge during join).
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:memb_state_commit_token_create:3138 Creating commit token because I am the rep.
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:old_ring_state_save:1550 Saving state aru 6 high seq received 6
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:memb_ring_id_set_and_store:3383 Storing new sequence id for ring 8
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:memb_state_commit_enter:2135 entering COMMIT state.
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:message_handler_memb_commit_token:4516 got commit token
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:message_handler_memb_commit_token:4516 got commit token
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:memb_state_recovery_enter:2172 entering RECOVERY state.
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:memb_state_recovery_enter:2214 TRANS [0] member 192.168.101.216:
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:memb_state_recovery_enter:2218 position [0] member 192.168.101.216:
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:memb_state_recovery_enter:2222 previous ring seq 4 rep 192.168.101.216
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:memb_state_recovery_enter:2228 aru 6 high delivered 6 received flag 1
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:memb_state_recovery_enter:2218 position [1] member 192.168.101.217:
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:memb_state_recovery_enter:2222 previous ring seq 4 rep 192.168.101.217
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:memb_state_recovery_enter:2228 aru 6 high delivered 6 received flag 1
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:memb_state_recovery_enter:2326 Did not need to originate any messages in recovery.
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:message_handler_memb_commit_token:4516 got commit token
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:message_handler_memb_commit_token:4569 Sending initial ORF token
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:message_handler_memb_commit_token:4516 got commit token
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:message_handler_memb_commit_token:4569 Sending initial ORF token
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:message_handler_memb_commit_token:4516 got commit token
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:message_handler_memb_commit_token:4569 Sending initial ORF token
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:message_handler_memb_commit_token:4516 got commit token
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:message_handler_memb_commit_token:4569 Sending initial ORF token
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:message_handler_orf_token:3799 token retrans flag is 0 my set retrans flag0 retrans queue empty 1 count 0, aru 0
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:message_handler_orf_token:3810 install seq 0 aru 0 high seq received 0
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:message_handler_orf_token:3799 token retrans flag is 0 my set retrans flag0 retrans queue empty 1 count 1, aru 0
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:message_handler_orf_token:3810 install seq 0 aru 0 high seq received 0
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:message_handler_orf_token:3799 token retrans flag is 0 my set retrans flag0 retrans queue empty 1 count 2, aru 0
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:message_handler_orf_token:3810 install seq 0 aru 0 high seq received 0
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:message_handler_orf_token:3799 token retrans flag is 0 my set retrans flag0 retrans queue empty 1 count 3, aru 0
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:message_handler_orf_token:3810 install seq 0 aru 0 high seq received 0
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:message_handler_orf_token:3829 retrans flag count 4 token aru 0 install seq 0 aru 0 0
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:old_ring_state_reset:1566 Resetting old ring state
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:deliver_messages_from_recovery_to_regular:1772 recovery to regular 1-0
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [TOTEM ] totempg.c:totempg_waiting_trans_ack_cb:285 waiting_trans_ack changed to 1
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [MAIN  ] main.c:member_object_joined:336 Member joined: r(0) ip(192.168.101.217) r(1) ip(192.168.102.217) 
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [TOTEM ] totemsrp.c:memb_state_operational_enter:2010 entering OPERATIONAL state.
Jan 15 15:38:06 [30773] bl460g1n6 corosync notice  [TOTEM ] totemsrp.c:memb_state_operational_enter:2016 A new membership (192.168.101.216:8) was formed. Members joined: -1062705703
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [SYNC  ] sync.c:sync_barrier_handler:232 Committing synchronization for corosync configuration map access
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_sync_activate:400 My config version is 0 -> no action
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [CPG   ] cpg.c:downlist_log:776 comparing: sender r(0) ip(192.168.101.216) r(1) ip(192.168.102.216) ; members(old:1 left:0)
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [CPG   ] cpg.c:downlist_log:776 comparing: sender r(0) ip(192.168.101.217) r(1) ip(192.168.102.217) ; members(old:1 left:0)
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [CPG   ] cpg.c:downlist_log:776 chosen downlist: sender r(0) ip(192.168.101.216) r(1) ip(192.168.102.216) ; members(old:1 left:0)
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [SYNC  ] sync.c:sync_barrier_handler:232 Committing synchronization for corosync cluster closed process group service v1.01
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:decode_flags:594 flags: quorate: No Leaving: No WFA Status: No First: No Qdevice: No QdeviceAlive: No QdeviceCastVote: No QdeviceMasterWins: No
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:message_handler_req_exec_votequorum_nodeinfo:1703 got nodeinfo message from cluster node 3232261592
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:message_handler_req_exec_votequorum_nodeinfo:1708 nodeinfo message[3232261592]: votes: 1, expected: 2 flags: 0
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:decode_flags:594 flags: quorate: No Leaving: No WFA Status: No First: No Qdevice: No QdeviceAlive: No QdeviceCastVote: No QdeviceMasterWins: No
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:recalculate_quorum:925 total_votes=1, expected_votes=2
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:calculate_quorum:744 node 3232261592 state=1, votes=1, expected=2
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:message_handler_req_exec_votequorum_nodeinfo:1703 got nodeinfo message from cluster node 3232261592
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:message_handler_req_exec_votequorum_nodeinfo:1708 nodeinfo message[0]: votes: 0, expected: 0 flags: 0
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:message_handler_req_exec_votequorum_nodeinfo:1703 got nodeinfo message from cluster node 3232261593
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:message_handler_req_exec_votequorum_nodeinfo:1708 nodeinfo message[3232261593]: votes: 1, expected: 2 flags: 0
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:decode_flags:594 flags: quorate: No Leaving: No WFA Status: No First: No Qdevice: No QdeviceAlive: No QdeviceCastVote: No QdeviceMasterWins: No
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:recalculate_quorum:925 total_votes=2, expected_votes=2
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:calculate_quorum:744 node 3232261592 state=1, votes=1, expected=2
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:calculate_quorum:744 node 3232261593 state=1, votes=1, expected=2
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:get_lowest_node_id:534 lowest node id: -1062705704 us: -1062705704
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:are_we_quorate:851 quorum regained, resuming activity
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:message_handler_req_exec_votequorum_nodeinfo:1703 got nodeinfo message from cluster node 3232261593
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:message_handler_req_exec_votequorum_nodeinfo:1708 nodeinfo message[0]: votes: 0, expected: 0 flags: 0
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [SYNC  ] sync.c:sync_barrier_handler:232 Committing synchronization for corosync vote quorum service v1.0
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:recalculate_quorum:925 total_votes=2, expected_votes=2
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:calculate_quorum:744 node 3232261592 state=1, votes=1, expected=2
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:calculate_quorum:744 node 3232261593 state=1, votes=1, expected=2
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:get_lowest_node_id:534 lowest node id: -1062705704 us: -1062705704
Jan 15 15:38:06 [30773] bl460g1n6 corosync notice  [QUORUM] vsf_quorum.c:quorum_api_set_quorum:148 This node is within the primary component and will provide service.
Jan 15 15:38:06 [30773] bl460g1n6 corosync notice  [QUORUM] vsf_quorum.c:log_view_list:132 Members[2]: -1062705704 -1062705703
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [QUORUM] vsf_quorum.c:send_library_notification:359 sending quorum notification to (nil), length = 56
Jan 15 15:38:06 [30773] bl460g1n6 corosync notice  [MAIN  ] main.c:corosync_sync_completed:279 Completed service synchronization, ready to provide service.
Jan 15 15:38:06 [30773] bl460g1n6 corosync debug   [TOTEM ] totempg.c:totempg_waiting_trans_ack_cb:285 waiting_trans_ack changed to 0
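
Note that each node appears under two different-looking IDs in this log: the first node as -1062705704 in the membership and quorum lines but as 3232261592 in the votequorum lines, and the second node as -1062705703 / 3232261593 (pacemakerd below likewise logs "Our nodeid: -1062705704" and "Local nodeid is 3232261592" for the same node). These are the same 32-bit value: with no nodelist configured, corosync derives the node ID from the ring 0 address (192.168.101.216), and some messages print it as a signed integer while others print it unsigned. A small Python check of the arithmetic (illustrative only, not part of corosync):

    import ipaddress
    import struct

    # 192.168.101.216 interpreted as a 32-bit unsigned integer
    nodeid_unsigned = int(ipaddress.ip_address("192.168.101.216"))            # 3232261592
    # the same bit pattern read back as a signed 32-bit integer
    nodeid_signed = struct.unpack("i", struct.pack("I", nodeid_unsigned))[0]  # -1062705704
    print(nodeid_unsigned, nodeid_signed)
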
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (   logging.c:775   )    info: crm_log_init: 	Changed active directory to /var/lib/heartbeat/cores/root
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: ( pacemaker.c:900   )   debug: main: 	Checking for old instances of pacemakerd
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (       ipc.c:780   )    info: crm_ipc_connect: 	Could not establish pacemakerd connection: Connection refused (111)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipc_setup.c:handle_new_connection:484 IPC credentials authenticated (30775-30786-25)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [30786]
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7f438535ab50
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (   cluster.c:526   )   debug: get_cluster_type: 	Testing with Corosync
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipc_setup.c:handle_new_connection:484 IPC credentials authenticated (30775-30786-26)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [30786]
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7f438535c020
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (      ipcc.c:378   )   debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (30775-30786-26)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(30775-30786-26) state:2
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7f438535c020
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-request-30775-30786-26-header
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-30775-30786-26-header
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-response-30775-30786-26-header
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-event-30775-30786-26-header
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (   cluster.c:573   )    info: get_cluster_type: 	Detected an active 'corosync' cluster
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (  corosync.c:326   )    info: mcp_read_config: 	Reading configure for stack: corosync
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-30775-30786-26-header
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-30775-30786-26-header
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (  corosync.c:426   )  notice: mcp_read_config: 	Configured corosync to accept connections from group 189: OK (1)
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (      ipcc.c:378   )   debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (30775-30786-25)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(30775-30786-25) state:2
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7f438535ab50
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-30775-30786-25-header
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-request-30775-30786-25-header
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-response-30775-30786-25-header
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-event-30775-30786-25-header
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (   logging.c:314   )  notice: crm_add_logfile: 	Additional logging available in /var/log/ha-debug
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-30775-30786-25-header
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: ( pacemaker.c:935   )  notice: main: 	Starting Pacemaker 1.1.11-0.27.b48276b.git.el6 (Build: b48276b):  generated-manpages agent-manpages ascii-docs ncurses libqb-logging libqb-ipc lha-fencing nagios  corosync-native snmp
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: ( pacemaker.c:935   )  notice: main: 	Starting Pacemaker 1.1.11-0.27.b48276b.git.el6 (Build: b48276b):  generated-manpages agent-manpages ascii-docs ncurses libqb-logging libqb-ipc lha-fencing nagios  corosync-native snmp
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: ( pacemaker.c:945   )    info: main: 	Maximum core file size is: 18446744073709551615
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: ( pacemaker.c:945   )    info: main: 	Maximum core file size is: 18446744073709551615
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-30775-30786-25-header
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: ( ipc_setup.c:377   )    info: qb_ipcs_us_publish: 	server name: pacemakerd
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: ( ipc_setup.c:377   )    info: qb_ipcs_us_publish: 	server name: pacemakerd
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipc_setup.c:handle_new_connection:484 IPC credentials authenticated (30775-30786-25)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [30786]
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (  corosync.c:142   )   debug: cluster_connect_cfg: 	Our nodeid: -1062705704
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (  corosync.c:142   )   debug: cluster_connect_cfg: 	Our nodeid: -1062705704
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipc_setup.c:handle_new_connection:484 IPC credentials authenticated (30775-30786-26)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [30786]
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [CPG   ] cpg.c:cpg_lib_init_fn:1459 lib_init_fn: conn=0x7f438535b360, cpd=0x7f438535bb64
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (       cpg.c:110   )   debug: get_local_nodeid: 	Local nodeid is 3232261592
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (       cpg.c:110   )   debug: get_local_nodeid: 	Local nodeid is 3232261592
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (membership.c:400   )    info: crm_get_peer: 	Created entry 1e0d5776-8f4a-4d29-841a-8c424c088ba9/0xf94360 for node (null)/3232261592 (1 total)
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (membership.c:400   )    info: crm_get_peer: 	Created entry 1e0d5776-8f4a-4d29-841a-8c424c088ba9/0xf94360 for node (null)/3232261592 (1 total)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipc_setup.c:handle_new_connection:484 IPC credentials authenticated (30775-30786-27)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [30786]
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7f4385660ed0
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [CPG   ] cpg.c:message_handler_req_exec_cpg_procjoin:1260 got procjoin message from cluster node -1062705704 (r(0) ip(192.168.101.216) r(1) ip(192.168.102.216) ) for pid 30786
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (      ipcc.c:378   )   debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (      ipcc.c:378   )   debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-request-30775-30786-27-header
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-request-30775-30786-27-header
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (30775-30786-27)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(30775-30786-27) state:2
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7f4385660ed0
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-30775-30786-27-header
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-response-30775-30786-27-header
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-response-30775-30786-27-header
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-event-30775-30786-27-header
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-event-30775-30786-27-header
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (  corosync.c:134   )  notice: corosync_node_name: 	Unable to get node name for nodeid 3232261592
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (  corosync.c:134   )  notice: corosync_node_name: 	Unable to get node name for nodeid 3232261592
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (   cluster.c:350   )  notice: get_node_name: 	Could not obtain a node name for corosync nodeid 3232261592
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (   cluster.c:350   )  notice: get_node_name: 	Could not obtain a node name for corosync nodeid 3232261592
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (membership.c:439   )    info: crm_get_peer: 	Node 3232261592 has uuid 3232261592
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-30775-30786-27-header
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (membership.c:439   )    info: crm_get_peer: 	Node 3232261592 has uuid 3232261592
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (membership.c:558   )    info: crm_update_peer_proc: 	cluster_connect_cpg: Node (null)[3232261592] - corosync-cpg is now online
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (membership.c:558   )    info: crm_update_peer_proc: 	cluster_connect_cpg: Node (null)[3232261592] - corosync-cpg is now online
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (  corosync.c:255   )   debug: cluster_connect_quorum: 	Configuring Pacemaker to obtain quorum from Corosync
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (  corosync.c:255   )   debug: cluster_connect_quorum: 	Configuring Pacemaker to obtain quorum from Corosync
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-30775-30786-27-header
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipc_setup.c:handle_new_connection:484 IPC credentials authenticated (30775-30786-27)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [30786]
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QUORUM] vsf_quorum.c:quorum_lib_init_fn:316 lib_init_fn: conn=0x7f4385761940
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QUORUM] vsf_quorum.c:message_handler_req_lib_quorum_gettype:471 got quorum_type request on 0x7f4385761940
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QUORUM] vsf_quorum.c:message_handler_req_lib_quorum_getquorate:395 got quorate request on 0x7f4385761940
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (  corosync.c:273   )  notice: cluster_connect_quorum: 	Quorum acquired
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (  corosync.c:273   )  notice: cluster_connect_quorum: 	Quorum acquired
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QUORUM] vsf_quorum.c:message_handler_req_lib_quorum_trackstart:412 got trackstart request on 0x7f4385761940
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QUORUM] vsf_quorum.c:message_handler_req_lib_quorum_trackstart:420 sending initial status to 0x7f4385761940
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QUORUM] vsf_quorum.c:send_library_notification:359 sending quorum notification to 0x7f4385761940, length = 56
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipc_setup.c:handle_new_connection:484 IPC credentials authenticated (30775-30786-28)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [30786]
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7f43857655c0
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (      ipcc.c:378   )   debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (      ipcc.c:378   )   debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (30775-30786-28)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(30775-30786-28) state:2
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-request-30775-30786-28-header
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7f43857655c0
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-30775-30786-28-header
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-response-30775-30786-28-header
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-event-30775-30786-28-header
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-30775-30786-28-header
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-30775-30786-28-header
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipc_setup.c:handle_new_connection:484 IPC credentials authenticated (30775-30786-28)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [30786]
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7f4385865db0
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (      ipcc.c:378   )   debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (30775-30786-28)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(30775-30786-28) state:2
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7f4385865db0
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-request-30775-30786-28-header
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-30775-30786-28-header
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-response-30775-30786-28-header
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-event-30775-30786-28-header
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (  corosync.c:134   )  notice: corosync_node_name: 	Unable to get node name for nodeid 3232261592
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (   cluster.c:338   )  notice: get_node_name: 	Defaulting to uname -n for the local corosync node name
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-30775-30786-28-header
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (membership.c:411   )    info: crm_get_peer: 	Node 3232261592 is now known as bl460g1n6
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-30775-30786-28-header
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [CPG   ] cpg.c:message_handler_req_exec_cpg_procjoin:1260 got procjoin message from cluster node -1062705703 (r(0) ip(192.168.101.217) r(1) ip(192.168.102.217) ) for pid 29914
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: ( pacemaker.c:270   )    info: start_child: 	Using uid=189 and group=189 for process cib
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: ( pacemaker.c:281   )    info: start_child: 	Forked child 30790 for process cib
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: ( pacemaker.c:590   )   debug: update_node_processes: 	Node bl460g1n6 now has process list: 00000000000000000000000000000100 (was 00000000000000000000000004000000)
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: ( pacemaker.c:281   )    info: start_child: 	Forked child 30791 for process stonith-ng
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: ( pacemaker.c:590   )   debug: update_node_processes: 	Node bl460g1n6 now has process list: 00000000000000000000000000100100 (was 00000000000000000000000000000100)
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: ( pacemaker.c:281   )    info: start_child: 	Forked child 30792 for process lrmd
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: ( pacemaker.c:590   )   debug: update_node_processes: 	Node bl460g1n6 now has process list: 00000000000000000000000000100110 (was 00000000000000000000000000100100)
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: ( pacemaker.c:270   )    info: start_child: 	Using uid=189 and group=189 for process attrd
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: ( pacemaker.c:281   )    info: start_child: 	Forked child 30793 for process attrd
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: ( pacemaker.c:590   )   debug: update_node_processes: 	Node bl460g1n6 now has process list: 00000000000000000000000000101110 (was 00000000000000000000000000100110)
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: ( pacemaker.c:270   )    info: start_child: 	Using uid=189 and group=189 for process pengine
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: ( pacemaker.c:281   )    info: start_child: 	Forked child 30794 for process pengine
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: ( pacemaker.c:590   )   debug: update_node_processes: 	Node bl460g1n6 now has process list: 00000000000000000000000000111110 (was 00000000000000000000000000101110)
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: ( pacemaker.c:270   )    info: start_child: 	Using uid=189 and group=189 for process crmd
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: ( pacemaker.c:281   )    info: start_child: 	Forked child 30795 for process crmd
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: ( pacemaker.c:590   )   debug: update_node_processes: 	Node bl460g1n6 now has process list: 00000000000000000000000000111310 (was 00000000000000000000000000111110)
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: ( pacemaker.c:1027  )    info: main: 	Starting mainloop
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (  corosync.c:191   )    info: pcmk_quorum_notification: 	Membership 8: quorum retained (2)
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (  corosync.c:210   )   debug: pcmk_quorum_notification: 	Member[0] 3232261592 
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (membership.c:615   )  notice: crm_update_peer_state: 	pcmk_quorum_notification: Node bl460g1n6[3232261592] - state is now member (was (null))
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (  corosync.c:210   )   debug: pcmk_quorum_notification: 	Member[1] 3232261593 
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (membership.c:400   )    info: crm_get_peer: 	Created entry b67d9d0c-4440-404b-9215-9f56ff57d0cd/0xf95c70 for node (null)/3232261593 (2 total)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipc_setup.c:handle_new_connection:484 IPC credentials authenticated (30775-30786-28)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [30786]
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30790] bl460g1n6        cib: (   logging.c:775   )    info: crm_log_init: 	Changed active directory to /var/lib/heartbeat/cores/hacluster
Jan 15 15:38:08 [30790] bl460g1n6        cib: (      main.c:230   )  notice: main: 	Using new config location: /var/lib/pacemaker/cib
Jan 15 15:38:08 [30790] bl460g1n6        cib: (   cluster.c:536   )    info: get_cluster_type: 	Verifying cluster type: 'corosync'
Jan 15 15:38:08 [30790] bl460g1n6        cib: (   cluster.c:573   )    info: get_cluster_type: 	Assuming an active 'corosync' cluster
Jan 15 15:38:08 [30790] bl460g1n6        cib: (        io.c:259   )    info: retrieveCib: 	Reading cluster configuration from: /var/lib/pacemaker/cib/cib.xml (digest: /var/lib/pacemaker/cib/cib.xml.sig)
Jan 15 15:38:08 [30790] bl460g1n6        cib: (        io.c:262   ) warning: retrieveCib: 	Cluster configuration not found: /var/lib/pacemaker/cib/cib.xml
Jan 15 15:38:08 [30790] bl460g1n6        cib: (        io.c:380   ) warning: readCibXmlFile: 	Primary configuration corrupt or unusable, trying backups in /var/lib/pacemaker/cib
Jan 15 15:38:08 [30790] bl460g1n6        cib: (        io.c:412   ) warning: readCibXmlFile: 	Continuing with an empty configuration.
Jan 15 15:38:08 [30790] bl460g1n6        cib: (       xml.c:2627  )    info: validate_with_relaxng: 	Creating RNG parser context
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30792] bl460g1n6       lrmd: (   logging.c:775   )    info: crm_log_init: 	Changed active directory to /var/lib/heartbeat/cores/root
Jan 15 15:38:08 [30791] bl460g1n6 stonith-ng: (   logging.c:775   )    info: crm_log_init: 	Changed active directory to /var/lib/heartbeat/cores/root
Jan 15 15:38:08 [30791] bl460g1n6 stonith-ng: (   cluster.c:536   )    info: get_cluster_type: 	Verifying cluster type: 'corosync'
Jan 15 15:38:08 [30791] bl460g1n6 stonith-ng: (   cluster.c:573   )    info: get_cluster_type: 	Assuming an active 'corosync' cluster
Jan 15 15:38:08 [30791] bl460g1n6 stonith-ng: (   cluster.c:179   )  notice: crm_cluster_connect: 	Connecting to cluster infrastructure: corosync
Jan 15 15:38:08 [30792] bl460g1n6       lrmd: ( ipc_setup.c:377   )    info: qb_ipcs_us_publish: 	server name: lrmd
Jan 15 15:38:08 [30792] bl460g1n6       lrmd: (      main.c:318   )    info: main: 	Starting
Jan 15 15:38:08 [30793] bl460g1n6      attrd: (   logging.c:775   )    info: crm_log_init: 	Changed active directory to /var/lib/heartbeat/cores/hacluster
Jan 15 15:38:08 [30793] bl460g1n6      attrd: (      main.c:313   )    info: main: 	Starting up
Jan 15 15:38:08 [30793] bl460g1n6      attrd: (   cluster.c:536   )    info: get_cluster_type: 	Verifying cluster type: 'corosync'
Jan 15 15:38:08 [30793] bl460g1n6      attrd: (   cluster.c:573   )    info: get_cluster_type: 	Assuming an active 'corosync' cluster
Jan 15 15:38:08 [30793] bl460g1n6      attrd: (   cluster.c:179   )  notice: crm_cluster_connect: 	Connecting to cluster infrastructure: corosync
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30794] bl460g1n6    pengine: (   logging.c:775   )    info: crm_log_init: 	Changed active directory to /var/lib/heartbeat/cores/hacluster
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7f4385762cf0
Jan 15 15:38:08 [30794] bl460g1n6    pengine: (      main.c:172   )   debug: main: 	Init server comms
Jan 15 15:38:08 [30794] bl460g1n6    pengine: ( ipc_setup.c:377   )    info: qb_ipcs_us_publish: 	server name: pengine
Jan 15 15:38:08 [30794] bl460g1n6    pengine: (      main.c:180   )    info: main: 	Starting pengine
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipc_setup.c:handle_new_connection:484 IPC credentials authenticated (30775-30791-29)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [30791]
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30795] bl460g1n6       crmd: (   logging.c:775   )    info: crm_log_init: 	Changed active directory to /var/lib/heartbeat/cores/hacluster
Jan 15 15:38:08 [30795] bl460g1n6       crmd: (      main.c:97    )  notice: main: 	CRM Git Version: b48276b
Jan 15 15:38:08 [30795] bl460g1n6       crmd: (      main.c:134   )   debug: crmd_init: 	Starting crmd
Jan 15 15:38:08 [30795] bl460g1n6       crmd: (       fsa.c:193   )   debug: s_crmd_fsa: 	Processing I_STARTUP: [ state=S_STARTING cause=C_STARTUP origin=crmd_init ]
Jan 15 15:38:08 [30795] bl460g1n6       crmd: (      misc.c:47    )    info: do_log: 	FSA: Input I_STARTUP from crmd_init() received in state S_STARTING
Jan 15 15:38:08 [30795] bl460g1n6       crmd: (   control.c:488   )   debug: do_startup: 	Registering Signal Handlers
Jan 15 15:38:08 [30795] bl460g1n6       crmd: (   control.c:495   )   debug: do_startup: 	Creating CIB and LRM objects
Jan 15 15:38:08 [30795] bl460g1n6       crmd: (   cluster.c:536   )    info: get_cluster_type: 	Verifying cluster type: 'corosync'
Jan 15 15:38:08 [30795] bl460g1n6       crmd: (   cluster.c:573   )    info: get_cluster_type: 	Assuming an active 'corosync' cluster
Jan 15 15:38:08 [30795] bl460g1n6       crmd: (       ipc.c:780   )    info: crm_ipc_connect: 	Could not establish cib_shm connection: Connection refused (111)
Jan 15 15:38:08 [30795] bl460g1n6       crmd: (cib_native.c:229   )   debug: cib_native_signon_raw: 	Connection unsuccessful (0 (nil))
Jan 15 15:38:08 [30795] bl460g1n6       crmd: (cib_native.c:272   )   debug: cib_native_signon_raw: 	Connection to CIB failed: Transport endpoint is not connected
Jan 15 15:38:08 [30795] bl460g1n6       crmd: (cib_native.c:282   )   debug: cib_native_signoff: 	Signing out of the CIB Service
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [CPG   ] cpg.c:cpg_lib_init_fn:1459 lib_init_fn: conn=0x7f4385764410, cpd=0x7f4385765124
Jan 15 15:38:08 [30791] bl460g1n6 stonith-ng: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30791] bl460g1n6 stonith-ng: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30791] bl460g1n6 stonith-ng: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipc_setup.c:handle_new_connection:484 IPC credentials authenticated (30775-30793-30)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [30793]
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (      ipcc.c:378   )   debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-request-30775-30786-28-header
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-response-30775-30786-28-header
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-event-30775-30786-28-header
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (  corosync.c:134   )  notice: corosync_node_name: 	Unable to get node name for nodeid 3232261593
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (   cluster.c:350   )  notice: get_node_name: 	Could not obtain a node name for corosync nodeid 3232261593
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (membership.c:439   )    info: crm_get_peer: 	Node 3232261593 has uuid 3232261593
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (  corosync.c:214   )    info: pcmk_quorum_notification: 	Obtaining name for new node 3232261593
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30790] bl460g1n6        cib: (        io.c:596   )   debug: activateCibXml: 	Triggering CIB write for start op
Jan 15 15:38:08 [30790] bl460g1n6        cib: (      main.c:586   )    info: startCib: 	CIB Initialization completed successfully
Jan 15 15:38:08 [30790] bl460g1n6        cib: (   cluster.c:179   )  notice: crm_cluster_connect: 	Connecting to cluster infrastructure: corosync
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [CPG   ] cpg.c:cpg_lib_init_fn:1459 lib_init_fn: conn=0x7f4385766660, cpd=0x7f4385766de4
Jan 15 15:38:08 [30793] bl460g1n6      attrd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30793] bl460g1n6      attrd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30793] bl460g1n6      attrd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (30775-30786-28)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(30775-30786-28) state:2
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7f4385762cf0
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-30775-30786-28-header
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-30775-30786-28-header
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-30775-30786-28-header
Jan 15 15:38:08 [30793] bl460g1n6      attrd: (       cpg.c:110   )   debug: get_local_nodeid: 	Local nodeid is 3232261592
Jan 15 15:38:08 [30791] bl460g1n6 stonith-ng: (       cpg.c:110   )   debug: get_local_nodeid: 	Local nodeid is 3232261592
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipc_setup.c:handle_new_connection:484 IPC credentials authenticated (30775-30790-28)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [30790]
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [CPG   ] cpg.c:cpg_lib_init_fn:1459 lib_init_fn: conn=0x7f4385762cf0, cpd=0x7f43857628e4
Jan 15 15:38:08 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [CPG   ] cpg.c:message_handler_req_exec_cpg_procjoin:1260 got procjoin message from cluster node -1062705703 (r(0) ip(192.168.101.217) r(1) ip(192.168.102.217) ) for pid 29919
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [CPG   ] cpg.c:message_handler_req_exec_cpg_procjoin:1260 got procjoin message from cluster node -1062705703 (r(0) ip(192.168.101.217) r(1) ip(192.168.102.217) ) for pid 29921
Jan 15 15:38:08 [30793] bl460g1n6      attrd: (membership.c:400   )    info: crm_get_peer: 	Created entry dca981cd-90df-407b-b087-0acd5655a372/0x172b3b0 for node (null)/3232261592 (1 total)
Jan 15 15:38:08 [30791] bl460g1n6 stonith-ng: (membership.c:400   )    info: crm_get_peer: 	Created entry 21034731-3fef-40be-a112-0958fa8b199d/0xb078a0 for node (null)/3232261592 (1 total)
Jan 15 15:38:08 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipc_setup.c:handle_new_connection:484 IPC credentials authenticated (30775-30786-31)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [30786]
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7f438576a3a0
Jan 15 15:38:08 [30790] bl460g1n6        cib: (       cpg.c:110   )   debug: get_local_nodeid: 	Local nodeid is 3232261592
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [CPG   ] cpg.c:message_handler_req_exec_cpg_procjoin:1260 got procjoin message from cluster node -1062705704 (r(0) ip(192.168.101.216) r(1) ip(192.168.102.216) ) for pid 30791
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [CPG   ] cpg.c:message_handler_req_exec_cpg_procjoin:1260 got procjoin message from cluster node -1062705704 (r(0) ip(192.168.101.216) r(1) ip(192.168.102.216) ) for pid 30793
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipc_setup.c:handle_new_connection:484 IPC credentials authenticated (30775-30791-32)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [30791]
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30791] bl460g1n6 stonith-ng: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Jan 15 15:38:08 [30791] bl460g1n6 stonith-ng: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7f438576c130
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (      ipcc.c:378   )   debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-request-30775-30786-31-header
Jan 15 15:38:08 [30791] bl460g1n6 stonith-ng: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-response-30775-30786-31-header
Jan 15 15:38:08 [30790] bl460g1n6        cib: (membership.c:400   )    info: crm_get_peer: 	Created entry 0b61719b-98ca-4080-a5d7-12327d3ac138/0x1772180 for node (null)/3232261592 (1 total)
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-event-30775-30786-31-header
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipc_setup.c:handle_new_connection:484 IPC credentials authenticated (30775-30793-33)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [30793]
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (  corosync.c:134   )  notice: corosync_node_name: 	Unable to get node name for nodeid 3232261593
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7f438576e280
Jan 15 15:38:08 [30793] bl460g1n6      attrd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30793] bl460g1n6      attrd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30793] bl460g1n6      attrd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (30775-30786-31)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(30775-30786-31) state:2
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7f438576a3a0
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-30775-30786-31-header
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-30775-30786-31-header
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-30775-30786-31-header
Jan 15 15:38:08 [30791] bl460g1n6 stonith-ng: (      ipcc.c:378   )   debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Jan 15 15:38:08 [30793] bl460g1n6      attrd: (      ipcc.c:378   )   debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Jan 15 15:38:08 [30791] bl460g1n6 stonith-ng: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-request-30775-30791-32-header
Jan 15 15:38:08 [30793] bl460g1n6      attrd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-request-30775-30793-33-header
Jan 15 15:38:08 [30791] bl460g1n6 stonith-ng: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-response-30775-30791-32-header
Jan 15 15:38:08 [30793] bl460g1n6      attrd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-response-30775-30793-33-header
Jan 15 15:38:08 [30791] bl460g1n6 stonith-ng: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-event-30775-30791-32-header
Jan 15 15:38:08 [30793] bl460g1n6      attrd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-event-30775-30793-33-header
Jan 15 15:38:08 [30791] bl460g1n6 stonith-ng: (  corosync.c:134   )  notice: corosync_node_name: 	Unable to get node name for nodeid 3232261592
Jan 15 15:38:08 [30793] bl460g1n6      attrd: (  corosync.c:134   )  notice: corosync_node_name: 	Unable to get node name for nodeid 3232261592
Jan 15 15:38:08 [30791] bl460g1n6 stonith-ng: (   cluster.c:350   )  notice: get_node_name: 	Could not obtain a node name for corosync nodeid 3232261592
Jan 15 15:38:08 [30793] bl460g1n6      attrd: (   cluster.c:350   )  notice: get_node_name: 	Could not obtain a node name for corosync nodeid 3232261592
Jan 15 15:38:08 [30791] bl460g1n6 stonith-ng: (membership.c:439   )    info: crm_get_peer: 	Node 3232261592 has uuid 3232261592
Jan 15 15:38:08 [30791] bl460g1n6 stonith-ng: (membership.c:558   )    info: crm_update_peer_proc: 	cluster_connect_cpg: Node (null)[3232261592] - corosync-cpg is now online
Jan 15 15:38:08 [30793] bl460g1n6      attrd: (membership.c:439   )    info: crm_get_peer: 	Node 3232261592 has uuid 3232261592
Jan 15 15:38:08 [30791] bl460g1n6 stonith-ng: (  corosync.c:345   )    info: init_cs_connection_once: 	Connection to 'corosync': established
Jan 15 15:38:08 [30793] bl460g1n6      attrd: (membership.c:558   )    info: crm_update_peer_proc: 	cluster_connect_cpg: Node (null)[3232261592] - corosync-cpg is now online
Jan 15 15:38:08 [30793] bl460g1n6      attrd: (membership.c:615   )  notice: crm_update_peer_state: 	attrd_peer_change_cb: Node (null)[3232261592] - state is now member (was (null))
Jan 15 15:38:08 [30793] bl460g1n6      attrd: (  corosync.c:345   )    info: init_cs_connection_once: 	Connection to 'corosync': established
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipc_setup.c:handle_new_connection:484 IPC credentials authenticated (30775-30790-31)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [30790]
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7f438576a3a0
Jan 15 15:38:08 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [CPG   ] cpg.c:message_handler_req_exec_cpg_procjoin:1260 got procjoin message from cluster node -1062705703 (r(0) ip(192.168.101.217) r(1) ip(192.168.102.217) ) for pid 29918
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (30775-30791-32)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(30775-30791-32) state:2
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7f438576c130
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-30775-30791-32-header
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-30775-30791-32-header
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-30775-30791-32-header
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (30775-30793-33)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(30775-30793-33) state:2
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7f438576e280
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-30775-30793-33-header
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-30775-30793-33-header
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-30775-30793-33-header
Jan 15 15:38:08 [30790] bl460g1n6        cib: (      ipcc.c:378   )   debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Jan 15 15:38:08 [30790] bl460g1n6        cib: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-request-30775-30790-31-header
Jan 15 15:38:08 [30790] bl460g1n6        cib: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-response-30775-30790-31-header
Jan 15 15:38:08 [30790] bl460g1n6        cib: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-event-30775-30790-31-header
Jan 15 15:38:08 [30790] bl460g1n6        cib: (  corosync.c:134   )  notice: corosync_node_name: 	Unable to get node name for nodeid 3232261592
Jan 15 15:38:08 [30790] bl460g1n6        cib: (   cluster.c:350   )  notice: get_node_name: 	Could not obtain a node name for corosync nodeid 3232261592
Jan 15 15:38:08 [30790] bl460g1n6        cib: (membership.c:439   )    info: crm_get_peer: 	Node 3232261592 has uuid 3232261592
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [CPG   ] cpg.c:message_handler_req_exec_cpg_procjoin:1260 got procjoin message from cluster node -1062705704 (r(0) ip(192.168.101.216) r(1) ip(192.168.102.216) ) for pid 30790
Jan 15 15:38:08 [30790] bl460g1n6        cib: (membership.c:558   )    info: crm_update_peer_proc: 	cluster_connect_cpg: Node (null)[3232261592] - corosync-cpg is now online
Jan 15 15:38:08 [30790] bl460g1n6        cib: (  corosync.c:345   )    info: init_cs_connection_once: 	Connection to 'corosync': established
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipc_setup.c:handle_new_connection:484 IPC credentials authenticated (30775-30786-32)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [30786]
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7f438576c130
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (30775-30790-31)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(30775-30790-31) state:2
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7f438576a3a0
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-30775-30790-31-header
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-30775-30790-31-header
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-30775-30790-31-header
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipc_setup.c:handle_new_connection:484 IPC credentials authenticated (30775-30791-31)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [30791]
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7f438576ae10
Jan 15 15:38:08 [30791] bl460g1n6 stonith-ng: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30791] bl460g1n6 stonith-ng: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30791] bl460g1n6 stonith-ng: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (      ipcc.c:378   )   debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-request-30775-30786-32-header
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-response-30775-30786-32-header
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-event-30775-30786-32-header
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipc_setup.c:handle_new_connection:484 IPC credentials authenticated (30775-30793-33)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [30793]
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (  corosync.c:134   )  notice: corosync_node_name: 	Unable to get node name for nodeid 3232261593
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (   cluster.c:350   )  notice: get_node_name: 	Could not obtain a node name for corosync nodeid 3232261593
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (membership.c:615   )  notice: crm_update_peer_state: 	pcmk_quorum_notification: Node (null)[3232261593] - state is now member (was (null))
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: (membership.c:411   )    info: crm_get_peer: 	Node 3232261593 is now known as bl460g1n7
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: ( pacemaker.c:590   )   debug: update_node_processes: 	Node bl460g1n7 now has process list: 00000000000000000000000000000100 (was 00000000000000000000000000000000)
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: ( pacemaker.c:590   )   debug: update_node_processes: 	Node bl460g1n7 now has process list: 00000000000000000000000000100100 (was 00000000000000000000000000000100)
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: ( pacemaker.c:590   )   debug: update_node_processes: 	Node bl460g1n7 now has process list: 00000000000000000000000000100110 (was 00000000000000000000000000100100)
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: ( pacemaker.c:590   )   debug: update_node_processes: 	Node bl460g1n7 now has process list: 00000000000000000000000000101110 (was 00000000000000000000000000100110)
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: ( pacemaker.c:590   )   debug: update_node_processes: 	Node bl460g1n7 now has process list: 00000000000000000000000000111110 (was 00000000000000000000000000101110)
Jan 15 15:38:08 [30786] bl460g1n6 pacemakerd: ( pacemaker.c:590   )   debug: update_node_processes: 	Node bl460g1n7 now has process list: 00000000000000000000000000111310 (was 00000000000000000000000000111110)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7f438576d6a0
Jan 15 15:38:08 [30793] bl460g1n6      attrd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30793] bl460g1n6      attrd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30793] bl460g1n6      attrd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (30775-30786-32)
Jan 15 15:38:08 [30791] bl460g1n6 stonith-ng: (      ipcc.c:378   )   debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(30775-30786-32) state:2
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Jan 15 15:38:08 [30791] bl460g1n6 stonith-ng: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-request-30775-30791-31-header
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7f438576c130
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Jan 15 15:38:08 [30791] bl460g1n6 stonith-ng: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-response-30775-30791-31-header
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-30775-30786-32-header
Jan 15 15:38:08 [30791] bl460g1n6 stonith-ng: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-event-30775-30791-31-header
Jan 15 15:38:08 [30791] bl460g1n6 stonith-ng: (  corosync.c:134   )  notice: corosync_node_name: 	Unable to get node name for nodeid 3232261592
Jan 15 15:38:08 [30791] bl460g1n6 stonith-ng: (   cluster.c:338   )  notice: get_node_name: 	Defaulting to uname -n for the local corosync node name
Jan 15 15:38:08 [30791] bl460g1n6 stonith-ng: (membership.c:411   )    info: crm_get_peer: 	Node 3232261592 is now known as bl460g1n6
Jan 15 15:38:08 [30791] bl460g1n6 stonith-ng: (       ipc.c:780   )    info: crm_ipc_connect: 	Could not establish cib_rw connection: Connection refused (111)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-30775-30786-32-header
Jan 15 15:38:08 [30791] bl460g1n6 stonith-ng: (cib_native.c:229   )   debug: cib_native_signon_raw: 	Connection unsuccessful (0 (nil))
Jan 15 15:38:08 [30791] bl460g1n6 stonith-ng: (cib_native.c:272   )   debug: cib_native_signon_raw: 	Connection to CIB failed: Transport endpoint is not connected
Jan 15 15:38:08 [30791] bl460g1n6 stonith-ng: (cib_native.c:282   )   debug: cib_native_signoff: 	Signing out of the CIB Service
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-30775-30786-32-header
Jan 15 15:38:08 [30793] bl460g1n6      attrd: (      ipcc.c:378   )   debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Jan 15 15:38:08 [30793] bl460g1n6      attrd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-request-30775-30793-33-header
Jan 15 15:38:08 [30793] bl460g1n6      attrd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-response-30775-30793-33-header
Jan 15 15:38:08 [30793] bl460g1n6      attrd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-event-30775-30793-33-header
Jan 15 15:38:08 [30793] bl460g1n6      attrd: (  corosync.c:134   )  notice: corosync_node_name: 	Unable to get node name for nodeid 3232261592
Jan 15 15:38:08 [30793] bl460g1n6      attrd: (   cluster.c:338   )  notice: get_node_name: 	Defaulting to uname -n for the local corosync node name
Jan 15 15:38:08 [30793] bl460g1n6      attrd: (membership.c:411   )    info: crm_get_peer: 	Node 3232261592 is now known as bl460g1n6
Jan 15 15:38:08 [30793] bl460g1n6      attrd: (      main.c:329   )    info: main: 	Cluster connection active
Jan 15 15:38:08 [30793] bl460g1n6      attrd: ( ipc_setup.c:377   )    info: qb_ipcs_us_publish: 	server name: attrd
Jan 15 15:38:08 [30793] bl460g1n6      attrd: (      main.c:333   )    info: main: 	Accepting attribute updates
Jan 15 15:38:08 [30793] bl460g1n6      attrd: (      main.c:153   )   debug: attrd_cib_connect: 	CIB signon attempt 1
Jan 15 15:38:08 [30793] bl460g1n6      attrd: (       ipc.c:780   )    info: crm_ipc_connect: 	Could not establish cib_rw connection: Connection refused (111)
Jan 15 15:38:08 [30793] bl460g1n6      attrd: (cib_native.c:229   )   debug: cib_native_signon_raw: 	Connection unsuccessful (0 (nil))
Jan 15 15:38:08 [30793] bl460g1n6      attrd: (cib_native.c:272   )   debug: cib_native_signon_raw: 	Connection to CIB failed: Transport endpoint is not connected
Jan 15 15:38:08 [30793] bl460g1n6      attrd: (cib_native.c:282   )   debug: cib_native_signoff: 	Signing out of the CIB Service
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipc_setup.c:handle_new_connection:484 IPC credentials authenticated (30775-30790-32)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [30790]
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7f438576c130
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (30775-30791-31)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(30775-30791-31) state:2
Jan 15 15:38:08 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7f438576ae10
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-30775-30791-31-header
Jan 15 15:38:08 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-30775-30791-31-header
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-30775-30791-31-header
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (30775-30793-33)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(30775-30793-33) state:2
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7f438576d6a0
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-30775-30793-33-header
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-30775-30793-33-header
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-30775-30793-33-header
Jan 15 15:38:08 [30790] bl460g1n6        cib: (      ipcc.c:378   )   debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Jan 15 15:38:08 [30790] bl460g1n6        cib: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-request-30775-30790-32-header
Jan 15 15:38:08 [30790] bl460g1n6        cib: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-response-30775-30790-32-header
Jan 15 15:38:08 [30790] bl460g1n6        cib: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-event-30775-30790-32-header
Jan 15 15:38:08 [30790] bl460g1n6        cib: (  corosync.c:134   )  notice: corosync_node_name: 	Unable to get node name for nodeid 3232261592
Jan 15 15:38:08 [30790] bl460g1n6        cib: (   cluster.c:338   )  notice: get_node_name: 	Defaulting to uname -n for the local corosync node name
Jan 15 15:38:08 [30790] bl460g1n6        cib: (membership.c:411   )    info: crm_get_peer: 	Node 3232261592 is now known as bl460g1n6
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (30775-30790-32)
Jan 15 15:38:08 [30790] bl460g1n6        cib: ( ipc_setup.c:377   )    info: qb_ipcs_us_publish: 	server name: cib_ro
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(30775-30790-32) state:2
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7f438576c130
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-30775-30790-32-header
Jan 15 15:38:08 [30790] bl460g1n6        cib: ( ipc_setup.c:377   )    info: qb_ipcs_us_publish: 	server name: cib_rw
Jan 15 15:38:08 [30790] bl460g1n6        cib: ( ipc_setup.c:377   )    info: qb_ipcs_us_publish: 	server name: cib_shm
Jan 15 15:38:08 [30790] bl460g1n6        cib: (      main.c:550   )    info: cib_init: 	Starting cib mainloop
Jan 15 15:38:08 [30790] bl460g1n6        cib: (       cpg.c:378   )    info: pcmk_cpg_membership: 	Joined[0.0] cib.3232261592 
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-30775-30790-32-header
Jan 15 15:38:08 [30790] bl460g1n6        cib: (       cpg.c:384   )    info: pcmk_cpg_membership: 	Member[0.0] cib.3232261592 
Jan 15 15:38:08 [30790] bl460g1n6        cib: (membership.c:400   )    info: crm_get_peer: 	Created entry 4e1ac5a1-b161-474b-bc50-8efe070b04cc/0x1774b50 for node (null)/3232261593 (2 total)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-30775-30790-32-header
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipc_setup.c:handle_new_connection:484 IPC credentials authenticated (30775-30790-31)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [30790]
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7f438576c130
Jan 15 15:38:08 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:08 [30790] bl460g1n6        cib: (      ipcc.c:378   )   debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Jan 15 15:38:08 [30790] bl460g1n6        cib: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-request-30775-30790-31-header
Jan 15 15:38:08 [30790] bl460g1n6        cib: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-response-30775-30790-31-header
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (30775-30790-31)
Jan 15 15:38:08 [30790] bl460g1n6        cib: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-event-30775-30790-31-header
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(30775-30790-31) state:2
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7f438576c130
Jan 15 15:38:08 [30790] bl460g1n6        cib: (  corosync.c:134   )  notice: corosync_node_name: 	Unable to get node name for nodeid 3232261593
Jan 15 15:38:08 [30790] bl460g1n6        cib: (   cluster.c:350   )  notice: get_node_name: 	Could not obtain a node name for corosync nodeid 3232261593
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Jan 15 15:38:08 [30790] bl460g1n6        cib: (membership.c:439   )    info: crm_get_peer: 	Node 3232261593 has uuid 3232261593
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-30775-30790-31-header
Jan 15 15:38:08 [30790] bl460g1n6        cib: (       cpg.c:384   )    info: pcmk_cpg_membership: 	Member[0.1] cib.3232261593 
Jan 15 15:38:08 [30790] bl460g1n6        cib: (membership.c:558   )    info: crm_update_peer_proc: 	pcmk_cpg_membership: Node (null)[3232261593] - corosync-cpg is now online
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-30775-30790-31-header
Jan 15 15:38:08 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-30775-30790-31-header
Jan 15 15:38:08 [30790] bl460g1n6        cib: (     utils.c:1222  )   debug: get_last_sequence: 	Series file /var/lib/pacemaker/cib/cib.last does not exist
Jan 15 15:38:08 [30790] bl460g1n6        cib: (        io.c:748   )   debug: write_cib_contents: 	Writing CIB to disk
Jan 15 15:38:08 [30790] bl460g1n6        cib: (        io.c:773   )    info: write_cib_contents: 	Wrote version 0.0.0 of the CIB to disk (digest: d3813d3f6bc333e7748d9257dda8345d)
Jan 15 15:38:08 [30790] bl460g1n6        cib: (        io.c:781   )   debug: write_cib_contents: 	Wrote digest d3813d3f6bc333e7748d9257dda8345d to disk
Jan 15 15:38:08 [30790] bl460g1n6        cib: (        io.c:259   )    info: retrieveCib: 	Reading cluster configuration from: /var/lib/pacemaker/cib/cib.ffKpx3 (digest: /var/lib/pacemaker/cib/cib.P53SeC)
Jan 15 15:38:08 [30790] bl460g1n6        cib: (        io.c:786   )   debug: write_cib_contents: 	Activating /var/lib/pacemaker/cib/cib.ffKpx3
Jan 15 15:38:09 [30790] bl460g1n6        cib: (       ipc.c:334   )    info: crm_client_new: 	Connecting 0x1775230 for uid=189 gid=189 pid=30795 id=cc068cb9-d0e9-4926-8c5d-e7263aa1c9fe
Jan 15 15:38:09 [30790] bl460g1n6        cib: ( ipc_setup.c:484   )   debug: handle_new_connection: 	IPC credentials authenticated (30790-30795-10)
Jan 15 15:38:09 [30790] bl460g1n6        cib: (   ipc_shm.c:295   )   debug: qb_ipcs_shm_connect: 	connecting to client [30795]
Jan 15 15:38:09 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:09 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:09 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (cib_native.c:268   )   debug: cib_native_signon_raw: 	Connection to CIB successful
Jan 15 15:38:09 [30790] bl460g1n6        cib: ( callbacks.c:193   )   debug: cib_common_callback_worker: 	Setting cib_refresh_notify callbacks for crmd (cc068cb9-d0e9-4926-8c5d-e7263aa1c9fe): on
Jan 15 15:38:09 [30790] bl460g1n6        cib: ( callbacks.c:193   )   debug: cib_common_callback_worker: 	Setting cib_diff_notify callbacks for crmd (cc068cb9-d0e9-4926-8c5d-e7263aa1c9fe): on
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (       cib.c:215   )    info: do_cib_control: 	CIB connection established
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (   cluster.c:179   )  notice: crm_cluster_connect: 	Connecting to cluster infrastructure: corosync
Jan 15 15:38:09 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crmd/2, version=0.0.0)
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ipc_setup.c:handle_new_connection:484 IPC credentials authenticated (30775-30795-31)
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [30795]
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [CPG   ] cpg.c:cpg_lib_init_fn:1459 lib_init_fn: conn=0x7f438576c130, cpd=0x7f438576be24
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (       cpg.c:110   )   debug: get_local_nodeid: 	Local nodeid is 3232261592
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (membership.c:400   )    info: crm_get_peer: 	Created entry 86f26d75-ef6c-41c1-8dfd-018826c1911a/0x18a2e80 for node (null)/3232261592 (1 total)
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ipc_setup.c:handle_new_connection:484 IPC credentials authenticated (30775-30795-32)
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [30795]
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7f438576ae10
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [CPG   ] cpg.c:message_handler_req_exec_cpg_procjoin:1260 got procjoin message from cluster node -1062705703 (r(0) ip(192.168.101.217) r(1) ip(192.168.102.217) ) for pid 29923
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [CPG   ] cpg.c:message_handler_req_exec_cpg_procjoin:1260 got procjoin message from cluster node -1062705704 (r(0) ip(192.168.101.216) r(1) ip(192.168.102.216) ) for pid 30795
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (      ipcc.c:378   )   debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-request-30775-30795-32-header
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (30775-30795-32)
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(30775-30795-32) state:2
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-response-30775-30795-32-header
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-event-30775-30795-32-header
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7f438576ae10
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-30775-30795-32-header
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (  corosync.c:134   )  notice: corosync_node_name: 	Unable to get node name for nodeid 3232261592
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (   cluster.c:350   )  notice: get_node_name: 	Could not obtain a node name for corosync nodeid 3232261592
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (membership.c:439   )    info: crm_get_peer: 	Node 3232261592 has uuid 3232261592
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (membership.c:558   )    info: crm_update_peer_proc: 	cluster_connect_cpg: Node (null)[3232261592] - corosync-cpg is now online
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-30775-30795-32-header
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (  corosync.c:345   )    info: init_cs_connection_once: 	Connection to 'corosync': established
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-30775-30795-32-header
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ipc_setup.c:handle_new_connection:484 IPC credentials authenticated (30775-30795-32)
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [30795]
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7f4385774aa0
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (      ipcc.c:378   )   debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-request-30775-30795-32-header
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (30775-30795-32)
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-response-30775-30795-32-header
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(30775-30795-32) state:2
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-event-30775-30795-32-header
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7f4385774aa0
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-30775-30795-32-header
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (  corosync.c:134   )  notice: corosync_node_name: 	Unable to get node name for nodeid 3232261592
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (   cluster.c:338   )  notice: get_node_name: 	Defaulting to uname -n for the local corosync node name
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (membership.c:411   )    info: crm_get_peer: 	Node 3232261592 is now known as bl460g1n6
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-30775-30795-32-header
Jan 15 15:38:09 [30795] bl460g1n6       crmd: ( callbacks.c:118   )    info: peer_update_callback: 	bl460g1n6 is now (null)
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (  corosync.c:255   )   debug: cluster_connect_quorum: 	Configuring Pacemaker to obtain quorum from Corosync
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-30775-30795-32-header
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ipc_setup.c:handle_new_connection:484 IPC credentials authenticated (30775-30795-32)
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [30795]
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QUORUM] vsf_quorum.c:quorum_lib_init_fn:316 lib_init_fn: conn=0x7f4385774aa0
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QUORUM] vsf_quorum.c:message_handler_req_lib_quorum_gettype:471 got quorum_type request on 0x7f4385774aa0
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QUORUM] vsf_quorum.c:message_handler_req_lib_quorum_getquorate:395 got quorate request on 0x7f4385774aa0
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (  corosync.c:273   )  notice: cluster_connect_quorum: 	Quorum acquired
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QUORUM] vsf_quorum.c:message_handler_req_lib_quorum_trackstart:412 got trackstart request on 0x7f4385774aa0
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QUORUM] vsf_quorum.c:message_handler_req_lib_quorum_trackstart:420 sending initial status to 0x7f4385774aa0
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QUORUM] vsf_quorum.c:send_library_notification:359 sending quorum notification to 0x7f4385774aa0, length = 56
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ipc_setup.c:handle_new_connection:484 IPC credentials authenticated (30775-30795-33)
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [30795]
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7f438576d0c0
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (      ipcc.c:378   )   debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-request-30775-30795-33-header
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (30775-30795-33)
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-response-30775-30795-33-header
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(30775-30795-33) state:2
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-event-30775-30795-33-header
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7f438576d0c0
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-30775-30795-33-header
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-30775-30795-33-header
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-30775-30795-33-header
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ipc_setup.c:handle_new_connection:484 IPC credentials authenticated (30775-30795-33)
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [30795]
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7f438576d0c0
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (      ipcc.c:378   )   debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-request-30775-30795-33-header
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (30775-30795-33)
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-response-30775-30795-33-header
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(30775-30795-33) state:2
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-event-30775-30795-33-header
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7f438576d0c0
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-30775-30795-33-header
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (   control.c:146   )    info: do_ha_control: 	Connected to the cluster
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-30775-30795-33-header
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-30775-30795-33-header
Jan 15 15:38:09 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section nodes: OK (rc=0, origin=local/crmd/3, version=0.0.0)
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (       lrm.c:299   )   debug: do_lrm_control: 	Connecting to the LRM
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (lrmd_client.:938   )    info: lrmd_ipc_connect: 	Connecting to lrmd
Jan 15 15:38:09 [30792] bl460g1n6       lrmd: (       ipc.c:334   )    info: crm_client_new: 	Connecting 0x1c40df0 for uid=189 gid=189 pid=30795 id=ea951299-8a4d-4fd6-8900-d6588e07ac38
Jan 15 15:38:09 [30792] bl460g1n6       lrmd: ( ipc_setup.c:484   )   debug: handle_new_connection: 	IPC credentials authenticated (30792-30795-6)
Jan 15 15:38:09 [30792] bl460g1n6       lrmd: (   ipc_shm.c:295   )   debug: qb_ipcs_shm_connect: 	connecting to client [30795]
Jan 15 15:38:09 [30792] bl460g1n6       lrmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Jan 15 15:38:09 [30792] bl460g1n6       lrmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Jan 15 15:38:09 [30792] bl460g1n6       lrmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Jan 15 15:38:09 [30792] bl460g1n6       lrmd: (      lrmd.c:1313  )   debug: process_lrmd_message: 	Processed register operation from ea951299-8a4d-4fd6-8900-d6588e07ac38: rc=0, reply=0, notify=0, exit=4201864
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (       lrm.c:321   )    info: do_lrm_control: 	LRM connection established
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (   control.c:773   )    info: do_started: 	Delaying start, no membership data (0000000000100000)
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (  messages.c:90    )   debug: register_fsa_input_adv: 	Stalling the FSA pending further input: source=do_started cause=C_FSA_INTERNAL data=(nil) queue=0
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (       fsa.c:240   )   debug: s_crmd_fsa: 	Exiting the FSA: queue=0, fsa_actions=0x2, stalled=true
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (  corosync.c:191   )    info: pcmk_quorum_notification: 	Membership 8: quorum retained (2)
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (  corosync.c:210   )   debug: pcmk_quorum_notification: 	Member[0] 3232261592 
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (membership.c:615   )  notice: crm_update_peer_state: 	pcmk_quorum_notification: Node bl460g1n6[3232261592] - state is now member (was (null))
Jan 15 15:38:09 [30795] bl460g1n6       crmd: ( callbacks.c:124   )    info: peer_update_callback: 	bl460g1n6 is now member (was (null))
Jan 15 15:38:09 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section crm_config: OK (rc=0, origin=local/crmd/4, version=0.0.0)
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (  corosync.c:210   )   debug: pcmk_quorum_notification: 	Member[1] 3232261593 
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (membership.c:400   )    info: crm_get_peer: 	Created entry 42d1c91f-ad53-45b5-8167-d2405c4fe866/0x19ea180 for node (null)/3232261593 (2 total)
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ipc_setup.c:handle_new_connection:484 IPC credentials authenticated (30775-30795-33)
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [30795]
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7f438576d0c0
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (      ipcc.c:378   )   debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-request-30775-30795-33-header
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (30775-30795-33)
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(30775-30795-33) state:2
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-response-30775-30795-33-header
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-event-30775-30795-33-header
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7f438576d0c0
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-30775-30795-33-header
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (  corosync.c:134   )  notice: corosync_node_name: 	Unable to get node name for nodeid 3232261593
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (   cluster.c:350   )  notice: get_node_name: 	Could not obtain a node name for corosync nodeid 3232261593
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (membership.c:439   )    info: crm_get_peer: 	Node 3232261593 has uuid 3232261593
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (  corosync.c:214   )    info: pcmk_quorum_notification: 	Obtaining name for new node 3232261593
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-30775-30795-33-header
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-30775-30795-33-header
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ipc_setup.c:handle_new_connection:484 IPC credentials authenticated (30775-30795-33)
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [30795]
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7f438576d0c0
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (      ipcc.c:378   )   debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-request-30775-30795-33-header
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (30775-30795-33)
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-response-30775-30795-33-header
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(30775-30795-33) state:2
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-event-30775-30795-33-header
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7f438576d0c0
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-30775-30795-33-header
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (  corosync.c:134   )  notice: corosync_node_name: 	Unable to get node name for nodeid 3232261593
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-30775-30795-33-header
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-30775-30795-33-header
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ipc_setup.c:handle_new_connection:484 IPC credentials authenticated (30775-30795-33)
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [30795]
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30790] bl460g1n6        cib: (       ipc.c:334   )    info: crm_client_new: 	Connecting 0x15c3e70 for uid=0 gid=0 pid=30791 id=9371ae0a-f42a-4124-b3e5-4b0c45c649ba
Jan 15 15:38:09 [30790] bl460g1n6        cib: ( ipc_setup.c:484   )   debug: handle_new_connection: 	IPC credentials authenticated (30790-30791-11)
Jan 15 15:38:09 [30790] bl460g1n6        cib: (   ipc_shm.c:295   )   debug: qb_ipcs_shm_connect: 	connecting to client [30791]
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:09 [30793] bl460g1n6      attrd: (      main.c:153   )   debug: attrd_cib_connect: 	CIB signon attempt 2
Jan 15 15:38:09 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7f43857684d0
Jan 15 15:38:09 [30791] bl460g1n6 stonith-ng: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:09 [30791] bl460g1n6 stonith-ng: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:09 [30791] bl460g1n6 stonith-ng: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30790] bl460g1n6        cib: (       ipc.c:334   )    info: crm_client_new: 	Connecting 0x17f62e0 for uid=189 gid=189 pid=30793 id=f1049688-9613-4dc5-b406-51d46c6ad9c5
Jan 15 15:38:09 [30790] bl460g1n6        cib: ( ipc_setup.c:484   )   debug: handle_new_connection: 	IPC credentials authenticated (30790-30793-12)
Jan 15 15:38:09 [30790] bl460g1n6        cib: (   ipc_shm.c:295   )   debug: qb_ipcs_shm_connect: 	connecting to client [30793]
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (      ipcc.c:378   )   debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-request-30775-30795-33-header
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (30775-30795-33)
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-response-30775-30795-33-header
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(30775-30795-33) state:2
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-event-30775-30795-33-header
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7f43857684d0
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-30775-30795-33-header
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (  corosync.c:134   )  notice: corosync_node_name: 	Unable to get node name for nodeid 3232261593
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (   cluster.c:350   )  notice: get_node_name: 	Could not obtain a node name for corosync nodeid 3232261593
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (membership.c:615   )  notice: crm_update_peer_state: 	pcmk_quorum_notification: Node (null)[3232261593] - state is now member (was (null))
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (membership.c:83    )   debug: post_cache_update: 	Updated cache after membership event 8.
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (membership.c:97    )   debug: post_cache_update: 	post_cache_update added action A_ELECTION_CHECK to the FSA
Jan 15 15:38:09 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-30775-30795-33-header
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-30775-30795-33-header
Jan 15 15:38:09 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:09 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ipc_setup.c:handle_new_connection:484 IPC credentials authenticated (30775-30795-33)
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [30795]
Jan 15 15:38:09 [30793] bl460g1n6      attrd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:09 [30793] bl460g1n6      attrd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:09 [30793] bl460g1n6      attrd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:09 [30791] bl460g1n6 stonith-ng: (cib_native.c:268   )   debug: cib_native_signon_raw: 	Connection to CIB successful
Jan 15 15:38:09 [30793] bl460g1n6      attrd: (cib_native.c:268   )   debug: cib_native_signon_raw: 	Connection to CIB successful
Jan 15 15:38:09 [30793] bl460g1n6      attrd: (      main.c:163   )    info: attrd_cib_connect: 	Connected to the CIB after 2 attempts
Jan 15 15:38:09 [30790] bl460g1n6        cib: ( callbacks.c:193   )   debug: cib_common_callback_worker: 	Setting cib_diff_notify callbacks for crmd (9371ae0a-f42a-4124-b3e5-4b0c45c649ba): on
Jan 15 15:38:09 [30790] bl460g1n6        cib: ( callbacks.c:193   )   debug: cib_common_callback_worker: 	Setting cib_refresh_notify callbacks for attrd (f1049688-9613-4dc5-b406-51d46c6ad9c5): on
Jan 15 15:38:09 [30793] bl460g1n6      attrd: (      main.c:341   )    info: main: 	CIB connection active
Jan 15 15:38:09 [30791] bl460g1n6 stonith-ng: (      main.c:983   )  notice: setup_cib: 	Watching for stonith topology changes
Jan 15 15:38:09 [30793] bl460g1n6      attrd: (       cpg.c:378   )    info: pcmk_cpg_membership: 	Joined[0.0] attrd.3232261592 
Jan 15 15:38:09 [30791] bl460g1n6 stonith-ng: ( ipc_setup.c:377   )    info: qb_ipcs_us_publish: 	server name: stonith-ng
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30793] bl460g1n6      attrd: (       cpg.c:384   )    info: pcmk_cpg_membership: 	Member[0.0] attrd.3232261592 
Jan 15 15:38:09 [30791] bl460g1n6 stonith-ng: (      main.c:1213  )    info: main: 	Starting stonith-ng mainloop
Jan 15 15:38:09 [30793] bl460g1n6      attrd: (membership.c:400   )    info: crm_get_peer: 	Created entry fcb032c1-ac8f-4064-bc44-9eabc58057ed/0x1731300 for node (null)/3232261593 (2 total)
Jan 15 15:38:09 [30791] bl460g1n6 stonith-ng: (       cpg.c:378   )    info: pcmk_cpg_membership: 	Joined[0.0] stonith-ng.3232261592 
Jan 15 15:38:09 [30791] bl460g1n6 stonith-ng: (       cpg.c:384   )    info: pcmk_cpg_membership: 	Member[0.0] stonith-ng.3232261592 
Jan 15 15:38:09 [30791] bl460g1n6 stonith-ng: (membership.c:400   )    info: crm_get_peer: 	Created entry 1118be01-a601-485d-aee5-1422541a4372/0xb0b940 for node (null)/3232261593 (2 total)
Jan 15 15:38:09 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crmd/2, version=0.0.0)
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7f43857684d0
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ipc_setup.c:handle_new_connection:484 IPC credentials authenticated (30775-30793-34)
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [30793]
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7f4385768c00
Jan 15 15:38:09 [30793] bl460g1n6      attrd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30793] bl460g1n6      attrd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30793] bl460g1n6      attrd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (      ipcc.c:378   )   debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-request-30775-30795-33-header
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-response-30775-30795-33-header
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-event-30775-30795-33-header
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ipc_setup.c:handle_new_connection:484 IPC credentials authenticated (30775-30791-35)
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [30791]
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (  corosync.c:134   )  notice: corosync_node_name: 	Unable to get node name for nodeid 3232261592
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (   cluster.c:338   )  notice: get_node_name: 	Defaulting to uname -n for the local corosync node name
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (   control.c:791   )    info: do_started: 	Delaying start, Config not read (0000000000000040)
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (  messages.c:90    )   debug: register_fsa_input_adv: 	Stalling the FSA pending further input: source=do_started cause=C_FSA_INTERNAL data=(nil) queue=0
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (       fsa.c:240   )   debug: s_crmd_fsa: 	Exiting the FSA: queue=0, fsa_actions=0x200000002, stalled=true
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (   control.c:920   )   debug: config_query_callback: 	Call 4 : Parsing CIB options
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (   control.c:944   )   debug: config_query_callback: 	Shutdown escalation occurs after: 1200000ms
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (   control.c:951   )   debug: config_query_callback: 	Checking for expired actions every 900000ms
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (   control.c:817   )   debug: do_started: 	Init server comms
Jan 15 15:38:09 [30795] bl460g1n6       crmd: ( ipc_setup.c:377   )    info: qb_ipcs_us_publish: 	server name: crmd
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (   control.c:832   )  notice: do_started: 	The local CRM is operational
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (  election.c:91    )   debug: do_election_check: 	Ignore election check: we are not in an election
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (       fsa.c:193   )   debug: s_crmd_fsa: 	Processing I_PENDING: [ state=S_STARTING cause=C_FSA_INTERNAL origin=do_started ]
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (      misc.c:47    )    info: do_log: 	FSA: Input I_PENDING from do_started() received in state S_STARTING
Jan 15 15:38:09 [30795] bl460g1n6       crmd: (       fsa.c:502   )  notice: do_state_transition: 	State transition S_STARTING -> S_PENDING [ input=I_PENDING cause=C_FSA_INTERNAL origin=do_started ]
Jan 15 15:38:09 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_slave operation for section 'all': OK (rc=0, origin=local/crmd/5, version=0.0.0)
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7f438576a440
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (30775-30795-33)
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(30775-30795-33) state:2
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Jan 15 15:38:09 [30791] bl460g1n6 stonith-ng: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7f43857684d0
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-30775-30795-33-header
Jan 15 15:38:09 [30791] bl460g1n6 stonith-ng: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30791] bl460g1n6 stonith-ng: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-30775-30795-33-header
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-30775-30795-33-header
Jan 15 15:38:09 [30793] bl460g1n6      attrd: (      ipcc.c:378   )   debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Jan 15 15:38:09 [30793] bl460g1n6      attrd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-request-30775-30793-34-header
Jan 15 15:38:09 [30793] bl460g1n6      attrd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-response-30775-30793-34-header
Jan 15 15:38:09 [30793] bl460g1n6      attrd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-event-30775-30793-34-header
Jan 15 15:38:09 [30793] bl460g1n6      attrd: (  corosync.c:134   )  notice: corosync_node_name: 	Unable to get node name for nodeid 3232261593
Jan 15 15:38:09 [30793] bl460g1n6      attrd: (   cluster.c:350   )  notice: get_node_name: 	Could not obtain a node name for corosync nodeid 3232261593
Jan 15 15:38:09 [30793] bl460g1n6      attrd: (membership.c:439   )    info: crm_get_peer: 	Node 3232261593 has uuid 3232261593
Jan 15 15:38:09 [30793] bl460g1n6      attrd: (       cpg.c:384   )    info: pcmk_cpg_membership: 	Member[0.1] attrd.3232261593 
Jan 15 15:38:09 [30793] bl460g1n6      attrd: (membership.c:558   )    info: crm_update_peer_proc: 	pcmk_cpg_membership: Node (null)[3232261593] - corosync-cpg is now online
Jan 15 15:38:09 [30793] bl460g1n6      attrd: (membership.c:615   )  notice: crm_update_peer_state: 	attrd_peer_change_cb: Node (null)[3232261593] - state is now member (was (null))
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (30775-30793-34)
Jan 15 15:38:09 [30791] bl460g1n6 stonith-ng: (      ipcc.c:378   )   debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(30775-30793-34) state:2
Jan 15 15:38:09 [30791] bl460g1n6 stonith-ng: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-request-30775-30791-35-header
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Jan 15 15:38:09 [30791] bl460g1n6 stonith-ng: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-response-30775-30791-35-header
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7f4385768c00
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Jan 15 15:38:09 [30791] bl460g1n6 stonith-ng: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-event-30775-30791-35-header
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-30775-30793-34-header
Jan 15 15:38:09 [30791] bl460g1n6 stonith-ng: (  corosync.c:134   )  notice: corosync_node_name: 	Unable to get node name for nodeid 3232261593
Jan 15 15:38:09 [30791] bl460g1n6 stonith-ng: (   cluster.c:350   )  notice: get_node_name: 	Could not obtain a node name for corosync nodeid 3232261593
Jan 15 15:38:09 [30791] bl460g1n6 stonith-ng: (membership.c:439   )    info: crm_get_peer: 	Node 3232261593 has uuid 3232261593
Jan 15 15:38:09 [30791] bl460g1n6 stonith-ng: (       cpg.c:384   )    info: pcmk_cpg_membership: 	Member[0.1] stonith-ng.3232261593 
Jan 15 15:38:09 [30791] bl460g1n6 stonith-ng: (membership.c:558   )    info: crm_update_peer_proc: 	pcmk_cpg_membership: Node (null)[3232261593] - corosync-cpg is now online
Jan 15 15:38:09 [30791] bl460g1n6 stonith-ng: (      main.c:1008  )   debug: st_peer_update_callback: 	Broadcasting our uname because of node 3232261593
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-30775-30793-34-header
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-30775-30793-34-header
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (30775-30791-35)
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(30775-30791-35) state:2
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7f438576a440
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-30775-30791-35-header
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-30775-30791-35-header
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-30775-30791-35-header
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ipc_setup.c:handle_new_connection:484 IPC credentials authenticated (30775-30791-33)
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [30791]
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7f438576de40
Jan 15 15:38:09 [30791] bl460g1n6 stonith-ng: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30791] bl460g1n6 stonith-ng: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30791] bl460g1n6 stonith-ng: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:09 [30791] bl460g1n6 stonith-ng: (      ipcc.c:378   )   debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Jan 15 15:38:09 [30791] bl460g1n6 stonith-ng: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-request-30775-30791-33-header
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (30775-30791-33)
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(30775-30791-33) state:2
Jan 15 15:38:09 [30791] bl460g1n6 stonith-ng: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-response-30775-30791-33-header
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Jan 15 15:38:09 [30791] bl460g1n6 stonith-ng: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-event-30775-30791-33-header
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7f438576de40
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Jan 15 15:38:09 [30791] bl460g1n6 stonith-ng: (  corosync.c:134   )  notice: corosync_node_name: 	Unable to get node name for nodeid 3232261592
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-30775-30791-33-header
Jan 15 15:38:09 [30791] bl460g1n6 stonith-ng: (   cluster.c:338   )  notice: get_node_name: 	Defaulting to uname -n for the local corosync node name
Jan 15 15:38:09 [30791] bl460g1n6 stonith-ng: (      main.c:883   )    info: init_cib_cache_cb: 	Updating device list from the cib: init
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-30775-30791-33-header
Jan 15 15:38:09 [30791] bl460g1n6 stonith-ng: (    unpack.c:93    )   debug: unpack_config: 	STONITH timeout: 60000
Jan 15 15:38:09 [30791] bl460g1n6 stonith-ng: (    unpack.c:97    )   debug: unpack_config: 	STONITH of failed nodes is enabled
Jan 15 15:38:09 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-30775-30791-33-header
Jan 15 15:38:09 [30791] bl460g1n6 stonith-ng: (    unpack.c:104   )   debug: unpack_config: 	Stop all active resources: false
Jan 15 15:38:09 [30791] bl460g1n6 stonith-ng: (    unpack.c:108   )   debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Jan 15 15:38:09 [30791] bl460g1n6 stonith-ng: (    unpack.c:113   )   debug: unpack_config: 	Default stickiness: 0
Jan 15 15:38:09 [30791] bl460g1n6 stonith-ng: (    unpack.c:150   )   debug: unpack_config: 	On loss of CCM Quorum: Stop ALL resources
Jan 15 15:38:09 [30791] bl460g1n6 stonith-ng: (    unpack.c:196   )   debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Jan 15 15:38:09 [30791] bl460g1n6 stonith-ng: (    unpack.c:481   )    info: unpack_nodes: 	Creating a fake local node
Jan 15 15:38:09 [30791] bl460g1n6 stonith-ng: (    unpack.c:504   )   debug: unpack_domains: 	Unpacking domains
Jan 15 15:38:09 [30791] bl460g1n6 stonith-ng: (membership.c:411   )    info: crm_get_peer: 	Node 3232261593 is now known as bl460g1n7
Jan 15 15:38:09 [30791] bl460g1n6 stonith-ng: (      main.c:1008  )   debug: st_peer_update_callback: 	Broadcasting our uname because of node 3232261593
Jan 15 15:38:10 [30795] bl460g1n6       crmd: (join_client.:46    )   debug: do_cl_join_query: 	Querying for a DC
Jan 15 15:38:10 [30795] bl460g1n6       crmd: (     utils.c:192   )   debug: crm_timer_start: 	Started Election Trigger (I_DC_TIMEOUT:20000ms), src=18
Jan 15 15:38:10 [30795] bl460g1n6       crmd: (       cpg.c:378   )    info: pcmk_cpg_membership: 	Joined[0.0] crmd.3232261592 
Jan 15 15:38:10 [30795] bl460g1n6       crmd: (       cpg.c:384   )    info: pcmk_cpg_membership: 	Member[0.0] crmd.3232261592 
Jan 15 15:38:10 [30773] bl460g1n6 corosync debug   [QB    ] ipc_setup.c:handle_new_connection:484 IPC credentials authenticated (30775-30795-33)
Jan 15 15:38:10 [30773] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [30795]
Jan 15 15:38:10 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:10 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:10 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:10 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Jan 15 15:38:10 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7f4385773ab0
Jan 15 15:38:10 [30795] bl460g1n6       crmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:10 [30795] bl460g1n6       crmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:10 [30795] bl460g1n6       crmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:10 [30795] bl460g1n6       crmd: (      ipcc.c:378   )   debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Jan 15 15:38:10 [30795] bl460g1n6       crmd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-request-30775-30795-33-header
Jan 15 15:38:10 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (30775-30795-33)
Jan 15 15:38:10 [30795] bl460g1n6       crmd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-response-30775-30795-33-header
Jan 15 15:38:10 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(30775-30795-33) state:2
Jan 15 15:38:10 [30773] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Jan 15 15:38:10 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Jan 15 15:38:10 [30795] bl460g1n6       crmd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-event-30775-30795-33-header
Jan 15 15:38:10 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7f4385773ab0
Jan 15 15:38:10 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Jan 15 15:38:10 [30795] bl460g1n6       crmd: (  corosync.c:134   )  notice: corosync_node_name: 	Unable to get node name for nodeid 3232261593
Jan 15 15:38:10 [30795] bl460g1n6       crmd: (   cluster.c:350   )  notice: get_node_name: 	Could not obtain a node name for corosync nodeid 3232261593
Jan 15 15:38:10 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-30775-30795-33-header
Jan 15 15:38:10 [30795] bl460g1n6       crmd: (       cpg.c:384   )    info: pcmk_cpg_membership: 	Member[0.1] crmd.3232261593 
Jan 15 15:38:10 [30795] bl460g1n6       crmd: (membership.c:558   )    info: crm_update_peer_proc: 	pcmk_cpg_membership: Node (null)[3232261593] - corosync-cpg is now online
Jan 15 15:38:10 [30795] bl460g1n6       crmd: (membership.c:411   )    info: crm_get_peer: 	Node 3232261593 is now known as bl460g1n7
Jan 15 15:38:10 [30795] bl460g1n6       crmd: ( callbacks.c:118   )    info: peer_update_callback: 	bl460g1n7 is now member
Jan 15 15:38:10 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-30775-30795-33-header
Jan 15 15:38:10 [30795] bl460g1n6       crmd: (  te_utils.c:249   )   debug: te_connect_stonith: 	Attempting connection to fencing daemon...
Jan 15 15:38:10 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-30775-30795-33-header
Jan 15 15:38:10 [30790] bl460g1n6        cib: (       ipc.c:334   )    info: crm_client_new: 	Connecting 0x17f6840 for uid=0 gid=0 pid=29122 id=875af07b-f85b-4f25-9e41-a417703178c0
Jan 15 15:38:10 [30790] bl460g1n6        cib: ( ipc_setup.c:484   )   debug: handle_new_connection: 	IPC credentials authenticated (30790-29122-13)
Jan 15 15:38:10 [30790] bl460g1n6        cib: (   ipc_shm.c:295   )   debug: qb_ipcs_shm_connect: 	connecting to client [29122]
Jan 15 15:38:10 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:10 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:10 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:10 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crm_mon/3, version=0.0.0)
Jan 15 15:38:10 [30790] bl460g1n6        cib: ( callbacks.c:193   )   debug: cib_common_callback_worker: 	Setting cib_diff_notify callbacks for crm_mon (875af07b-f85b-4f25-9e41-a417703178c0): off
Jan 15 15:38:10 [30790] bl460g1n6        cib: ( callbacks.c:193   )   debug: cib_common_callback_worker: 	Setting cib_diff_notify callbacks for crm_mon (875af07b-f85b-4f25-9e41-a417703178c0): on
Jan 15 15:38:11 [30791] bl460g1n6 stonith-ng: (       ipc.c:334   )    info: crm_client_new: 	Connecting 0xb0f310 for uid=189 gid=189 pid=30795 id=ba6342f3-f8b0-4b20-917d-e976d74e8389
Jan 15 15:38:11 [30791] bl460g1n6 stonith-ng: ( ipc_setup.c:484   )   debug: handle_new_connection: 	IPC credentials authenticated (30791-30795-9)
Jan 15 15:38:11 [30791] bl460g1n6 stonith-ng: (   ipc_shm.c:295   )   debug: qb_ipcs_shm_connect: 	connecting to client [30795]
Jan 15 15:38:11 [30791] bl460g1n6 stonith-ng: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Jan 15 15:38:11 [30791] bl460g1n6 stonith-ng: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Jan 15 15:38:11 [30791] bl460g1n6 stonith-ng: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Jan 15 15:38:11 [30795] bl460g1n6       crmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Jan 15 15:38:11 [30795] bl460g1n6       crmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Jan 15 15:38:11 [30795] bl460g1n6       crmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Jan 15 15:38:11 [30791] bl460g1n6 stonith-ng: (  commands.c:2049  )   debug: stonith_command: 	Processing register 9 from crmd.30795 (               0)
Jan 15 15:38:11 [30791] bl460g1n6 stonith-ng: (  commands.c:2063  )    info: stonith_command: 	Processed register from crmd.30795: OK (0)
Jan 15 15:38:11 [30795] bl460g1n6       crmd: ( st_client.c:1639  )   debug: stonith_api_signon: 	Connection to STONITH successful
Jan 15 15:38:11 [30791] bl460g1n6 stonith-ng: (  commands.c:2049  )   debug: stonith_command: 	Processing st_notify 10 from crmd.30795 (               0)
Jan 15 15:38:11 [30791] bl460g1n6 stonith-ng: (  commands.c:1822  )   debug: handle_request: 	Setting st_notify_disconnect callbacks for crmd.30795 (ba6342f3-f8b0-4b20-917d-e976d74e8389): ON
Jan 15 15:38:11 [30791] bl460g1n6 stonith-ng: (  commands.c:2063  )    info: stonith_command: 	Processed st_notify from crmd.30795: OK (0)
Jan 15 15:38:11 [30791] bl460g1n6 stonith-ng: (  commands.c:2049  )   debug: stonith_command: 	Processing st_notify 11 from crmd.30795 (               0)
Jan 15 15:38:11 [30791] bl460g1n6 stonith-ng: (  commands.c:1822  )   debug: handle_request: 	Setting st_notify_fence callbacks for crmd.30795 (ba6342f3-f8b0-4b20-917d-e976d74e8389): ON
Jan 15 15:38:11 [30791] bl460g1n6 stonith-ng: (  commands.c:2063  )    info: stonith_command: 	Processed st_notify from crmd.30795: OK (0)
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (  election.c:359   )   debug: election_count_vote: 	Created voted hash
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (  election.c:178   )   debug: crm_uptime: 	Current CPU usage is: 0s, 21996us
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (  election.c:206   )   debug: crm_compare_age: 	Win: 0.21996 vs 0.9998 (usec)
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (  election.c:497   )    info: election_count_vote: 	Election 1 (owner: 3232261593) pass: vote from bl460g1n7 (Uptime)
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (  election.c:91    )   debug: do_election_check: 	Ignore election check: we are not in an election
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (       fsa.c:193   )   debug: s_crmd_fsa: 	Processing I_ELECTION: [ state=S_PENDING cause=C_FSA_INTERNAL origin=do_election_count_vote ]
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (       fsa.c:502   )    info: do_state_transition: 	State transition S_PENDING -> S_ELECTION [ input=I_ELECTION cause=C_FSA_INTERNAL origin=do_election_count_vote ]
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (  election.c:249   )   debug: election_vote: 	Started election 1
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (  election.c:359   )   debug: election_count_vote: 	Created voted hash
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (  election.c:492   )   debug: election_count_vote: 	Election 1 (current: 1, owner: 3232261592): Processed vote from bl460g1n6 (Recorded)
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (  election.c:304   )   debug: election_check: 	Still waiting on 1 non-votes (2 total)
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (  election.c:492   )   debug: election_count_vote: 	Election 1 (current: 1, owner: 3232261592): Processed no-vote from bl460g1n7 (Recorded)
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (  election.c:47    )    info: election_complete: 	Election election-0 complete
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   control.c:60    )    info: election_timeout_popped: 	Election failed: Declaring ourselves the winner
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (       fsa.c:193   )   debug: s_crmd_fsa: 	Processing I_ELECTION_DC: [ state=S_ELECTION cause=C_TIMER_POPPED origin=election_timeout_popped ]
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (      misc.c:47    )    info: do_log: 	FSA: Input I_ELECTION_DC from election_timeout_popped() received in state S_ELECTION
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (       fsa.c:502   )  notice: do_state_transition: 	State transition S_ELECTION -> S_INTEGRATION [ input=I_ELECTION_DC cause=C_TIMER_POPPED origin=election_timeout_popped ]
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   tengine.c:107   )    info: do_te_control: 	Registering TE UUID: be72ea63-75a9-4de4-a591-e716f960743b
Jan 15 15:38:30 [30790] bl460g1n6        cib: ( callbacks.c:193   )   debug: cib_common_callback_worker: 	Setting cib_diff_notify callbacks for crmd (cc068cb9-d0e9-4926-8c5d-e7263aa1c9fe): on
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (     utils.c:72    )    info: set_graph_functions: 	Setting custom graph functions
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   tengine.c:128   )   debug: do_te_control: 	Transitioner is now active
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (    unpack.c:230   )   debug: unpack_graph: 	Unpacked transition -1: 0 actions in 0 synapses
Jan 15 15:38:30 [30794] bl460g1n6    pengine: (       ipc.c:334   )    info: crm_client_new: 	Connecting 0x1fde950 for uid=189 gid=189 pid=30795 id=264993ce-afe0-4fb9-89eb-67e7abc7232f
Jan 15 15:38:30 [30794] bl460g1n6    pengine: ( ipc_setup.c:484   )   debug: handle_new_connection: 	IPC credentials authenticated (30794-30795-6)
Jan 15 15:38:30 [30794] bl460g1n6    pengine: (   ipc_shm.c:295   )   debug: qb_ipcs_shm_connect: 	connecting to client [30795]
Jan 15 15:38:30 [30794] bl460g1n6    pengine: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:5242893; real_size:5246976; rb->word_size:1311744
Jan 15 15:38:30 [30794] bl460g1n6    pengine: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:5242893; real_size:5246976; rb->word_size:1311744
Jan 15 15:38:30 [30794] bl460g1n6    pengine: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:5242893; real_size:5246976; rb->word_size:1311744
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:5242893; real_size:5246976; rb->word_size:1311744
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:5242893; real_size:5246976; rb->word_size:1311744
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:5242893; real_size:5246976; rb->word_size:1311744
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (     utils.c:192   )   debug: crm_timer_start: 	Started Integration Timer (I_INTEGRATED:180000ms), src=22
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (  election.c:178   )    info: do_dc_takeover: 	Taking over DC status for this partition
Jan 15 15:38:30 [30790] bl460g1n6        cib: (  messages.c:162   )    info: cib_process_readwrite: 	We are now in R/W mode
Jan 15 15:38:30 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_master operation for section 'all': OK (rc=0, origin=local/crmd/6, version=0.0.0)
Jan 15 15:38:30 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section cib: OK (rc=0, origin=local/crmd/7, version=0.0.1)
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:167   )   debug: Config update: 	Diff: --- 0.0.0
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:169   )   debug: Config update: 	Diff: +++ 0.0.1 335eff11d8e47ed96126ba44f4ec45e7
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-- <cib num_updates="0"/>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++ <cib epoch="0" num_updates="1" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.8"/>
Jan 15 15:38:30 [30773] bl460g1n6 corosync debug   [QB    ] ipc_setup.c:handle_new_connection:484 IPC credentials authenticated (30775-30790-33)
Jan 15 15:38:30 [30773] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [30790]
Jan 15 15:38:30 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:30 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:30 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:30 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Jan 15 15:38:30 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7f438576de40
Jan 15 15:38:30 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:30 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:30 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:30 [30790] bl460g1n6        cib: (      ipcc.c:378   )   debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Jan 15 15:38:30 [30790] bl460g1n6        cib: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-request-30775-30790-33-header
Jan 15 15:38:30 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (30775-30790-33)
Jan 15 15:38:30 [30790] bl460g1n6        cib: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-response-30775-30790-33-header
Jan 15 15:38:30 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(30775-30790-33) state:2
Jan 15 15:38:30 [30790] bl460g1n6        cib: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-event-30775-30790-33-header
Jan 15 15:38:30 [30773] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Jan 15 15:38:30 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Jan 15 15:38:30 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7f438576de40
Jan 15 15:38:30 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Jan 15 15:38:30 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-30775-30790-33-header
Jan 15 15:38:30 [30790] bl460g1n6        cib: (  corosync.c:134   )  notice: corosync_node_name: 	Unable to get node name for nodeid 3232261592
Jan 15 15:38:30 [30790] bl460g1n6        cib: (   cluster.c:338   )  notice: get_node_name: 	Defaulting to uname -n for the local corosync node name
Jan 15 15:38:30 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-30775-30790-33-header
Jan 15 15:38:30 [30790] bl460g1n6        cib: (   cib_ops.c:905   )   debug: cib_process_xpath: 	cib_query: //cib/configuration/crm_config//cluster_property_set//nvpair[@name='dc-version'] does not exist
Jan 15 15:38:30 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section //cib/configuration/crm_config//cluster_property_set//nvpair[@name='dc-version']: No such device or address (rc=-6, origin=local/crmd/8, version=0.0.1)
Jan 15 15:38:30 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-30775-30790-33-header
Jan 15 15:38:30 [30790] bl460g1n6        cib: (        io.c:596   )   debug: activateCibXml: 	Triggering CIB write for cib_modify op
Jan 15 15:38:30 [30790] bl460g1n6        cib: ( cib_utils.c:174   )  notice: log_cib_diff: 	cib:diff: Local-only Change: 0.1.1
Jan 15 15:38:30 [30790] bl460g1n6        cib: (       xml.c:1496  )  notice: cib:diff: 	-- <cib admin_epoch="0" epoch="0" num_updates="1"/>
Jan 15 15:38:30 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++       <cluster_property_set id="cib-bootstrap-options">
Jan 15 15:38:30 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++         <nvpair id="cib-bootstrap-options-dc-version" name="dc-version" value="1.1.11-0.27.b48276b.git.el6-b48276b"/>
Jan 15 15:38:30 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++       </cluster_property_set>
Jan 15 15:38:30 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section crm_config: OK (rc=0, origin=local/crmd/9, version=0.1.1)
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:174   )   debug: log_cib_diff: 	Config update: Local-only Change: 0.1.1
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-- <cib admin_epoch="0" epoch="0" num_updates="1"/>
Jan 15 15:38:30 [30790] bl460g1n6        cib: (   cib_ops.c:905   )   debug: cib_process_xpath: 	cib_query: //cib/configuration/crm_config//cluster_property_set//nvpair[@name='cluster-infrastructure'] does not exist
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  <cib epoch="1" num_updates="1" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.8" cib-last-written="Wed Jan 15 15:38:30 2014" update-origin="bl460g1n6" update-client="crmd">
Jan 15 15:38:30 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section //cib/configuration/crm_config//cluster_property_set//nvpair[@name='cluster-infrastructure']: No such device or address (rc=-6, origin=local/crmd/10, version=0.1.1)
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    <configuration>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      <crm_config>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++       <cluster_property_set id="cib-bootstrap-options">
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++         <nvpair id="cib-bootstrap-options-dc-version" name="dc-version" value="1.1.11-0.27.b48276b.git.el6-b48276b"/>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++       </cluster_property_set>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      </crm_config>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    </configuration>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  </cib>
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:82    )   debug: initialize_join: 	join-1: Initializing join data (flag=true)
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:125   )    info: join_make_offer: 	Making join offers based on membership 8
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:140   )    info: join_make_offer: 	join-1: Sending offer to bl460g1n7
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:66    )    info: crm_update_peer_join: 	join_make_offer: Node bl460g1n7[3232261593] - join-1 phase 0 -> 1
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:140   )    info: join_make_offer: 	join-1: Sending offer to bl460g1n6
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:66    )    info: crm_update_peer_join: 	join_make_offer: Node bl460g1n6[3232261592] - join-1 phase 0 -> 1
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:173   )    info: do_dc_join_offer_all: 	join-1: Waiting on 2 outstanding join acks
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (       fsa.c:193   )   debug: s_crmd_fsa: 	Processing I_ELECTION_DC: [ state=S_INTEGRATION cause=C_FSA_INTERNAL origin=do_election_check ]
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (      misc.c:47    ) warning: do_log: 	FSA: Input I_ELECTION_DC from do_election_check() received in state S_INTEGRATION
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (  election.c:249   )   debug: election_vote: 	Started election 2
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:82    )   debug: initialize_join: 	join-2: Initializing join data (flag=true)
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:61    )    info: crm_update_peer_join: 	initialize_join: Node bl460g1n7[3232261593] - join-2 phase 1 -> 0
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:61    )    info: crm_update_peer_join: 	initialize_join: Node bl460g1n6[3232261592] - join-2 phase 1 -> 0
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:140   )    info: join_make_offer: 	join-2: Sending offer to bl460g1n7
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:66    )    info: crm_update_peer_join: 	join_make_offer: Node bl460g1n7[3232261593] - join-2 phase 0 -> 1
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:140   )    info: join_make_offer: 	join-2: Sending offer to bl460g1n6
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:66    )    info: crm_update_peer_join: 	join_make_offer: Node bl460g1n6[3232261592] - join-2 phase 0 -> 1
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:173   )    info: do_dc_join_offer_all: 	join-2: Waiting on 2 outstanding join acks
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (  messages.c:729   )   debug: handle_request: 	Raising I_JOIN_OFFER: join-1
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (       fsa.c:193   )   debug: s_crmd_fsa: 	Processing I_JOIN_OFFER: [ state=S_INTEGRATION cause=C_HA_MESSAGE origin=route_message ]
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (     utils.c:981   )    info: update_dc: 	Set DC to bl460g1n6 (3.0.8)
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (join_client.:135   )   debug: do_cl_join_offer_respond: 	do_cl_join_offer_respond added action A_DC_TIMER_STOP to the FSA
Jan 15 15:38:30 [30790] bl460g1n6        cib: (        io.c:596   )   debug: activateCibXml: 	Triggering CIB write for cib_modify op
Jan 15 15:38:30 [30790] bl460g1n6        cib: ( cib_utils.c:174   )  notice: log_cib_diff: 	cib:diff: Local-only Change: 0.2.1
Jan 15 15:38:30 [30790] bl460g1n6        cib: (     utils.c:1222  )   debug: get_last_sequence: 	Series file /var/lib/pacemaker/cib/cib.last does not exist
Jan 15 15:38:30 [30790] bl460g1n6        cib: (       xml.c:1496  )  notice: cib:diff: 	-- <cib admin_epoch="0" epoch="1" num_updates="1"/>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:174   )   debug: log_cib_diff: 	Config update: Local-only Change: 0.2.1
Jan 15 15:38:30 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++         <nvpair id="cib-bootstrap-options-cluster-infrastructure" name="cluster-infrastructure" value="corosync"/>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-- <cib admin_epoch="0" epoch="1" num_updates="1"/>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  <cib epoch="2" num_updates="1" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.8" cib-last-written="Wed Jan 15 15:38:30 2014" update-origin="bl460g1n6" update-client="crmd">
Jan 15 15:38:30 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section crm_config: OK (rc=0, origin=local/crmd/11, version=0.2.1)
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    <configuration>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      <crm_config>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        <cluster_property_set id="cib-bootstrap-options">
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++         <nvpair id="cib-bootstrap-options-cluster-infrastructure" name="cluster-infrastructure" value="corosync"/>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        </cluster_property_set>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      </crm_config>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    </configuration>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  </cib>
Jan 15 15:38:30 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section crm_config: OK (rc=0, origin=local/crmd/12, version=0.2.1)
Jan 15 15:38:30 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section crm_config: OK (rc=0, origin=local/crmd/13, version=0.2.1)
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   control.c:920   )   debug: config_query_callback: 	Call 12 : Parsing CIB options
Jan 15 15:38:30 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crmd/14, version=0.2.1)
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   control.c:944   )   debug: config_query_callback: 	Shutdown escalation occurs after: 1200000ms
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   control.c:951   )   debug: config_query_callback: 	Checking for expired actions every 900000ms
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   control.c:920   )   debug: config_query_callback: 	Call 13 : Parsing CIB options
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   control.c:944   )   debug: config_query_callback: 	Shutdown escalation occurs after: 1200000ms
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   control.c:951   )   debug: config_query_callback: 	Checking for expired actions every 900000ms
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (join_client.:157   )   debug: join_query_callback: 	Respond to join offer join-1
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (join_client.:158   )   debug: join_query_callback: 	Acknowledging bl460g1n6 as our DC
Jan 15 15:38:30 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section crm_config: OK (rc=0, origin=local/crmd/15, version=0.2.1)
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (  election.c:359   )   debug: election_count_vote: 	Created voted hash
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (  election.c:492   )   debug: election_count_vote: 	Election 2 (current: 2, owner: 3232261592): Processed vote from bl460g1n6 (Recorded)
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (  election.c:91    )   debug: do_election_check: 	Ignore election check: we are not in an election
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   control.c:920   )   debug: config_query_callback: 	Call 15 : Parsing CIB options
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   control.c:944   )   debug: config_query_callback: 	Shutdown escalation occurs after: 1200000ms
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   control.c:951   )   debug: config_query_callback: 	Checking for expired actions every 900000ms
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (  messages.c:729   )   debug: handle_request: 	Raising I_JOIN_OFFER: join-2
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (       fsa.c:193   )   debug: s_crmd_fsa: 	Processing I_JOIN_OFFER: [ state=S_INTEGRATION cause=C_HA_MESSAGE origin=route_message ]
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (join_client.:135   )   debug: do_cl_join_offer_respond: 	do_cl_join_offer_respond added action A_DC_TIMER_STOP to the FSA
Jan 15 15:38:30 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crmd/16, version=0.2.1)
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (join_client.:157   )   debug: join_query_callback: 	Respond to join offer join-2
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (join_client.:158   )   debug: join_query_callback: 	Acknowledging bl460g1n6 as our DC
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (       fsa.c:193   )   debug: s_crmd_fsa: 	Processing I_JOIN_REQUEST: [ state=S_INTEGRATION cause=C_HA_MESSAGE origin=route_message ]
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:282   )   debug: do_dc_join_filter_offer: 	Processing req from bl460g1n6
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:303   )   debug: do_dc_join_filter_offer: 	Invalid response from bl460g1n6: join-1 vs. join-2
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:579   )   debug: check_join_state: 	Invoked by do_dc_join_filter_offer in state: S_INTEGRATION
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (  election.c:492   )   debug: election_count_vote: 	Election 2 (current: 2, owner: 3232261592): Processed no-vote from bl460g1n7 (Recorded)
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (  election.c:91    )   debug: do_election_check: 	Ignore election check: we are not in an election
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (       fsa.c:193   )   debug: s_crmd_fsa: 	Processing I_JOIN_REQUEST: [ state=S_INTEGRATION cause=C_HA_MESSAGE origin=route_message ]
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:282   )   debug: do_dc_join_filter_offer: 	Processing req from bl460g1n7
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:303   )   debug: do_dc_join_filter_offer: 	Invalid response from bl460g1n7: join-1 vs. join-2
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:579   )   debug: check_join_state: 	Invoked by do_dc_join_filter_offer in state: S_INTEGRATION
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (       fsa.c:193   )   debug: s_crmd_fsa: 	Processing I_JOIN_REQUEST: [ state=S_INTEGRATION cause=C_HA_MESSAGE origin=route_message ]
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:282   )   debug: do_dc_join_filter_offer: 	Processing req from bl460g1n7
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:341   )   debug: do_dc_join_filter_offer: 	join-2: Welcoming node bl460g1n7 (ref join_request-crmd-1389767910-6)
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:66    )    info: crm_update_peer_join: 	do_dc_join_filter_offer: Node bl460g1n7[3232261593] - join-2 phase 1 -> 2
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (membership.c:587   )    info: crm_update_peer_expected: 	do_dc_join_filter_offer: Node bl460g1n7[3232261593] - expected state is now member (was (null))
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:348   )   debug: do_dc_join_filter_offer: 	1 nodes have been integrated into join-2
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:579   )   debug: check_join_state: 	Invoked by do_dc_join_filter_offer in state: S_INTEGRATION
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:354   )   debug: do_dc_join_filter_offer: 	join-2: Still waiting on 1 outstanding offers
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (       fsa.c:193   )   debug: s_crmd_fsa: 	Processing I_JOIN_REQUEST: [ state=S_INTEGRATION cause=C_HA_MESSAGE origin=route_message ]
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:282   )   debug: do_dc_join_filter_offer: 	Processing req from bl460g1n6
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:321   )   debug: do_dc_join_filter_offer: 	bl460g1n6 has a better generation number than the current max bl460g1n7
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:323   )   debug: do_dc_join_filter_offer: 	Max generation   <generation_tuple epoch="1" num_updates="1" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.8" cib-last-written="Wed Jan 15 15:38:30 2014" update-origin="bl460g1n6" update-client="crmd"/>
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:325   )   debug: do_dc_join_filter_offer: 	Their generation   <generation_tuple epoch="2" num_updates="1" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.8" cib-last-written="Wed Jan 15 15:38:30 2014" update-origin="bl460g1n6" update-client="crmd"/>
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:341   )   debug: do_dc_join_filter_offer: 	join-2: Welcoming node bl460g1n6 (ref join_request-crmd-1389767910-9)
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:66    )    info: crm_update_peer_join: 	do_dc_join_filter_offer: Node bl460g1n6[3232261592] - join-2 phase 1 -> 2
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (membership.c:587   )    info: crm_update_peer_expected: 	do_dc_join_filter_offer: Node bl460g1n6[3232261592] - expected state is now member (was (null))
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:348   )   debug: do_dc_join_filter_offer: 	2 nodes have been integrated into join-2
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:579   )   debug: check_join_state: 	Invoked by do_dc_join_filter_offer in state: S_INTEGRATION
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:593   )   debug: check_join_state: 	join-2: Integration of 2 peers complete: do_dc_join_filter_offer
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (       fsa.c:193   )   debug: s_crmd_fsa: 	Processing I_INTEGRATED: [ state=S_INTEGRATION cause=C_FSA_INTERNAL origin=check_join_state ]
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (       fsa.c:502   )    info: do_state_transition: 	State transition S_INTEGRATION -> S_FINALIZE_JOIN [ input=I_INTEGRATED cause=C_FSA_INTERNAL origin=check_join_state ]
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (       fsa.c:590   )   debug: do_state_transition: 	All 2 cluster nodes responded to the join offer.
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (     utils.c:192   )   debug: crm_timer_start: 	Started Finalization Timer (I_ELECTION:1800000ms), src=30
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:372   )   debug: do_dc_join_finalize: 	Finalizing join-2 for 2 clients
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:682   )    info: crmd_join_phase_log: 	join-2: bl460g1n7=integrated
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:682   )    info: crmd_join_phase_log: 	join-2: bl460g1n6=integrated
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:410   )    info: do_dc_join_finalize: 	join-2: Syncing our CIB to the rest of the cluster
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:411   )   debug: do_dc_join_finalize: 	Requested version   <generation_tuple epoch="2" num_updates="1" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.8" cib-last-written="Wed Jan 15 15:38:30 2014" update-origin="bl460g1n6" update-client="crmd"/>
Jan 15 15:38:30 [30790] bl460g1n6        cib: (  messages.c:435   )   debug: sync_our_cib: 	Syncing CIB to all peers
Jan 15 15:38:30 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_sync operation for section 'all': OK (rc=0, origin=local/crmd/17, version=0.2.1)
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:579   )   debug: check_join_state: 	Invoked by finalize_sync_callback in state: S_FINALIZE_JOIN
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:610   )   debug: check_join_state: 	join-2: Still waiting on 2 integrated nodes
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:682   )   debug: crmd_join_phase_log: 	join-2: bl460g1n7=integrated
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:682   )   debug: crmd_join_phase_log: 	join-2: bl460g1n6=integrated
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:438   )   debug: finalize_sync_callback: 	Notifying 2 clients of join-2 results
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:562   )   debug: finalize_join_for: 	join-2: ACK'ing join request from bl460g1n7
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:66    )    info: crm_update_peer_join: 	finalize_join_for: Node bl460g1n7[3232261593] - join-2 phase 2 -> 3
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:562   )   debug: finalize_join_for: 	join-2: ACK'ing join request from bl460g1n6
Jan 15 15:38:30 [30790] bl460g1n6        cib: (        io.c:596   )   debug: activateCibXml: 	Triggering CIB write for cib_modify op
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:66    )    info: crm_update_peer_join: 	finalize_join_for: Node bl460g1n6[3232261592] - join-2 phase 2 -> 3
Jan 15 15:38:30 [30790] bl460g1n6        cib: ( cib_utils.c:174   )  notice: log_cib_diff: 	cib:diff: Local-only Change: 0.3.1
Jan 15 15:38:30 [30790] bl460g1n6        cib: (       xml.c:1496  )  notice: cib:diff: 	-- <cib admin_epoch="0" epoch="2" num_updates="1"/>
Jan 15 15:38:30 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++       <node id="3232261593" uname="bl460g1n7"/>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:174   )   debug: log_cib_diff: 	Config update: Local-only Change: 0.3.1
Jan 15 15:38:30 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section nodes: OK (rc=0, origin=local/crmd/18, version=0.3.1)
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-- <cib admin_epoch="0" epoch="2" num_updates="1"/>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  <cib epoch="3" num_updates="1" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.8" cib-last-written="Wed Jan 15 15:38:30 2014" update-origin="bl460g1n6" update-client="crmd">
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    <configuration>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      <nodes>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++       <node id="3232261593" uname="bl460g1n7"/>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      </nodes>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    </configuration>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  </cib>
Jan 15 15:38:30 [30790] bl460g1n6        cib: (        io.c:596   )   debug: activateCibXml: 	Triggering CIB write for cib_modify op
Jan 15 15:38:30 [30790] bl460g1n6        cib: (        io.c:738   )    info: write_cib_contents: 	Archived previous version as /var/lib/pacemaker/cib/cib-0.raw
Jan 15 15:38:30 [30790] bl460g1n6        cib: (        io.c:748   )   debug: write_cib_contents: 	Writing CIB to disk
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:174   )   debug: log_cib_diff: 	Config update: Local-only Change: 0.4.1
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-- <cib admin_epoch="0" epoch="3" num_updates="1"/>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  <cib epoch="4" num_updates="1" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.8" cib-last-written="Wed Jan 15 15:38:30 2014" update-origin="bl460g1n6" update-client="crmd">
Jan 15 15:38:30 [30790] bl460g1n6        cib: ( cib_utils.c:174   )  notice: log_cib_diff: 	cib:diff: Local-only Change: 0.4.1
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    <configuration>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      <nodes>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++       <node id="3232261592" uname="bl460g1n6"/>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      </nodes>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    </configuration>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  </cib>
Jan 15 15:38:30 [30790] bl460g1n6        cib: (       xml.c:1496  )  notice: cib:diff: 	-- <cib admin_epoch="0" epoch="3" num_updates="1"/>
Jan 15 15:38:30 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++       <node id="3232261592" uname="bl460g1n6"/>
Jan 15 15:38:30 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section nodes: OK (rc=0, origin=local/crmd/19, version=0.4.1)
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (  messages.c:733   )   debug: handle_request: 	Raising I_JOIN_RESULT: join-2
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (       fsa.c:193   )   debug: s_crmd_fsa: 	Processing I_JOIN_RESULT: [ state=S_FINALIZE_JOIN cause=C_HA_MESSAGE origin=route_message ]
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (join_client.:231   )   debug: do_cl_join_finalize_respond: 	Confirming join join-2: join_ack_nack
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (join_client.:240   )   debug: do_cl_join_finalize_respond: 	join-2: Join complete.  Sending local LRM status to bl460g1n6
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (     utils.c:1011  )    info: erase_status_tag: 	Deleting xpath: //node_state[@uname='bl460g1n6']/transient_attributes
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (     utils.c:1032  )    info: update_attrd_helper: 	Connecting to attrd... 5 retries remaining
Jan 15 15:38:30 [30790] bl460g1n6        cib: (   cib_ops.c:902   )   debug: cib_process_xpath: 	//node_state[@uname='bl460g1n6']/transient_attributes was already removed
Jan 15 15:38:30 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_delete operation for section //node_state[@uname='bl460g1n6']/transient_attributes: OK (rc=0, origin=local/crmd/20, version=0.4.1)
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (       ipc.c:334   )    info: crm_client_new: 	Connecting 0x172e6f0 for uid=189 gid=189 pid=30795 id=ae87db0f-35c1-4af2-b486-2948b0b726b4
Jan 15 15:38:30 [30793] bl460g1n6      attrd: ( ipc_setup.c:484   )   debug: handle_new_connection: 	IPC credentials authenticated (30793-30795-9)
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (   ipc_shm.c:295   )   debug: qb_ipcs_shm_connect: 	connecting to client [30795]
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (     utils.c:2023  )   debug: attrd_update_delegate: 	Sent update: terminate=(null) for bl460g1n6
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (     utils.c:2023  )   debug: attrd_update_delegate: 	Sent update: shutdown=0 for bl460g1n6
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (  commands.c:240   )    info: attrd_client_message: 	Starting an election to determine the writer
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:482   )   debug: do_dc_join_ack: 	Ignoring op=join_ack_nack message from bl460g1n6
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (  election.c:178   )   debug: crm_uptime: 	Current CPU usage is: 0s, 6998us
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (     utils.c:998   )   debug: erase_xpath_callback: 	Deletion of "//node_state[@uname='bl460g1n6']/transient_attributes": OK (rc=0)
Jan 15 15:38:30 [30773] bl460g1n6 corosync debug   [QB    ] ipc_setup.c:handle_new_connection:484 IPC credentials authenticated (30775-30793-33)
Jan 15 15:38:30 [30773] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [30793]
Jan 15 15:38:30 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:30 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:30 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:30 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Jan 15 15:38:30 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7f438576de40
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (       fsa.c:193   )   debug: s_crmd_fsa: 	Processing I_JOIN_RESULT: [ state=S_FINALIZE_JOIN cause=C_HA_MESSAGE origin=route_message ]
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:66    )    info: crm_update_peer_join: 	do_dc_join_ack: Node bl460g1n6[3232261592] - join-2 phase 3 -> 4
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:504   )    info: do_dc_join_ack: 	join-2: Updating node state to member for bl460g1n6
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (     utils.c:1011  )    info: erase_status_tag: 	Deleting xpath: //node_state[@uname='bl460g1n6']/lrm
Jan 15 15:38:30 [30790] bl460g1n6        cib: (   cib_ops.c:902   )   debug: cib_process_xpath: 	//node_state[@uname='bl460g1n6']/lrm was already removed
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:514   )   debug: do_dc_join_ack: 	join-2: Registered callback for LRM update 22
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (      ipcc.c:378   )   debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Jan 15 15:38:30 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_delete operation for section //node_state[@uname='bl460g1n6']/lrm: OK (rc=0, origin=local/crmd/21, version=0.4.1)
Jan 15 15:38:30 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (30775-30793-33)
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-request-30775-30793-33-header
Jan 15 15:38:30 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(30775-30793-33) state:2
Jan 15 15:38:30 [30773] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-response-30775-30793-33-header
Jan 15 15:38:30 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Jan 15 15:38:30 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7f438576de40
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cmap-event-30775-30793-33-header
Jan 15 15:38:30 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Jan 15 15:38:30 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-30775-30793-33-header
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (  corosync.c:134   )  notice: corosync_node_name: 	Unable to get node name for nodeid 3232261592
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (   cluster.c:338   )  notice: get_node_name: 	Defaulting to uname -n for the local corosync node name
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (  election.c:249   )   debug: election_vote: 	Started election 1
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (     utils.c:998   )   debug: erase_xpath_callback: 	Deletion of "//node_state[@uname='bl460g1n6']/lrm": OK (rc=0)
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (  commands.c:244   )    info: attrd_client_message: 	Broadcasting terminate[bl460g1n6] = (null)
Jan 15 15:38:30 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/crmd/22, version=0.4.2)
Jan 15 15:38:30 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-30775-30793-33-header
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:167   )   debug: Config update: 	Diff: --- 0.4.1
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:169   )   debug: Config update: 	Diff: +++ 0.4.2 3dc2cd62588b97d3c339b18bd7aacff6
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (  commands.c:244   )    info: attrd_client_message: 	Broadcasting shutdown[bl460g1n6] = 0
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-- <cib num_updates="1"/>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  <cib epoch="4" num_updates="2" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.8" cib-last-written="Wed Jan 15 15:38:30 2014" update-origin="bl460g1n6" update-client="crmd">
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    <status>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++     <node_state id="3232261592" uname="bl460g1n6" in_ccm="true" crmd="online" crm-debug-origin="do_lrm_query_internal" join="member" expected="member">
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++       <lrm id="3232261592">
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++         <lrm_resources/>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++       </lrm>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++     </node_state>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    </status>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  </cib>
Jan 15 15:38:30 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-30775-30793-33-header
Jan 15 15:38:30 [30790] bl460g1n6        cib: (membership.c:411   )    info: crm_get_peer: 	Node 3232261593 is now known as bl460g1n7
Jan 15 15:38:30 [30790] bl460g1n6        cib: (   cib_ops.c:902   )   debug: cib_process_xpath: 	//node_state[@uname='bl460g1n7']/transient_attributes was already removed
Jan 15 15:38:30 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_delete operation for section //node_state[@uname='bl460g1n7']/transient_attributes: OK (rc=0, origin=bl460g1n7/crmd/9, version=0.4.2)
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (  election.c:359   )   debug: election_count_vote: 	Created voted hash
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (  election.c:492   )   debug: election_count_vote: 	Election 1 (current: 1, owner: 3232261592): Processed vote from bl460g1n6 (Recorded)
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (  election.c:304   )   debug: election_check: 	Still waiting on 1 non-votes (2 total)
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:456   )   debug: join_update_complete_callback: 	Join update 22 complete
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (membership.c:411   )    info: crm_get_peer: 	Node 3232261593 is now known as bl460g1n7
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:579   )   debug: check_join_state: 	Invoked by join_update_complete_callback in state: S_FINALIZE_JOIN
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:615   )   debug: check_join_state: 	join-2: Still waiting on 1 finalized nodes
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:682   )   debug: crmd_join_phase_log: 	join-2: bl460g1n7=finalized
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:682   )   debug: crmd_join_phase_log: 	join-2: bl460g1n6=confirmed
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (  election.c:206   )   debug: crm_compare_age: 	Win: 0.6998 vs 0.5999 (usec)
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (  election.c:497   )    info: election_count_vote: 	Election 1 (owner: 3232261593) pass: vote from bl460g1n7 (Uptime)
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (  election.c:249   )   debug: election_vote: 	Started election 2
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (       fsa.c:193   )   debug: s_crmd_fsa: 	Processing I_JOIN_RESULT: [ state=S_FINALIZE_JOIN cause=C_HA_MESSAGE origin=route_message ]
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:66    )    info: crm_update_peer_join: 	do_dc_join_ack: Node bl460g1n7[3232261593] - join-2 phase 3 -> 4
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:504   )    info: do_dc_join_ack: 	join-2: Updating node state to member for bl460g1n7
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (     utils.c:1011  )    info: erase_status_tag: 	Deleting xpath: //node_state[@uname='bl460g1n7']/lrm
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (  election.c:359   )   debug: election_count_vote: 	Created voted hash
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (  election.c:304   )   debug: election_check: 	Still waiting on 2 non-votes (2 total)
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:514   )   debug: do_dc_join_ack: 	join-2: Registered callback for LRM update 24
Jan 15 15:38:30 [30790] bl460g1n6        cib: (   cib_ops.c:902   )   debug: cib_process_xpath: 	//node_state[@uname='bl460g1n7']/lrm was already removed
Jan 15 15:38:30 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_delete operation for section //node_state[@uname='bl460g1n7']/lrm: OK (rc=0, origin=local/crmd/23, version=0.4.2)
Jan 15 15:38:30 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/crmd/24, version=0.4.3)
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (  election.c:492   )   debug: election_count_vote: 	Election 2 (current: 2, owner: 3232261592): Processed vote from bl460g1n6 (Recorded)
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:167   )   debug: Config update: 	Diff: --- 0.4.2
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (  election.c:304   )   debug: election_check: 	Still waiting on 1 non-votes (2 total)
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:169   )   debug: Config update: 	Diff: +++ 0.4.3 e12ed5b35b69eedd145e2db1944d3e73
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-- <cib num_updates="2"/>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  <cib epoch="4" num_updates="3" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.8" cib-last-written="Wed Jan 15 15:38:30 2014" update-origin="bl460g1n6" update-client="crmd">
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    <status>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++     <node_state id="3232261593" uname="bl460g1n7" in_ccm="true" crmd="online" crm-debug-origin="do_lrm_query_internal" join="member" expected="member">
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++       <lrm id="3232261593">
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++         <lrm_resources/>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++       </lrm>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++     </node_state>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    </status>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  </cib>
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (     utils.c:998   )   debug: erase_xpath_callback: 	Deletion of "//node_state[@uname='bl460g1n7']/lrm": OK (rc=0)
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:456   )   debug: join_update_complete_callback: 	Join update 24 complete
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:579   )   debug: check_join_state: 	Invoked by join_update_complete_callback in state: S_FINALIZE_JOIN
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:619   )   debug: check_join_state: 	join-2 complete: join_update_complete_callback
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (       fsa.c:193   )   debug: s_crmd_fsa: 	Processing I_FINALIZED: [ state=S_FINALIZE_JOIN cause=C_FSA_INTERNAL origin=check_join_state ]
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (       fsa.c:502   )    info: do_state_transition: 	State transition S_FINALIZE_JOIN -> S_POLICY_ENGINE [ input=I_FINALIZED cause=C_FSA_INTERNAL origin=check_join_state ]
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (       fsa.c:610   )   debug: do_state_transition: 	All 2 cluster nodes are eligible to run resources.
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   join_dc.c:634   )   debug: do_dc_join_final: 	Ensuring DC, quorum and node attributes are up-to-date
Jan 15 15:38:30 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section nodes: OK (rc=0, origin=local/crmd/25, version=0.4.3)
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (     utils.c:2023  )   debug: attrd_update_delegate: 	Sent update: (null)=(null) for localhost
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (membership.c:317   )   debug: crm_update_quorum: 	Updating quorum status to true (call=27)
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   tengine.c:150   )   debug: do_te_invoke: 	Cancelling the transition: inactive
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (  te_utils.c:431   )    info: abort_transition_graph: 	do_te_invoke:151 - Triggered transition abort (complete=1) : Peer Cancelled
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (       fsa.c:193   )   debug: s_crmd_fsa: 	Processing I_PE_CALC: [ state=S_POLICY_ENGINE cause=C_FSA_INTERNAL origin=abort_transition_graph ]
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   pengine.c:231   )   debug: do_pe_invoke: 	Query 28: Requesting the current CIB: S_POLICY_ENGINE
Jan 15 15:38:30 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/crmd/26, version=0.4.3)
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:167   )   debug: Config update: 	Diff: --- 0.4.3
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:169   )   debug: Config update: 	Diff: +++ 0.4.4 0f0656cceabc246b17d0a433ce381f8e
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-- <cib num_updates="3"/>
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (te_callbacks:122   )   debug: te_update_diff: 	Processing diff (cib_modify): 0.4.3 -> 0.4.4 (S_POLICY_ENGINE)
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++ <cib epoch="4" num_updates="4" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.8" cib-last-written="Wed Jan 15 15:38:30 2014" update-origin="bl460g1n6" update-client="crmd" have-quorum="1" dc-uuid="3232261592"/>
Jan 15 15:38:30 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section cib: OK (rc=0, origin=local/crmd/27, version=0.4.4)
Jan 15 15:38:30 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crmd/28, version=0.4.4)
Jan 15 15:38:30 [30794] bl460g1n6    pengine: (    unpack.c:93    )   debug: unpack_config: 	STONITH timeout: 60000
Jan 15 15:38:30 [30794] bl460g1n6    pengine: (    unpack.c:97    )   debug: unpack_config: 	STONITH of failed nodes is enabled
Jan 15 15:38:30 [30794] bl460g1n6    pengine: (    unpack.c:104   )   debug: unpack_config: 	Stop all active resources: false
Jan 15 15:38:30 [30794] bl460g1n6    pengine: (    unpack.c:108   )   debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Jan 15 15:38:30 [30794] bl460g1n6    pengine: (    unpack.c:113   )   debug: unpack_config: 	Default stickiness: 0
Jan 15 15:38:30 [30794] bl460g1n6    pengine: (    unpack.c:150   )   debug: unpack_config: 	On loss of CCM Quorum: Stop ALL resources
Jan 15 15:38:30 [30794] bl460g1n6    pengine: (    unpack.c:196   )   debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Jan 15 15:38:30 [30794] bl460g1n6    pengine: (    unpack.c:504   )   debug: unpack_domains: 	Unpacking domains
Jan 15 15:38:30 [30794] bl460g1n6    pengine: (    unpack.c:722   )   error: unpack_resources: 	Resource start-up disabled since no STONITH resources have been defined
Jan 15 15:38:30 [30794] bl460g1n6    pengine: (    unpack.c:723   )   error: unpack_resources: 	Either configure some or disable STONITH with the stonith-enabled option
Jan 15 15:38:30 [30794] bl460g1n6    pengine: (    unpack.c:724   )   error: unpack_resources: 	NOTE: Clusters with shared data need STONITH to ensure data integrity
Jan 15 15:38:30 [30794] bl460g1n6    pengine: (    unpack.c:1215  )    info: determine_online_status_fencing: 	Node bl460g1n6 is active
Jan 15 15:38:30 [30794] bl460g1n6    pengine: (    unpack.c:1331  )    info: determine_online_status: 	Node bl460g1n6 is online
Jan 15 15:38:30 [30794] bl460g1n6    pengine: (    unpack.c:1215  )    info: determine_online_status_fencing: 	Node bl460g1n7 is active
Jan 15 15:38:30 [30794] bl460g1n6    pengine: (    unpack.c:1331  )    info: determine_online_status: 	Node bl460g1n7 is online
Jan 15 15:38:30 [30794] bl460g1n6    pengine: (  allocate.c:1332  )  notice: stage6: 	Delaying fencing operations until there are resources to manage
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   pengine.c:299   )   debug: do_pe_invoke_callback: 	Invoking the PE: query=28, ref=pe_calc-dc-1389767910-13, seq=8, quorate=1
Jan 15 15:38:30 [30794] bl460g1n6    pengine: (     utils.c:1222  )   debug: get_last_sequence: 	Series file /var/lib/pacemaker/pengine/pe-input.last does not exist
Jan 15 15:38:30 [30794] bl460g1n6    pengine: (   pengine.c:178   )  notice: process_pe_message: 	Calculated Transition 0: /var/lib/pacemaker/pengine/pe-input-0.bz2
Jan 15 15:38:30 [30790] bl460g1n6        cib: (        io.c:773   )    info: write_cib_contents: 	Wrote version 0.1.0 of the CIB to disk (digest: 95284c32320f2298eff00c4881b9db37)
Jan 15 15:38:30 [30794] bl460g1n6    pengine: (   pengine.c:183   )  notice: process_pe_message: 	Configuration ERRORs found during PE processing.  Please run "crm_verify -L" to identify issues.
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (       fsa.c:193   )   debug: s_crmd_fsa: 	Processing I_PE_SUCCESS: [ state=S_POLICY_ENGINE cause=C_IPC_MESSAGE origin=handle_response ]
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (       fsa.c:502   )    info: do_state_transition: 	State transition S_POLICY_ENGINE -> S_TRANSITION_ENGINE [ input=I_PE_SUCCESS cause=C_IPC_MESSAGE origin=handle_response ]
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (    unpack.c:230   )   debug: unpack_graph: 	Unpacked transition 0: 2 actions in 2 synapses
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   tengine.c:208   )    info: do_te_invoke: 	Processing graph 0 (ref=pe_calc-dc-1389767910-13) derived from /var/lib/pacemaker/pengine/pe-input-0.bz2
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (te_actions.c:416   )  notice: te_rsc_command: 	Initiating action 3: probe_complete probe_complete on bl460g1n7 - no waiting
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (te_actions.c:454   )    info: te_rsc_command: 	Action 3 confirmed - no wait
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (te_actions.c:416   )  notice: te_rsc_command: 	Initiating action 2: probe_complete probe_complete on bl460g1n6 (local) - no waiting
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (     utils.c:2023  )   debug: attrd_update_delegate: 	Sent update: probe_complete=true for bl460g1n6
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (te_actions.c:454   )    info: te_rsc_command: 	Action 2 confirmed - no wait
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (     graph.c:336   )   debug: run_graph: 	Transition 0 (Complete=0, Pending=0, Fired=2, Skipped=0, Incomplete=0, Source=/var/lib/pacemaker/pengine/pe-input-0.bz2): In-progress
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (  commands.c:244   )    info: attrd_client_message: 	Broadcasting probe_complete[bl460g1n6] = true
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (     graph.c:336   )  notice: run_graph: 	Transition 0 (Complete=2, Pending=0, Fired=0, Skipped=0, Incomplete=0, Source=/var/lib/pacemaker/pengine/pe-input-0.bz2): Complete
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (  te_utils.c:355   )   debug: te_graph_trigger: 	Transition 0 is now complete
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (te_actions.c:654   )   debug: notify_crmd: 	Processing transition completion in state S_TRANSITION_ENGINE
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (te_actions.c:699   )   debug: notify_crmd: 	Transition 0 status: done - <null>
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (       fsa.c:193   )   debug: s_crmd_fsa: 	Processing I_TE_SUCCESS: [ state=S_TRANSITION_ENGINE cause=C_FSA_INTERNAL origin=notify_crmd ]
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (      misc.c:47    )    info: do_log: 	FSA: Input I_TE_SUCCESS from notify_crmd() received in state S_TRANSITION_ENGINE
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (       fsa.c:502   )  notice: do_state_transition: 	State transition S_TRANSITION_ENGINE -> S_IDLE [ input=I_TE_SUCCESS cause=C_FSA_INTERNAL origin=notify_crmd ]
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (       fsa.c:645   )   debug: do_state_transition: 	Starting PEngine Recheck Timer
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (     utils.c:192   )   debug: crm_timer_start: 	Started PEngine Recheck Timer (I_PE_CALC:900000ms), src=41
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (  election.c:492   )   debug: election_count_vote: 	Election 2 (current: 2, owner: 3232261592): Processed no-vote from bl460g1n7 (Recorded)
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (  election.c:47    )    info: election_complete: 	Election election-attrd complete
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (  commands.c:371   )   debug: attrd_peer_sync: 	Syncing shutdown[bl460g1n6] = 0 to everyone
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (  commands.c:371   )   debug: attrd_peer_sync: 	Syncing shutdown[bl460g1n7] = 0 to everyone
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (  commands.c:371   )   debug: attrd_peer_sync: 	Syncing terminate[bl460g1n6] = (null) to everyone
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (  commands.c:371   )   debug: attrd_peer_sync: 	Syncing terminate[bl460g1n7] = (null) to everyone
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (  commands.c:371   )   debug: attrd_peer_sync: 	Syncing probe_complete[bl460g1n6] = true to everyone
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (  commands.c:376   )   debug: attrd_peer_sync: 	Syncing values to everyone
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (  commands.c:739   )   debug: write_attribute: 	Update: bl460g1n6[shutdown]=0 (3232261592 3232261592 3232261592 bl460g1n6)
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (  commands.c:739   )   debug: write_attribute: 	Update: bl460g1n7[shutdown]=0 (3232261593 3232261593 3232261593 bl460g1n7)
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (  commands.c:765   )  notice: write_attribute: 	Sent update 2 with 2 changes for shutdown, id=<n/a>, set=(null)
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (  commands.c:739   )   debug: write_attribute: 	Update: bl460g1n6[terminate]=(null) (3232261592 3232261592 3232261592 bl460g1n6)
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (  commands.c:739   )   debug: write_attribute: 	Update: bl460g1n7[terminate]=(null) (3232261593 3232261593 3232261593 bl460g1n7)
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (  commands.c:765   )  notice: write_attribute: 	Sent update 3 with 2 changes for terminate, id=<n/a>, set=(null)
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (  commands.c:739   )   debug: write_attribute: 	Update: bl460g1n6[probe_complete]=true (3232261592 3232261592 3232261592 bl460g1n6)
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (  commands.c:765   )  notice: write_attribute: 	Sent update 4 with 1 changes for probe_complete, id=<n/a>, set=(null)
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (  commands.c:708   )    info: write_attribute: 	Write out of 'probe_complete' delayed: update 4 in progress
Jan 15 15:38:30 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/attrd/2, version=0.4.5)
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:167   )   debug: Config update: 	Diff: --- 0.4.4
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:169   )   debug: Config update: 	Diff: +++ 0.4.5 105dd2ace834bb7f52e578442fda1ca3
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-- <cib num_updates="4"/>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  <cib epoch="4" num_updates="5" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.8" cib-last-written="Wed Jan 15 15:38:30 2014" update-origin="bl460g1n6" update-client="crmd" have-quorum="1" dc-uuid="3232261592">
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    <status>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      <node_state id="3232261592" uname="bl460g1n6" in_ccm="true" crmd="online" crm-debug-origin="do_state_transition" join="member" expected="member">
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++       <transient_attributes id="3232261592">
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++         <instance_attributes id="status-3232261592">
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++           <nvpair id="status-3232261592-shutdown" name="shutdown" value="0"/>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++         </instance_attributes>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++       </transient_attributes>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      </node_state>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      <node_state id="3232261593" uname="bl460g1n7" in_ccm="true" crmd="online" crm-debug-origin="do_state_transition" join="member" expected="member">
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++       <transient_attributes id="3232261593">
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++         <instance_attributes id="status-3232261593">
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++           <nvpair id="status-3232261593-shutdown" name="shutdown" value="0"/>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++         </instance_attributes>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++       </transient_attributes>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      </node_state>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    </status>
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (te_callbacks:122   )   debug: te_update_diff: 	Processing diff (cib_modify): 0.4.4 -> 0.4.5 (S_IDLE)
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  </cib>
Jan 15 15:38:30 [30790] bl460g1n6        cib: (   cib_ops.c:368   )   debug: cib_process_modify: 	Destroying /cib/status/node_state[1]/transient_attributes/instance_attributes/nvpair[2]
Jan 15 15:38:30 [30790] bl460g1n6        cib: (   cib_ops.c:368   )   debug: cib_process_modify: 	Destroying /cib/status/node_state[2]/transient_attributes/instance_attributes/nvpair[2]
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (  te_utils.c:413   )    info: abort_transition_graph: 	te_update_diff:172 - Triggered transition abort (complete=1, node=bl460g1n6, tag=nvpair, id=status-3232261592-shutdown, name=shutdown, value=0, magic=NA, cib=0.4.5) : Transient attribute: update
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause   <nvpair id="status-3232261592-shutdown" name="shutdown" value="0"/>
Jan 15 15:38:30 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/attrd/3, version=0.4.5)
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (       fsa.c:193   )   debug: s_crmd_fsa: 	Processing I_PE_CALC: [ state=S_IDLE cause=C_FSA_INTERNAL origin=abort_transition_graph ]
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (       fsa.c:502   )  notice: do_state_transition: 	State transition S_IDLE -> S_POLICY_ENGINE [ input=I_PE_CALC cause=C_FSA_INTERNAL origin=abort_transition_graph ]
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (       fsa.c:610   )   debug: do_state_transition: 	All 2 cluster nodes are eligible to run resources.
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   pengine.c:231   )   debug: do_pe_invoke: 	Query 29: Requesting the current CIB: S_POLICY_ENGINE
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (  commands.c:590   )    info: attrd_cib_callback: 	Update 3 for terminate: OK (0)
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (  commands.c:594   )  notice: attrd_cib_callback: 	Update 3 for terminate[bl460g1n6]=(null): OK (0)
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (  commands.c:594   )  notice: attrd_cib_callback: 	Update 3 for terminate[bl460g1n7]=(null): OK (0)
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (te_callbacks:122   )   debug: te_update_diff: 	Processing diff (cib_modify): 0.4.5 -> 0.4.6 (S_POLICY_ENGINE)
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:167   )   debug: Config update: 	Diff: --- 0.4.5
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:169   )   debug: Config update: 	Diff: +++ 0.4.6 e5ca680d23c5b5ecbafb5f9f85ba12aa
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-- <cib num_updates="5"/>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  <cib epoch="4" num_updates="6" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.8" cib-last-written="Wed Jan 15 15:38:30 2014" update-origin="bl460g1n6" update-client="crmd" have-quorum="1" dc-uuid="3232261592">
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    <status>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      <node_state id="3232261592" uname="bl460g1n6" in_ccm="true" crmd="online" crm-debug-origin="do_state_transition" join="member" expected="member">
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        <transient_attributes id="3232261592">
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+          <instance_attributes id="status-3232261592">
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++           <nvpair id="status-3232261592-probe_complete" name="probe_complete" value="true"/>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+          </instance_attributes>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        </transient_attributes>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      </node_state>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    </status>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  </cib>
Jan 15 15:38:30 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/attrd/4, version=0.4.6)
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (  commands.c:590   )    info: attrd_cib_callback: 	Update 2 for shutdown: OK (0)
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (  commands.c:594   )  notice: attrd_cib_callback: 	Update 2 for shutdown[bl460g1n6]=0: OK (0)
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (  commands.c:594   )  notice: attrd_cib_callback: 	Update 2 for shutdown[bl460g1n7]=0: OK (0)
Jan 15 15:38:30 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crmd/29, version=0.4.6)
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (  commands.c:590   )    info: attrd_cib_callback: 	Update 4 for probe_complete: OK (0)
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (  commands.c:594   )  notice: attrd_cib_callback: 	Update 4 for probe_complete[bl460g1n6]=true: OK (0)
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (  commands.c:594   )  notice: attrd_cib_callback: 	Update 4 for probe_complete[bl460g1n7]=(null): OK (0)
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (  commands.c:739   )   debug: write_attribute: 	Update: bl460g1n6[probe_complete]=true (3232261592 3232261592 3232261592 bl460g1n6)
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (  commands.c:739   )   debug: write_attribute: 	Update: bl460g1n7[probe_complete]=true (3232261593 3232261593 3232261593 bl460g1n7)
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (  commands.c:765   )  notice: write_attribute: 	Sent update 5 with 2 changes for probe_complete, id=<n/a>, set=(null)
Jan 15 15:38:30 [30794] bl460g1n6    pengine: (    unpack.c:93    )   debug: unpack_config: 	STONITH timeout: 60000
Jan 15 15:38:30 [30794] bl460g1n6    pengine: (    unpack.c:97    )   debug: unpack_config: 	STONITH of failed nodes is enabled
Jan 15 15:38:30 [30794] bl460g1n6    pengine: (    unpack.c:104   )   debug: unpack_config: 	Stop all active resources: false
Jan 15 15:38:30 [30794] bl460g1n6    pengine: (    unpack.c:108   )   debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Jan 15 15:38:30 [30794] bl460g1n6    pengine: (    unpack.c:113   )   debug: unpack_config: 	Default stickiness: 0
Jan 15 15:38:30 [30794] bl460g1n6    pengine: (    unpack.c:150   )   debug: unpack_config: 	On loss of CCM Quorum: Stop ALL resources
Jan 15 15:38:30 [30794] bl460g1n6    pengine: (    unpack.c:196   )   debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Jan 15 15:38:30 [30794] bl460g1n6    pengine: (    unpack.c:504   )   debug: unpack_domains: 	Unpacking domains
Jan 15 15:38:30 [30794] bl460g1n6    pengine: (    unpack.c:722   )   error: unpack_resources: 	Resource start-up disabled since no STONITH resources have been defined
Jan 15 15:38:30 [30794] bl460g1n6    pengine: (    unpack.c:723   )   error: unpack_resources: 	Either configure some or disable STONITH with the stonith-enabled option
Jan 15 15:38:30 [30794] bl460g1n6    pengine: (    unpack.c:724   )   error: unpack_resources: 	NOTE: Clusters with shared data need STONITH to ensure data integrity
Jan 15 15:38:30 [30794] bl460g1n6    pengine: (    unpack.c:1215  )    info: determine_online_status_fencing: 	Node bl460g1n6 is active
Jan 15 15:38:30 [30794] bl460g1n6    pengine: (    unpack.c:1331  )    info: determine_online_status: 	Node bl460g1n6 is online
Jan 15 15:38:30 [30794] bl460g1n6    pengine: (    unpack.c:1215  )    info: determine_online_status_fencing: 	Node bl460g1n7 is active
Jan 15 15:38:30 [30794] bl460g1n6    pengine: (    unpack.c:1331  )    info: determine_online_status: 	Node bl460g1n7 is online
Jan 15 15:38:30 [30794] bl460g1n6    pengine: (  allocate.c:1332  )  notice: stage6: 	Delaying fencing operations until there are resources to manage
Jan 15 15:38:30 [30794] bl460g1n6    pengine: (   pengine.c:178   )  notice: process_pe_message: 	Calculated Transition 1: /var/lib/pacemaker/pengine/pe-input-1.bz2
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   pengine.c:299   )   debug: do_pe_invoke_callback: 	Invoking the PE: query=29, ref=pe_calc-dc-1389767910-16, seq=8, quorate=1
Jan 15 15:38:30 [30794] bl460g1n6    pengine: (   pengine.c:183   )  notice: process_pe_message: 	Configuration ERRORs found during PE processing.  Please run "crm_verify -L" to identify issues.
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (       fsa.c:193   )   debug: s_crmd_fsa: 	Processing I_PE_SUCCESS: [ state=S_POLICY_ENGINE cause=C_IPC_MESSAGE origin=handle_response ]
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:167   )   debug: Config update: 	Diff: --- 0.4.6
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:169   )   debug: Config update: 	Diff: +++ 0.4.7 2242d96e5991e7fd55419171f47bba8d
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (       fsa.c:502   )    info: do_state_transition: 	State transition S_POLICY_ENGINE -> S_TRANSITION_ENGINE [ input=I_PE_SUCCESS cause=C_IPC_MESSAGE origin=handle_response ]
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-- <cib num_updates="6"/>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  <cib epoch="4" num_updates="7" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.8" cib-last-written="Wed Jan 15 15:38:30 2014" update-origin="bl460g1n6" update-client="crmd" have-quorum="1" dc-uuid="3232261592">
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    <status>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      <node_state id="3232261593" uname="bl460g1n7" in_ccm="true" crmd="online" crm-debug-origin="do_state_transition" join="member" expected="member">
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        <transient_attributes id="3232261593">
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (    unpack.c:230   )   debug: unpack_graph: 	Unpacked transition 1: 1 actions in 1 synapses
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+          <instance_attributes id="status-3232261593">
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++           <nvpair id="status-3232261593-probe_complete" name="probe_complete" value="true"/>
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (   tengine.c:208   )    info: do_te_invoke: 	Processing graph 1 (ref=pe_calc-dc-1389767910-16) derived from /var/lib/pacemaker/pengine/pe-input-1.bz2
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+          </instance_attributes>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        </transient_attributes>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      </node_state>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    </status>
Jan 15 15:38:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  </cib>
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (te_callbacks:122   )   debug: te_update_diff: 	Processing diff (cib_modify): 0.4.6 -> 0.4.7 (S_TRANSITION_ENGINE)
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (te_actions.c:416   )  notice: te_rsc_command: 	Initiating action 3: probe_complete probe_complete on bl460g1n7 - no waiting
Jan 15 15:38:30 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/attrd/5, version=0.4.7)
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (te_actions.c:454   )    info: te_rsc_command: 	Action 3 confirmed - no wait
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (     graph.c:336   )   debug: run_graph: 	Transition 1 (Complete=0, Pending=0, Fired=1, Skipped=0, Incomplete=0, Source=/var/lib/pacemaker/pengine/pe-input-1.bz2): In-progress
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (     graph.c:336   )  notice: run_graph: 	Transition 1 (Complete=1, Pending=0, Fired=0, Skipped=0, Incomplete=0, Source=/var/lib/pacemaker/pengine/pe-input-1.bz2): Complete
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (  te_utils.c:355   )   debug: te_graph_trigger: 	Transition 1 is now complete
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (te_actions.c:654   )   debug: notify_crmd: 	Processing transition completion in state S_TRANSITION_ENGINE
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (te_actions.c:699   )   debug: notify_crmd: 	Transition 1 status: done - <null>
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (       fsa.c:193   )   debug: s_crmd_fsa: 	Processing I_TE_SUCCESS: [ state=S_TRANSITION_ENGINE cause=C_FSA_INTERNAL origin=notify_crmd ]
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (      misc.c:47    )    info: do_log: 	FSA: Input I_TE_SUCCESS from notify_crmd() received in state S_TRANSITION_ENGINE
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (       fsa.c:502   )  notice: do_state_transition: 	State transition S_TRANSITION_ENGINE -> S_IDLE [ input=I_TE_SUCCESS cause=C_FSA_INTERNAL origin=notify_crmd ]
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (       fsa.c:645   )   debug: do_state_transition: 	Starting PEngine Recheck Timer
Jan 15 15:38:30 [30795] bl460g1n6       crmd: (     utils.c:192   )   debug: crm_timer_start: 	Started PEngine Recheck Timer (I_PE_CALC:900000ms), src=43
Jan 15 15:38:30 [30790] bl460g1n6        cib: (        io.c:781   )   debug: write_cib_contents: 	Wrote digest 95284c32320f2298eff00c4881b9db37 to disk
Jan 15 15:38:30 [30790] bl460g1n6        cib: (        io.c:259   )    info: retrieveCib: 	Reading cluster configuration from: /var/lib/pacemaker/cib/cib.am6WU2 (digest: /var/lib/pacemaker/cib/cib.lVkKJz)
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (  commands.c:590   )    info: attrd_cib_callback: 	Update 5 for probe_complete: OK (0)
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (  commands.c:594   )  notice: attrd_cib_callback: 	Update 5 for probe_complete[bl460g1n6]=true: OK (0)
Jan 15 15:38:30 [30793] bl460g1n6      attrd: (  commands.c:594   )  notice: attrd_cib_callback: 	Update 5 for probe_complete[bl460g1n7]=true: OK (0)
Jan 15 15:38:30 [30790] bl460g1n6        cib: (        io.c:786   )   debug: write_cib_contents: 	Activating /var/lib/pacemaker/cib/cib.am6WU2
Jan 15 15:38:30 [30790] bl460g1n6        cib: (        io.c:738   )    info: write_cib_contents: 	Archived previous version as /var/lib/pacemaker/cib/cib-1.raw
Jan 15 15:38:30 [30790] bl460g1n6        cib: (        io.c:748   )   debug: write_cib_contents: 	Writing CIB to disk
Jan 15 15:38:30 [30790] bl460g1n6        cib: (        io.c:773   )    info: write_cib_contents: 	Wrote version 0.4.0 of the CIB to disk (digest: 034d1fed1360797812b0c9fb59cc7300)
Jan 15 15:38:30 [30790] bl460g1n6        cib: (        io.c:781   )   debug: write_cib_contents: 	Wrote digest 034d1fed1360797812b0c9fb59cc7300 to disk
Jan 15 15:38:30 [30790] bl460g1n6        cib: (        io.c:259   )    info: retrieveCib: 	Reading cluster configuration from: /var/lib/pacemaker/cib/cib.8yLEVf (digest: /var/lib/pacemaker/cib/cib.cPS3aN)
Jan 15 15:38:30 [30790] bl460g1n6        cib: (        io.c:786   )   debug: write_cib_contents: 	Activating /var/lib/pacemaker/cib/cib.8yLEVf
Jan 15 15:38:39 [30795] bl460g1n6       crmd: (  throttle.c:263   )   debug: throttle_cib_load: 	Init 10 + 8 ticks at 1389767919 (100 tps)
Jan 15 15:38:39 [30795] bl460g1n6       crmd: (  throttle.c:302   )   debug: throttle_load_avg: 	Current load is 0.000000 (full: 0.00 0.02 0.00 1/399 30807)
Jan 15 15:38:39 [30795] bl460g1n6       crmd: (  throttle.c:382   )   debug: throttle_io_load: 	Current IO load is 0.000000
Jan 15 15:38:39 [30795] bl460g1n6       crmd: (  throttle.c:542   )   debug: throttle_timer_cb: 	New throttle mode: 0000 (was ffffffff)
Jan 15 15:38:39 [30795] bl460g1n6       crmd: (  throttle.c:516   )    info: throttle_send_command: 	Updated throttle state to 0000
Jan 15 15:38:39 [30795] bl460g1n6       crmd: (  throttle.c:713   )   debug: throttle_update: 	Host bl460g1n7 supports a maximum of 16 jobs and throttle mode 0000.  New job limit is 16
Jan 15 15:38:39 [30795] bl460g1n6       crmd: (  throttle.c:713   )   debug: throttle_update: 	Host bl460g1n6 supports a maximum of 16 jobs and throttle mode 0000.  New job limit is 16
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       ipc.c:334   )    info: crm_client_new: 	Connecting 0x19816b0 for uid=0 gid=0 pid=30814 id=6c9c9595-0b4f-4ff9-955a-89dcd57d2667
Jan 15 15:38:46 [30790] bl460g1n6        cib: ( ipc_setup.c:484   )   debug: handle_new_connection: 	IPC credentials authenticated (30790-30814-14)
Jan 15 15:38:46 [30790] bl460g1n6        cib: (   ipc_shm.c:295   )   debug: qb_ipcs_shm_connect: 	connecting to client [30814]
Jan 15 15:38:46 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:46 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:46 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:46 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/cibadmin/2, version=0.4.7)
Jan 15 15:38:46 [30790] bl460g1n6        cib: (      ipcs.c:757   )   debug: qb_ipcs_dispatch_connection_request: 	HUP conn (30790-30814-14)
Jan 15 15:38:46 [30790] bl460g1n6        cib: (      ipcs.c:605   )   debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(30790-30814-14) state:2
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       ipc.c:368   )    info: crm_client_destroy: 	Destroying 0 events
Jan 15 15:38:46 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-30790-30814-14-header
Jan 15 15:38:46 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-30790-30814-14-header
Jan 15 15:38:46 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-30790-30814-14-header
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       ipc.c:334   )    info: crm_client_new: 	Connecting 0x19816b0 for uid=0 gid=0 pid=30815 id=8f5f8321-a9c8-4cf1-b9e5-8d53897971b7
Jan 15 15:38:46 [30790] bl460g1n6        cib: ( ipc_setup.c:484   )   debug: handle_new_connection: 	IPC credentials authenticated (30790-30815-14)
Jan 15 15:38:46 [30790] bl460g1n6        cib: (   ipc_shm.c:295   )   debug: qb_ipcs_shm_connect: 	connecting to client [30815]
Jan 15 15:38:46 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:46 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:46 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:46 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/cibadmin/2, version=0.4.7)
Jan 15 15:38:46 [30790] bl460g1n6        cib: (      ipcs.c:757   )   debug: qb_ipcs_dispatch_connection_request: 	HUP conn (30790-30815-14)
Jan 15 15:38:46 [30790] bl460g1n6        cib: (      ipcs.c:605   )   debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(30790-30815-14) state:2
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       ipc.c:368   )    info: crm_client_destroy: 	Destroying 0 events
Jan 15 15:38:46 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-30790-30815-14-header
Jan 15 15:38:46 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-30790-30815-14-header
Jan 15 15:38:46 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-30790-30815-14-header
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       ipc.c:334   )    info: crm_client_new: 	Connecting 0x19816b0 for uid=0 gid=0 pid=30836 id=cd4959a7-2817-46b9-ab1b-f20e26df76a7
Jan 15 15:38:46 [30790] bl460g1n6        cib: ( ipc_setup.c:484   )   debug: handle_new_connection: 	IPC credentials authenticated (30790-30836-14)
Jan 15 15:38:46 [30790] bl460g1n6        cib: (   ipc_shm.c:295   )   debug: qb_ipcs_shm_connect: 	connecting to client [30836]
Jan 15 15:38:46 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:46 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:46 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:46 [30790] bl460g1n6        cib: (        io.c:596   )   debug: activateCibXml: 	Triggering CIB write for cib_replace op
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (te_callbacks:122   )   debug: te_update_diff: 	Processing diff (cib_replace): 0.4.7 -> 0.5.1 (S_IDLE)
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:419   )    info: abort_transition_graph: 	te_update_diff:126 - Triggered transition abort (complete=1, node=, tag=diff, id=(null), magic=NA, cib=0.5.1) : Non-status change
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause   <diff crm_feature_set="3.0.8" digest="2b4b612c2449664636a2d704509d07d1">
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause     <diff-removed admin_epoch="0" epoch="4" num_updates="7">
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause       <cib admin_epoch="0" epoch="4" num_updates="7">
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause         <configuration>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause           <crm_config>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause             <cluster_property_set id="cib-bootstrap-options">
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause               <nvpair id="cib-bootstrap-options-dc-version" name="dc-version" value="1.1.11-0.27.b48276b.git.el6-b48276b" __crm_diff_marker__="removed:top"/>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause               <nvpair id="cib-bootstrap-options-cluster-infrastructure" name="cluster-infrastructure" value="corosync" __crm_diff_marker__="removed:top"/>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause             </cluster_property_set>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause           </crm_config>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause         </configuration>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause       </cib>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause     </diff-removed>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause     <diff-added>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause       <cib epoch="5" num_updates="1" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.8" cib-last-written="Wed Jan 15 15:38:46 2014" update-origin="bl460g1n6" update-client="cibadmin" have-quorum="1" dc-uuid="3232261592">
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause         <configuration>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause           <crm_config>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause             <cluster_property_set id="cib-bootstrap-options">
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause               <nvpair name="no-quorum-policy" value="ignore" id="cib-bootstrap-options-no-quorum-policy" __crm_diff_marker__="added:top"/>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause               <nvpair name="stonith-enabled" value="false" id="cib-bootstrap-options-stonith-enabled" __crm_diff_marker__="added:top"/>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause               <nvpair name="startup-fencing" value="false" id="cib-bootstrap-options-startup-fencing" __crm_diff_marker__="added:top"/>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause               <nvpair name="crmd-transition-delay" value="2s" id="cib-bootstrap-options-crmd-transition-delay" __crm_diff_marker__="added:top"/>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause             </cluster_property_set>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause           </crm_config>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause           <resources>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause             <primitive id="prmVM2" class="ocf" provider="heartbeat" type="VirtualDomain" __crm_diff_marker__="added:top">
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause               <meta_attributes id="prmVM2-meta_attributes">
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause                 <nvpair name="allow-migrate" value="true" id="prmVM2-meta_attributes-allow-migrate"/>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause               </meta_attributes>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause               <instance_attributes id="prmVM2-instance_attributes">
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause                 <nvpair name="config" value="/migrate_test/config/vm2.xml" id="prmVM2-instance_attributes-config"/>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause                 <nvpair name="hypervisor" value="qemu:///system" id="prmVM2-instance_attributes-hypervisor"/>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause                 <nvpair name="migration_transport" value="ssh" id="prmVM2-instance_attributes-migration_transport"/>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause               </instance_attributes>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause               <operations>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause                 <op name="start" interval="0s" timeout="120s" on-fail="restart" id="prmVM2-start-0s"/>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause                 <op name="monitor" interval="10s" timeout="30s" on-fail="restart" id="prmVM2-monitor-10s"/>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause                 <op name="stop" interval="0s" timeout="120s" on-fail="block" id="prmVM2-stop-0s"/>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause                 <op name="migrate_to" interval="0s" timeout="120s" on-fail="restart" id="prmVM2-migrate_to-0s"/>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause                 <op name="migrate_from" interval="0s" timeout="120s" on-fail="restart" id="prmVM2-migrate_from-0s"/>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause               </operations>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause             </primitive>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause             <clone id="clnPing" __crm_diff_marker__="added:top">
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause               <primitive id="prmPing" class="ocf" provider="pacemaker" type="ping">
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause                 <instance_attributes id="prmPing-instance_attributes">
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause                   <nvpair name="name" value="default_ping_set" id="prmPing-instance_attributes-name"/>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause                   <nvpair name="host_list" value="192.168.201.254" id="prmPing-instance_attributes-host_list"/>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause                   <nvpair name="multiplier" value="100" id="prmPing-instance_attributes-multiplier"/>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause                   <nvpair name="attempts" value="2" id="prmPing-instance_attributes-attempts"/>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause                   <nvpair name="timeout" value="2" id="prmPing-instance_attributes-timeout"/>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause                 </instance_attributes>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause                 <operations>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause                   <op name="start" interval="0s" timeout="60s" on-fail="restart" id="prmPing-start-0s"/>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause                   <op name="monitor" interval="10s" timeout="60s" on-fail="restart" id="prmPing-monitor-10s"/>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause                   <op name="stop" interval="0s" timeout="60s" on-fail="ignore" id="prmPing-stop-0s"/>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause                 </operations>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause               </primitive>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause             </clone>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause           </resources>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause           <constraints>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause             <rsc_location id="l2" rsc="prmVM2" __crm_diff_marker__="added:top">
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause               <rule score="200" id="l2-rule">
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause                 <expression attribute="#uname" operation="eq" value="bl460g1n6" id="l2-expression"/>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause               </rule>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause               <rule score="100" id="l2-rule-0">
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause                 <expression attribute="#uname" operation="eq" value="bl460g1n7" id="l2-expression-0"/>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause               </rule>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause               <rule score="-INFINITY" boolean-op="or" id="l2-rule-1">
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause                 <expression operation="not_defined" attribute="default_ping_set" id="l2-expression-1"/>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause                 <expression attribute="default_ping_set" operation="lt" value="100" id="l2-expression-2"/>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause               </rule>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause             </rsc_location>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause             <rsc_colocation id="c4" score="INFINITY" rsc="prmVM2" with-rsc="clnPing" __crm_diff_marker__="added:top"/>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause             <rsc_order id="o4" score="0" first="clnPing" then="prmVM2" symmetrical="false" __crm_diff_marker__="added:top"/>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause           </constraints>
Jan 15 15:38:46 [30790] bl460g1n6        cib: (    notify.c:362   )    info: cib_replace_notify: 	Replaced: 0.4.7 -> 0.5.1 from bl460g1n6
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause           <rsc_defaults __crm_diff_marker__="added:top">
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause             <meta_attributes id="rsc-options">
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause               <nvpair name="resource-stickiness" value="INFINITY" id="rsc-options-resource-stickiness"/>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause               <nvpair name="migration-threshold" value="1" id="rsc-options-migration-threshold"/>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause             </meta_attributes>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause           </rsc_defaults>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause         </configuration>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause       </cib>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause     </diff-added>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause   </diff>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:167   )   debug: Config update: 	Diff: --- 0.4.7
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:169   )   debug: Config update: 	Diff: +++ 0.5.1 2b4b612c2449664636a2d704509d07d1
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (       fsa.c:193   )   debug: s_crmd_fsa: 	Processing I_PE_CALC: [ state=S_IDLE cause=C_FSA_INTERNAL origin=abort_transition_graph ]
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-  <cib admin_epoch="0" epoch="4" num_updates="7">
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-    <configuration>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-      <crm_config>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-        <cluster_property_set id="cib-bootstrap-options">
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (       fsa.c:502   )  notice: do_state_transition: 	State transition S_IDLE -> S_POLICY_ENGINE [ input=I_PE_CALC cause=C_FSA_INTERNAL origin=abort_transition_graph ]
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	--         <nvpair id="cib-bootstrap-options-dc-version" name="dc-version" value="1.1.11-0.27.b48276b.git.el6-b48276b"/>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	--         <nvpair id="cib-bootstrap-options-cluster-infrastructure" name="cluster-infrastructure" value="corosync"/>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-        </cluster_property_set>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-      </crm_config>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-    </configuration>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (       fsa.c:610   )   debug: do_state_transition: 	All 2 cluster nodes are eligible to run resources.
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-  </cib>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  <cib epoch="5" num_updates="1" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.8" cib-last-written="Wed Jan 15 15:38:46 2014" update-origin="bl460g1n6" update-client="cibadmin" have-quorum="1" dc-uuid="3232261592">
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    <configuration>
Jan 15 15:38:46 [30793] bl460g1n6      attrd: (      main.c:110   )  notice: attrd_cib_replaced_cb: 	Updating all attributes after cib_refresh_notify event
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      <crm_config>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        <cluster_property_set id="cib-bootstrap-options">
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   pengine.c:231   )   debug: do_pe_invoke: 	Query 30: Requesting the current CIB: S_POLICY_ENGINE
Jan 15 15:38:46 [30793] bl460g1n6      attrd: (  commands.c:739   )   debug: write_attribute: 	Update: bl460g1n6[shutdown]=0 (3232261592 3232261592 3232261592 bl460g1n6)
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++         <nvpair name="no-quorum-policy" value="ignore" id="cib-bootstrap-options-no-quorum-policy"/>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++         <nvpair name="stonith-enabled" value="false" id="cib-bootstrap-options-stonith-enabled"/>
Jan 15 15:38:46 [30793] bl460g1n6      attrd: (  commands.c:739   )   debug: write_attribute: 	Update: bl460g1n7[shutdown]=0 (3232261593 3232261593 3232261593 bl460g1n7)
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++         <nvpair name="startup-fencing" value="false" id="cib-bootstrap-options-startup-fencing"/>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++         <nvpair name="crmd-transition-delay" value="2s" id="cib-bootstrap-options-crmd-transition-delay"/>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        </cluster_property_set>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      </crm_config>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      <resources>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++       <primitive id="prmVM2" class="ocf" provider="heartbeat" type="VirtualDomain">
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++         <meta_attributes id="prmVM2-meta_attributes">
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++           <nvpair name="allow-migrate" value="true" id="prmVM2-meta_attributes-allow-migrate"/>
Jan 15 15:38:46 [30793] bl460g1n6      attrd: (  commands.c:765   )  notice: write_attribute: 	Sent update 6 with 2 changes for shutdown, id=<n/a>, set=(null)
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++         </meta_attributes>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++         <instance_attributes id="prmVM2-instance_attributes">
Jan 15 15:38:46 [30793] bl460g1n6      attrd: (  commands.c:739   )   debug: write_attribute: 	Update: bl460g1n6[terminate]=(null) (3232261592 3232261592 3232261592 bl460g1n6)
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++           <nvpair name="config" value="/migrate_test/config/vm2.xml" id="prmVM2-instance_attributes-config"/>
Jan 15 15:38:46 [30793] bl460g1n6      attrd: (  commands.c:739   )   debug: write_attribute: 	Update: bl460g1n7[terminate]=(null) (3232261593 3232261593 3232261593 bl460g1n7)
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++           <nvpair name="hypervisor" value="qemu:///system" id="prmVM2-instance_attributes-hypervisor"/>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++           <nvpair name="migration_transport" value="ssh" id="prmVM2-instance_attributes-migration_transport"/>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++         </instance_attributes>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++         <operations>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++           <op name="start" interval="0s" timeout="120s" on-fail="restart" id="prmVM2-start-0s"/>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++           <op name="monitor" interval="10s" timeout="30s" on-fail="restart" id="prmVM2-monitor-10s"/>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++           <op name="stop" interval="0s" timeout="120s" on-fail="block" id="prmVM2-stop-0s"/>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++           <op name="migrate_to" interval="0s" timeout="120s" on-fail="restart" id="prmVM2-migrate_to-0s"/>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++           <op name="migrate_from" interval="0s" timeout="120s" on-fail="restart" id="prmVM2-migrate_from-0s"/>
Jan 15 15:38:46 [30793] bl460g1n6      attrd: (  commands.c:765   )  notice: write_attribute: 	Sent update 7 with 2 changes for terminate, id=<n/a>, set=(null)
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++         </operations>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++       </primitive>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++       <clone id="clnPing">
Jan 15 15:38:46 [30793] bl460g1n6      attrd: (  commands.c:739   )   debug: write_attribute: 	Update: bl460g1n6[probe_complete]=true (3232261592 3232261592 3232261592 bl460g1n6)
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++         <primitive id="prmPing" class="ocf" provider="pacemaker" type="ping">
Jan 15 15:38:46 [30793] bl460g1n6      attrd: (  commands.c:739   )   debug: write_attribute: 	Update: bl460g1n7[probe_complete]=true (3232261593 3232261593 3232261593 bl460g1n7)
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++           <instance_attributes id="prmPing-instance_attributes">
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++             <nvpair name="name" value="default_ping_set" id="prmPing-instance_attributes-name"/>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++             <nvpair name="host_list" value="192.168.201.254" id="prmPing-instance_attributes-host_list"/>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++             <nvpair name="multiplier" value="100" id="prmPing-instance_attributes-multiplier"/>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++             <nvpair name="attempts" value="2" id="prmPing-instance_attributes-attempts"/>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++             <nvpair name="timeout" value="2" id="prmPing-instance_attributes-timeout"/>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++           </instance_attributes>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++           <operations>
Jan 15 15:38:46 [30793] bl460g1n6      attrd: (  commands.c:765   )  notice: write_attribute: 	Sent update 8 with 2 changes for probe_complete, id=<n/a>, set=(null)
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++             <op name="start" interval="0s" timeout="60s" on-fail="restart" id="prmPing-start-0s"/>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++             <op name="monitor" interval="10s" timeout="60s" on-fail="restart" id="prmPing-monitor-10s"/>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++             <op name="stop" interval="0s" timeout="60s" on-fail="ignore" id="prmPing-stop-0s"/>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++           </operations>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++         </primitive>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++       </clone>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      </resources>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      <constraints>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++       <rsc_location id="l2" rsc="prmVM2">
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++         <rule score="200" id="l2-rule">
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++           <expression attribute="#uname" operation="eq" value="bl460g1n6" id="l2-expression"/>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++         </rule>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++         <rule score="100" id="l2-rule-0">
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++           <expression attribute="#uname" operation="eq" value="bl460g1n7" id="l2-expression-0"/>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++         </rule>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++         <rule score="-INFINITY" boolean-op="or" id="l2-rule-1">
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++           <expression operation="not_defined" attribute="default_ping_set" id="l2-expression-1"/>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++           <expression attribute="default_ping_set" operation="lt" value="100" id="l2-expression-2"/>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++         </rule>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++       </rsc_location>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++       <rsc_colocation id="c4" score="INFINITY" rsc="prmVM2" with-rsc="clnPing"/>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++       <rsc_order id="o4" score="0" first="clnPing" then="prmVM2" symmetrical="false"/>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      </constraints>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++     <rsc_defaults>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++       <meta_attributes id="rsc-options">
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++         <nvpair name="resource-stickiness" value="INFINITY" id="rsc-options-resource-stickiness"/>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++         <nvpair name="migration-threshold" value="1" id="rsc-options-migration-threshold"/>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++       </meta_attributes>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++     </rsc_defaults>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    </configuration>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  </cib>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (       cib.c:110   )   debug: do_cib_replaced: 	Updating the CIB after a replace: DC=true
Jan 15 15:38:46 [30790] bl460g1n6        cib: ( cib_utils.c:167   )  notice: cib:diff: 	Diff: --- 0.4.7
Jan 15 15:38:46 [30790] bl460g1n6        cib: ( cib_utils.c:169   )  notice: cib:diff: 	Diff: +++ 0.5.1 2b4b612c2449664636a2d704509d07d1
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1496  )  notice: cib:diff: 	--         <nvpair id="cib-bootstrap-options-dc-version" name="dc-version" value="1.1.11-0.27.b48276b.git.el6-b48276b"/>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (       fsa.c:193   )   debug: s_crmd_fsa: 	Processing I_ELECTION: [ state=S_POLICY_ENGINE cause=C_FSA_INTERNAL origin=do_cib_replaced ]
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (       fsa.c:502   )    info: do_state_transition: 	State transition S_POLICY_ENGINE -> S_ELECTION [ input=I_ELECTION cause=C_FSA_INTERNAL origin=do_cib_replaced ]
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1496  )  notice: cib:diff: 	--         <nvpair id="cib-bootstrap-options-cluster-infrastructure" name="cluster-infrastructure" value="corosync"/>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (     utils.c:984   )    info: update_dc: 	Unset DC. Was bl460g1n6
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++         <nvpair name="no-quorum-policy" value="ignore" id="cib-bootstrap-options-no-quorum-policy"/>
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++         <nvpair name="stonith-enabled" value="false" id="cib-bootstrap-options-stonith-enabled"/>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  election.c:178   )   debug: crm_uptime: 	Current CPU usage is: 0s, 38994us
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++         <nvpair name="startup-fencing" value="false" id="cib-bootstrap-options-startup-fencing"/>
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++         <nvpair name="crmd-transition-delay" value="2s" id="cib-bootstrap-options-crmd-transition-delay"/>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (      main.c:800   )    info: update_cib_stonith_devices: 	Updating device list from the cib: new location constraint
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++       <primitive id="prmVM2" class="ocf" provider="heartbeat" type="VirtualDomain">
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  election.c:249   )   debug: election_vote: 	Started election 3
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++         <meta_attributes id="prmVM2-meta_attributes">
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++           <nvpair name="allow-migrate" value="true" id="prmVM2-meta_attributes-allow-migrate"/>
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++         </meta_attributes>
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++         <instance_attributes id="prmVM2-instance_attributes">
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++           <nvpair name="config" value="/migrate_test/config/vm2.xml" id="prmVM2-instance_attributes-config"/>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (    unpack.c:93    )   debug: unpack_config: 	STONITH timeout: 60000
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++           <nvpair name="hypervisor" value="qemu:///system" id="prmVM2-instance_attributes-hypervisor"/>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (    unpack.c:97    )   debug: unpack_config: 	STONITH of failed nodes is disabled
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (    unpack.c:104   )   debug: unpack_config: 	Stop all active resources: false
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++           <nvpair name="migration_transport" value="ssh" id="prmVM2-instance_attributes-migration_transport"/>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (    unpack.c:108   )   debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++         </instance_attributes>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (    unpack.c:113   )   debug: unpack_config: 	Default stickiness: 0
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (    unpack.c:156   )  notice: unpack_config: 	On loss of CCM Quorum: Ignore
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++         <operations>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (    unpack.c:196   )   debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++           <op name="start" interval="0s" timeout="120s" on-fail="restart" id="prmVM2-start-0s"/>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (    unpack.c:413   ) warning: handle_startup_fencing: 	Blind faith: not fencing unseen nodes
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (    unpack.c:504   )   debug: unpack_domains: 	Unpacking domains
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++           <op name="monitor" interval="10s" timeout="30s" on-fail="restart" id="prmVM2-monitor-10s"/>
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++           <op name="stop" interval="0s" timeout="120s" on-fail="block" id="prmVM2-stop-0s"/>
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++           <op name="migrate_to" interval="0s" timeout="120s" on-fail="restart" id="prmVM2-migrate_to-0s"/>
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++           <op name="migrate_from" interval="0s" timeout="120s" on-fail="restart" id="prmVM2-migrate_from-0s"/>
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++         </operations>
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++       </primitive>
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++       <clone id="clnPing">
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++         <primitive id="prmPing" class="ocf" provider="pacemaker" type="ping">
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++           <instance_attributes id="prmPing-instance_attributes">
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++             <nvpair name="name" value="default_ping_set" id="prmPing-instance_attributes-name"/>
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++             <nvpair name="host_list" value="192.168.201.254" id="prmPing-instance_attributes-host_list"/>
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++             <nvpair name="multiplier" value="100" id="prmPing-instance_attributes-multiplier"/>
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++             <nvpair name="attempts" value="2" id="prmPing-instance_attributes-attempts"/>
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++             <nvpair name="timeout" value="2" id="prmPing-instance_attributes-timeout"/>
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++           </instance_attributes>
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++           <operations>
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++             <op name="start" interval="0s" timeout="60s" on-fail="restart" id="prmPing-start-0s"/>
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++             <op name="monitor" interval="10s" timeout="60s" on-fail="restart" id="prmPing-monitor-10s"/>
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++             <op name="stop" interval="0s" timeout="60s" on-fail="ignore" id="prmPing-stop-0s"/>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  election.c:359   )   debug: election_count_vote: 	Created voted hash
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++           </operations>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  election.c:492   )   debug: election_count_vote: 	Election 3 (current: 3, owner: 3232261592): Processed vote from bl460g1n6 (Recorded)
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++         </primitive>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  election.c:304   )   debug: election_check: 	Still waiting on 1 non-votes (2 total)
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++       </clone>
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++       <rsc_location id="l2" rsc="prmVM2">
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++         <rule score="200" id="l2-rule">
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++           <expression attribute="#uname" operation="eq" value="bl460g1n6" id="l2-expression"/>
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++         </rule>
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++         <rule score="100" id="l2-rule-0">
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++           <expression attribute="#uname" operation="eq" value="bl460g1n7" id="l2-expression-0"/>
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++         </rule>
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++         <rule score="-INFINITY" boolean-op="or" id="l2-rule-1">
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++           <expression operation="not_defined" attribute="default_ping_set" id="l2-expression-1"/>
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++           <expression attribute="default_ping_set" operation="lt" value="100" id="l2-expression-2"/>
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++         </rule>
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++       </rsc_location>
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++       <rsc_colocation id="c4" score="INFINITY" rsc="prmVM2" with-rsc="clnPing"/>
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++       <rsc_order id="o4" score="0" first="clnPing" then="prmVM2" symmetrical="false"/>
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++     <rsc_defaults>
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++       <meta_attributes id="rsc-options">
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++         <nvpair name="resource-stickiness" value="INFINITY" id="rsc-options-resource-stickiness"/>
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++         <nvpair name="migration-threshold" value="1" id="rsc-options-migration-threshold"/>
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++       </meta_attributes>
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++     </rsc_defaults>
Jan 15 15:38:46 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_replace operation for section 'all': OK (rc=0, origin=local/cibadmin/2, version=0.5.1)
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  election.c:492   )   debug: election_count_vote: 	Election 3 (current: 3, owner: 3232261592): Processed no-vote from bl460g1n7 (Recorded)
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  election.c:47    )    info: election_complete: 	Election election-0 complete
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   control.c:60    )    info: election_timeout_popped: 	Election failed: Declaring ourselves the winner
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (       fsa.c:193   )   debug: s_crmd_fsa: 	Processing I_ELECTION_DC: [ state=S_ELECTION cause=C_TIMER_POPPED origin=election_timeout_popped ]
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (      misc.c:47    )    info: do_log: 	FSA: Input I_ELECTION_DC from election_timeout_popped() received in state S_ELECTION
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (       fsa.c:502   )  notice: do_state_transition: 	State transition S_ELECTION -> S_INTEGRATION [ input=I_ELECTION_DC cause=C_TIMER_POPPED origin=election_timeout_popped ]
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   tengine.c:98    )   debug: do_te_control: 	The transitioner is already active
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (     utils.c:192   )   debug: crm_timer_start: 	Started Integration Timer (I_INTEGRATED:180000ms), src=49
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  election.c:178   )    info: do_dc_takeover: 	Taking over DC status for this partition
Jan 15 15:38:46 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crmd/30, version=0.5.1)
Jan 15 15:38:46 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section crm_config: OK (rc=0, origin=local/crmd/31, version=0.5.1)
Jan 15 15:38:46 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section nodes: OK (rc=0, origin=local/crmd/32, version=0.5.1)
Jan 15 15:38:46 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/crmd/33, version=0.5.1)
Jan 15 15:38:46 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/attrd/6, version=0.5.1)
Jan 15 15:38:46 [30793] bl460g1n6      attrd: (  commands.c:590   )    info: attrd_cib_callback: 	Update 6 for shutdown: OK (0)
Jan 15 15:38:46 [30793] bl460g1n6      attrd: (  commands.c:594   )  notice: attrd_cib_callback: 	Update 6 for shutdown[bl460g1n6]=0: OK (0)
Jan 15 15:38:46 [30793] bl460g1n6      attrd: (  commands.c:594   )  notice: attrd_cib_callback: 	Update 6 for shutdown[bl460g1n7]=0: OK (0)
Jan 15 15:38:46 [30790] bl460g1n6        cib: (   cib_ops.c:368   )   debug: cib_process_modify: 	Destroying /cib/status/node_state[1]/transient_attributes/instance_attributes/nvpair[3]
Jan 15 15:38:46 [30790] bl460g1n6        cib: (   cib_ops.c:368   )   debug: cib_process_modify: 	Destroying /cib/status/node_state[2]/transient_attributes/instance_attributes/nvpair[3]
Jan 15 15:38:46 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/attrd/7, version=0.5.1)
Jan 15 15:38:46 [30793] bl460g1n6      attrd: (  commands.c:590   )    info: attrd_cib_callback: 	Update 7 for terminate: OK (0)
Jan 15 15:38:46 [30793] bl460g1n6      attrd: (  commands.c:594   )  notice: attrd_cib_callback: 	Update 7 for terminate[bl460g1n6]=(null): OK (0)
Jan 15 15:38:46 [30793] bl460g1n6      attrd: (  commands.c:594   )  notice: attrd_cib_callback: 	Update 7 for terminate[bl460g1n7]=(null): OK (0)
Jan 15 15:38:46 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/attrd/8, version=0.5.1)
Jan 15 15:38:46 [30793] bl460g1n6      attrd: (  commands.c:590   )    info: attrd_cib_callback: 	Update 8 for probe_complete: OK (0)
Jan 15 15:38:46 [30793] bl460g1n6      attrd: (  commands.c:594   )  notice: attrd_cib_callback: 	Update 8 for probe_complete[bl460g1n6]=true: OK (0)
Jan 15 15:38:46 [30793] bl460g1n6      attrd: (  commands.c:594   )  notice: attrd_cib_callback: 	Update 8 for probe_complete[bl460g1n7]=true: OK (0)
Jan 15 15:38:46 [30790] bl460g1n6        cib: (  messages.c:167   )   debug: cib_process_readwrite: 	We are still in R/W mode
Jan 15 15:38:46 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_master operation for section 'all': OK (rc=0, origin=local/crmd/34, version=0.5.1)
Jan 15 15:38:46 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section cib: OK (rc=0, origin=local/crmd/35, version=0.5.1)
Jan 15 15:38:46 [30790] bl460g1n6        cib: (   cib_ops.c:905   )   debug: cib_process_xpath: 	cib_query: //cib/configuration/crm_config//cluster_property_set//nvpair[@name='dc-version'] does not exist
Jan 15 15:38:46 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section //cib/configuration/crm_config//cluster_property_set//nvpair[@name='dc-version']: No such device or address (rc=-6, origin=local/crmd/36, version=0.5.1)
Jan 15 15:38:46 [30790] bl460g1n6        cib: (      ipcs.c:757   )   debug: qb_ipcs_dispatch_connection_request: 	HUP conn (30790-30836-14)
Jan 15 15:38:46 [30790] bl460g1n6        cib: (      ipcs.c:605   )   debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(30790-30836-14) state:2
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       ipc.c:368   )    info: crm_client_destroy: 	Destroying 0 events
Jan 15 15:38:46 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-30790-30836-14-header
Jan 15 15:38:46 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-30790-30836-14-header
Jan 15 15:38:46 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-30790-30836-14-header
Jan 15 15:38:46 [30790] bl460g1n6        cib: (        io.c:596   )   debug: activateCibXml: 	Triggering CIB write for cib_modify op
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:174   )   debug: log_cib_diff: 	Config update: Local-only Change: 0.6.1
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-- <cib admin_epoch="0" epoch="5" num_updates="1"/>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  <cib epoch="6" num_updates="1" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.8" cib-last-written="Wed Jan 15 15:38:46 2014" update-origin="bl460g1n6" update-client="crmd" have-quorum="1" dc-uuid="3232261592">
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    <configuration>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      <crm_config>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        <cluster_property_set id="cib-bootstrap-options">
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++         <nvpair id="cib-bootstrap-options-dc-version" name="dc-version" value="1.1.11-0.27.b48276b.git.el6-b48276b"/>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        </cluster_property_set>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      </crm_config>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    </configuration>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  </cib>
Jan 15 15:38:46 [30790] bl460g1n6        cib: ( cib_utils.c:174   )  notice: log_cib_diff: 	cib:diff: Local-only Change: 0.6.1
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1496  )  notice: cib:diff: 	-- <cib admin_epoch="0" epoch="5" num_updates="1"/>
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++         <nvpair id="cib-bootstrap-options-dc-version" name="dc-version" value="1.1.11-0.27.b48276b.git.el6-b48276b"/>
Jan 15 15:38:46 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section crm_config: OK (rc=0, origin=local/crmd/37, version=0.6.1)
Jan 15 15:38:46 [30790] bl460g1n6        cib: (   cib_ops.c:905   )   debug: cib_process_xpath: 	cib_query: //cib/configuration/crm_config//cluster_property_set//nvpair[@name='cluster-infrastructure'] does not exist
Jan 15 15:38:46 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section //cib/configuration/crm_config//cluster_property_set//nvpair[@name='cluster-infrastructure']: No such device or address (rc=-6, origin=local/crmd/38, version=0.6.1)
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:82    )   debug: initialize_join: 	join-3: Initializing join data (flag=true)
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:61    )    info: crm_update_peer_join: 	initialize_join: Node bl460g1n7[3232261593] - join-3 phase 4 -> 0
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:61    )    info: crm_update_peer_join: 	initialize_join: Node bl460g1n6[3232261592] - join-3 phase 4 -> 0
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:140   )    info: join_make_offer: 	join-3: Sending offer to bl460g1n7
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:66    )    info: crm_update_peer_join: 	join_make_offer: Node bl460g1n7[3232261593] - join-3 phase 0 -> 1
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:140   )    info: join_make_offer: 	join-3: Sending offer to bl460g1n6
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:66    )    info: crm_update_peer_join: 	join_make_offer: Node bl460g1n6[3232261592] - join-3 phase 0 -> 1
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:173   )    info: do_dc_join_offer_all: 	join-3: Waiting on 2 outstanding join acks
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (       fsa.c:193   )   debug: s_crmd_fsa: 	Processing I_ELECTION_DC: [ state=S_INTEGRATION cause=C_FSA_INTERNAL origin=do_election_check ]
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (      misc.c:47    ) warning: do_log: 	FSA: Input I_ELECTION_DC from do_election_check() received in state S_INTEGRATION
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  election.c:249   )   debug: election_vote: 	Started election 4
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:82    )   debug: initialize_join: 	join-4: Initializing join data (flag=true)
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:61    )    info: crm_update_peer_join: 	initialize_join: Node bl460g1n7[3232261593] - join-4 phase 1 -> 0
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:61    )    info: crm_update_peer_join: 	initialize_join: Node bl460g1n6[3232261592] - join-4 phase 1 -> 0
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:140   )    info: join_make_offer: 	join-4: Sending offer to bl460g1n7
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:66    )    info: crm_update_peer_join: 	join_make_offer: Node bl460g1n7[3232261593] - join-4 phase 0 -> 1
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:140   )    info: join_make_offer: 	join-4: Sending offer to bl460g1n6
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:66    )    info: crm_update_peer_join: 	join_make_offer: Node bl460g1n6[3232261592] - join-4 phase 0 -> 1
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:173   )    info: do_dc_join_offer_all: 	join-4: Waiting on 2 outstanding join acks
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   pengine.c:260   )   debug: do_pe_invoke_callback: 	Discarding PE request in state: S_INTEGRATION
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   control.c:920   )   debug: config_query_callback: 	Call 31 : Parsing CIB options
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   control.c:944   )   debug: config_query_callback: 	Shutdown escalation occurs after: 1200000ms
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   control.c:951   )   debug: config_query_callback: 	Checking for expired actions every 900000ms
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  messages.c:729   )   debug: handle_request: 	Raising I_JOIN_OFFER: join-3
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (       fsa.c:193   )   debug: s_crmd_fsa: 	Processing I_JOIN_OFFER: [ state=S_INTEGRATION cause=C_HA_MESSAGE origin=route_message ]
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (     utils.c:981   )    info: update_dc: 	Set DC to bl460g1n6 (3.0.8)
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (join_client.:135   )   debug: do_cl_join_offer_respond: 	do_cl_join_offer_respond added action A_DC_TIMER_STOP to the FSA
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  election.c:359   )   debug: election_count_vote: 	Created voted hash
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  election.c:492   )   debug: election_count_vote: 	Election 4 (current: 4, owner: 3232261592): Processed vote from bl460g1n6 (Recorded)
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  election.c:91    )   debug: do_election_check: 	Ignore election check: we are not in an election
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (       fsa.c:193   )   debug: s_crmd_fsa: 	Processing I_JOIN_REQUEST: [ state=S_INTEGRATION cause=C_HA_MESSAGE origin=route_message ]
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:282   )   debug: do_dc_join_filter_offer: 	Processing req from bl460g1n7
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:303   )   debug: do_dc_join_filter_offer: 	Invalid response from bl460g1n7: join-3 vs. join-4
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:579   )   debug: check_join_state: 	Invoked by do_dc_join_filter_offer in state: S_INTEGRATION
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  messages.c:729   )   debug: handle_request: 	Raising I_JOIN_OFFER: join-4
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (       fsa.c:193   )   debug: s_crmd_fsa: 	Processing I_JOIN_OFFER: [ state=S_INTEGRATION cause=C_HA_MESSAGE origin=route_message ]
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (join_client.:135   )   debug: do_cl_join_offer_respond: 	do_cl_join_offer_respond added action A_DC_TIMER_STOP to the FSA
Jan 15 15:38:46 [30790] bl460g1n6        cib: (        io.c:596   )   debug: activateCibXml: 	Triggering CIB write for cib_modify op
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  election.c:492   )   debug: election_count_vote: 	Election 4 (current: 4, owner: 3232261592): Processed no-vote from bl460g1n7 (Recorded)
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  election.c:91    )   debug: do_election_check: 	Ignore election check: we are not in an election
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (       fsa.c:193   )   debug: s_crmd_fsa: 	Processing I_JOIN_REQUEST: [ state=S_INTEGRATION cause=C_HA_MESSAGE origin=route_message ]
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:282   )   debug: do_dc_join_filter_offer: 	Processing req from bl460g1n7
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:174   )   debug: log_cib_diff: 	Config update: Local-only Change: 0.7.1
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:341   )   debug: do_dc_join_filter_offer: 	join-4: Welcoming node bl460g1n7 (ref join_request-crmd-1389767926-12)
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-- <cib admin_epoch="0" epoch="6" num_updates="1"/>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  <cib epoch="7" num_updates="1" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.8" cib-last-written="Wed Jan 15 15:38:46 2014" update-origin="bl460g1n6" update-client="crmd" have-quorum="1" dc-uuid="3232261592">
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    <configuration>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      <crm_config>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        <cluster_property_set id="cib-bootstrap-options">
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:66    )    info: crm_update_peer_join: 	do_dc_join_filter_offer: Node bl460g1n7[3232261593] - join-4 phase 1 -> 2
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++         <nvpair id="cib-bootstrap-options-cluster-infrastructure" name="cluster-infrastructure" value="corosync"/>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:348   )   debug: do_dc_join_filter_offer: 	1 nodes have been integrated into join-4
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        </cluster_property_set>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:579   )   debug: check_join_state: 	Invoked by do_dc_join_filter_offer in state: S_INTEGRATION
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      </crm_config>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:354   )   debug: do_dc_join_filter_offer: 	join-4: Still waiting on 1 outstanding offers
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    </configuration>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  </cib>
Jan 15 15:38:46 [30790] bl460g1n6        cib: ( cib_utils.c:174   )  notice: log_cib_diff: 	cib:diff: Local-only Change: 0.7.1
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1496  )  notice: cib:diff: 	-- <cib admin_epoch="0" epoch="6" num_updates="1"/>
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++         <nvpair id="cib-bootstrap-options-cluster-infrastructure" name="cluster-infrastructure" value="corosync"/>
Jan 15 15:38:46 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section crm_config: OK (rc=0, origin=local/crmd/39, version=0.7.1)
Jan 15 15:38:46 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section crm_config: OK (rc=0, origin=local/crmd/40, version=0.7.1)
Jan 15 15:38:46 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section crm_config: OK (rc=0, origin=local/crmd/41, version=0.7.1)
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   control.c:920   )   debug: config_query_callback: 	Call 40 : Parsing CIB options
Jan 15 15:38:46 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crmd/42, version=0.7.1)
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   control.c:944   )   debug: config_query_callback: 	Shutdown escalation occurs after: 1200000ms
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   control.c:951   )   debug: config_query_callback: 	Checking for expired actions every 900000ms
Jan 15 15:38:46 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crmd/43, version=0.7.1)
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   control.c:920   )   debug: config_query_callback: 	Call 41 : Parsing CIB options
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   control.c:944   )   debug: config_query_callback: 	Shutdown escalation occurs after: 1200000ms
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   control.c:951   )   debug: config_query_callback: 	Checking for expired actions every 900000ms
Jan 15 15:38:46 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section crm_config: OK (rc=0, origin=local/crmd/44, version=0.7.1)
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (join_client.:157   )   debug: join_query_callback: 	Respond to join offer join-4
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (join_client.:158   )   debug: join_query_callback: 	Acknowledging bl460g1n6 as our DC
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   control.c:920   )   debug: config_query_callback: 	Call 44 : Parsing CIB options
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   control.c:944   )   debug: config_query_callback: 	Shutdown escalation occurs after: 1200000ms
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   control.c:951   )   debug: config_query_callback: 	Checking for expired actions every 900000ms
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (       fsa.c:193   )   debug: s_crmd_fsa: 	Processing I_JOIN_REQUEST: [ state=S_INTEGRATION cause=C_HA_MESSAGE origin=route_message ]
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:282   )   debug: do_dc_join_filter_offer: 	Processing req from bl460g1n6
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:321   )   debug: do_dc_join_filter_offer: 	bl460g1n6 has a better generation number than the current max bl460g1n7
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:323   )   debug: do_dc_join_filter_offer: 	Max generation   <generation_tuple epoch="6" num_updates="1" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.8" cib-last-written="Wed Jan 15 15:38:46 2014" update-origin="bl460g1n6" update-client="crmd" have-quorum="1" dc-uuid="3232261592"/>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:325   )   debug: do_dc_join_filter_offer: 	Their generation   <generation_tuple epoch="7" num_updates="1" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.8" cib-last-written="Wed Jan 15 15:38:46 2014" update-origin="bl460g1n6" update-client="crmd" have-quorum="1" dc-uuid="3232261592"/>
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:341   )   debug: do_dc_join_filter_offer: 	join-4: Welcoming node bl460g1n6 (ref join_request-crmd-1389767926-25)
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:66    )    info: crm_update_peer_join: 	do_dc_join_filter_offer: Node bl460g1n6[3232261592] - join-4 phase 1 -> 2
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:348   )   debug: do_dc_join_filter_offer: 	2 nodes have been integrated into join-4
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:579   )   debug: check_join_state: 	Invoked by do_dc_join_filter_offer in state: S_INTEGRATION
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:593   )   debug: check_join_state: 	join-4: Integration of 2 peers complete: do_dc_join_filter_offer
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (       fsa.c:193   )   debug: s_crmd_fsa: 	Processing I_INTEGRATED: [ state=S_INTEGRATION cause=C_FSA_INTERNAL origin=check_join_state ]
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (       fsa.c:502   )    info: do_state_transition: 	State transition S_INTEGRATION -> S_FINALIZE_JOIN [ input=I_INTEGRATED cause=C_FSA_INTERNAL origin=check_join_state ]
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (       fsa.c:590   )   debug: do_state_transition: 	All 2 cluster nodes responded to the join offer.
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (     utils.c:192   )   debug: crm_timer_start: 	Started Finalization Timer (I_ELECTION:1800000ms), src=57
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:372   )   debug: do_dc_join_finalize: 	Finalizing join-4 for 2 clients
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:682   )    info: crmd_join_phase_log: 	join-4: bl460g1n7=integrated
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:682   )    info: crmd_join_phase_log: 	join-4: bl460g1n6=integrated
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:410   )    info: do_dc_join_finalize: 	join-4: Syncing our CIB to the rest of the cluster
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:411   )   debug: do_dc_join_finalize: 	Requested version   <generation_tuple epoch="7" num_updates="1" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.8" cib-last-written="Wed Jan 15 15:38:46 2014" update-origin="bl460g1n6" update-client="crmd" have-quorum="1" dc-uuid="3232261592"/>
Jan 15 15:38:46 [30790] bl460g1n6        cib: (  messages.c:435   )   debug: sync_our_cib: 	Syncing CIB to all peers
Jan 15 15:38:46 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_sync operation for section 'all': OK (rc=0, origin=local/crmd/45, version=0.7.1)
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:579   )   debug: check_join_state: 	Invoked by finalize_sync_callback in state: S_FINALIZE_JOIN
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:610   )   debug: check_join_state: 	join-4: Still waiting on 2 integrated nodes
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:682   )   debug: crmd_join_phase_log: 	join-4: bl460g1n7=integrated
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:682   )   debug: crmd_join_phase_log: 	join-4: bl460g1n6=integrated
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:438   )   debug: finalize_sync_callback: 	Notifying 2 clients of join-4 results
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:562   )   debug: finalize_join_for: 	join-4: ACK'ing join request from bl460g1n7
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:66    )    info: crm_update_peer_join: 	finalize_join_for: Node bl460g1n7[3232261593] - join-4 phase 2 -> 3
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:562   )   debug: finalize_join_for: 	join-4: ACK'ing join request from bl460g1n6
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:66    )    info: crm_update_peer_join: 	finalize_join_for: Node bl460g1n6[3232261592] - join-4 phase 2 -> 3
Jan 15 15:38:46 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section nodes: OK (rc=0, origin=local/crmd/46, version=0.7.1)
Jan 15 15:38:46 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section nodes: OK (rc=0, origin=local/crmd/47, version=0.7.1)
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       ipc.c:334   )    info: crm_client_new: 	Connecting 0x19816b0 for uid=0 gid=0 pid=30837 id=655c30c1-9783-498b-bb07-b47b8bb605de
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  messages.c:733   )   debug: handle_request: 	Raising I_JOIN_RESULT: join-4
Jan 15 15:38:46 [30790] bl460g1n6        cib: ( ipc_setup.c:484   )   debug: handle_new_connection: 	IPC credentials authenticated (30790-30837-14)
Jan 15 15:38:46 [30790] bl460g1n6        cib: (   ipc_shm.c:295   )   debug: qb_ipcs_shm_connect: 	connecting to client [30837]
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (       fsa.c:193   )   debug: s_crmd_fsa: 	Processing I_JOIN_RESULT: [ state=S_FINALIZE_JOIN cause=C_HA_MESSAGE origin=route_message ]
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (join_client.:231   )   debug: do_cl_join_finalize_respond: 	Confirming join join-4: join_ack_nack
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (join_client.:240   )   debug: do_cl_join_finalize_respond: 	join-4: Join complete.  Sending local LRM status to bl460g1n6
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:482   )   debug: do_dc_join_ack: 	Ignoring op=join_ack_nack message from bl460g1n6
Jan 15 15:38:46 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (       fsa.c:193   )   debug: s_crmd_fsa: 	Processing I_JOIN_RESULT: [ state=S_FINALIZE_JOIN cause=C_HA_MESSAGE origin=route_message ]
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:66    )    info: crm_update_peer_join: 	do_dc_join_ack: Node bl460g1n6[3232261592] - join-4 phase 3 -> 4
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:504   )    info: do_dc_join_ack: 	join-4: Updating node state to member for bl460g1n6
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (     utils.c:1011  )    info: erase_status_tag: 	Deleting xpath: //node_state[@uname='bl460g1n6']/lrm
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:514   )   debug: do_dc_join_ack: 	join-4: Registered callback for LRM update 49
Jan 15 15:38:46 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:46 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:46 [30790] bl460g1n6        cib: (   cib_ops.c:923   )   debug: cib_process_xpath: 	Processing cib_delete op for //node_state[@uname='bl460g1n6']/lrm (/cib/status/node_state[1]/lrm)
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:167   )   debug: Config update: 	Diff: --- 0.7.1
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:169   )   debug: Config update: 	Diff: +++ 0.7.2 1aa0432f52141570abe870c06137ebdd
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-  <cib num_updates="1">
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-    <status>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-      <node_state id="3232261592">
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	--       <lrm id="3232261592">
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	--         <lrm_resources/>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	--       </lrm>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-      </node_state>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-    </status>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-  </cib>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++ <cib epoch="7" num_updates="2" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.8" cib-last-written="Wed Jan 15 15:38:46 2014" update-origin="bl460g1n6" update-client="crmd" have-quorum="1" dc-uuid="3232261592"/>
Jan 15 15:38:46 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_delete operation for section //node_state[@uname='bl460g1n6']/lrm: OK (rc=0, origin=local/crmd/48, version=0.7.2)
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:167   )   debug: Config update: 	Diff: --- 0.7.2
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:169   )   debug: Config update: 	Diff: +++ 0.7.3 68ea7919887f0d3104771c9c0ea71a0d
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-- <cib num_updates="2"/>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  <cib epoch="7" num_updates="3" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.8" cib-last-written="Wed Jan 15 15:38:46 2014" update-origin="bl460g1n6" update-client="crmd" have-quorum="1" dc-uuid="3232261592">
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    <status>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      <node_state id="3232261592" uname="bl460g1n6" in_ccm="true" crmd="online" crm-debug-origin="do_lrm_query_internal" join="member" expected="member">
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++       <lrm id="3232261592">
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++         <lrm_resources/>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++       </lrm>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      </node_state>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    </status>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  </cib>
Jan 15 15:38:46 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/crmd/49, version=0.7.3)
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (     utils.c:998   )   debug: erase_xpath_callback: 	Deletion of "//node_state[@uname='bl460g1n6']/lrm": OK (rc=0)
Jan 15 15:38:46 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/cibadmin/2, version=0.7.3)
Jan 15 15:38:46 [30790] bl460g1n6        cib: (      ipcs.c:757   )   debug: qb_ipcs_dispatch_connection_request: 	HUP conn (30790-30837-14)
Jan 15 15:38:46 [30790] bl460g1n6        cib: (      ipcs.c:605   )   debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(30790-30837-14) state:2
Jan 15 15:38:46 [30790] bl460g1n6        cib: (       ipc.c:368   )    info: crm_client_destroy: 	Destroying 0 events
Jan 15 15:38:46 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-30790-30837-14-header
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:456   )   debug: join_update_complete_callback: 	Join update 49 complete
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:579   )   debug: check_join_state: 	Invoked by join_update_complete_callback in state: S_FINALIZE_JOIN
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:615   )   debug: check_join_state: 	join-4: Still waiting on 1 finalized nodes
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:682   )   debug: crmd_join_phase_log: 	join-4: bl460g1n7=finalized
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:682   )   debug: crmd_join_phase_log: 	join-4: bl460g1n6=confirmed
Jan 15 15:38:46 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-30790-30837-14-header
Jan 15 15:38:46 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-30790-30837-14-header
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (       fsa.c:193   )   debug: s_crmd_fsa: 	Processing I_JOIN_RESULT: [ state=S_FINALIZE_JOIN cause=C_HA_MESSAGE origin=route_message ]
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:66    )    info: crm_update_peer_join: 	do_dc_join_ack: Node bl460g1n7[3232261593] - join-4 phase 3 -> 4
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:504   )    info: do_dc_join_ack: 	join-4: Updating node state to member for bl460g1n7
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (     utils.c:1011  )    info: erase_status_tag: 	Deleting xpath: //node_state[@uname='bl460g1n7']/lrm
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:514   )   debug: do_dc_join_ack: 	join-4: Registered callback for LRM update 51
Jan 15 15:38:46 [30790] bl460g1n6        cib: (   cib_ops.c:923   )   debug: cib_process_xpath: 	Processing cib_delete op for //node_state[@uname='bl460g1n7']/lrm (/cib/status/node_state[2]/lrm)
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:167   )   debug: Config update: 	Diff: --- 0.7.3
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:169   )   debug: Config update: 	Diff: +++ 0.7.4 742b521cb3d73a2151cd16739c0ed8ef
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-  <cib num_updates="3">
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-    <status>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-      <node_state id="3232261593">
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	--       <lrm id="3232261593">
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	--         <lrm_resources/>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	--       </lrm>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-      </node_state>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-    </status>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-  </cib>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++ <cib epoch="7" num_updates="4" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.8" cib-last-written="Wed Jan 15 15:38:46 2014" update-origin="bl460g1n6" update-client="crmd" have-quorum="1" dc-uuid="3232261592"/>
Jan 15 15:38:46 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_delete operation for section //node_state[@uname='bl460g1n7']/lrm: OK (rc=0, origin=local/crmd/50, version=0.7.4)
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:167   )   debug: Config update: 	Diff: --- 0.7.4
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:169   )   debug: Config update: 	Diff: +++ 0.7.5 23058e3f198b7b5b17f898b3a8863ee6
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-- <cib num_updates="4"/>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  <cib epoch="7" num_updates="5" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.8" cib-last-written="Wed Jan 15 15:38:46 2014" update-origin="bl460g1n6" update-client="crmd" have-quorum="1" dc-uuid="3232261592">
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    <status>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      <node_state id="3232261593" uname="bl460g1n7" in_ccm="true" crmd="online" crm-debug-origin="do_lrm_query_internal" join="member" expected="member">
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++       <lrm id="3232261593">
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++         <lrm_resources/>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++       </lrm>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      </node_state>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    </status>
Jan 15 15:38:46 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  </cib>
Jan 15 15:38:46 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/crmd/51, version=0.7.5)
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (     utils.c:998   )   debug: erase_xpath_callback: 	Deletion of "//node_state[@uname='bl460g1n7']/lrm": OK (rc=0)
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:456   )   debug: join_update_complete_callback: 	Join update 51 complete
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:579   )   debug: check_join_state: 	Invoked by join_update_complete_callback in state: S_FINALIZE_JOIN
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:619   )   debug: check_join_state: 	join-4 complete: join_update_complete_callback
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (       fsa.c:193   )   debug: s_crmd_fsa: 	Processing I_FINALIZED: [ state=S_FINALIZE_JOIN cause=C_FSA_INTERNAL origin=check_join_state ]
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (       fsa.c:502   )    info: do_state_transition: 	State transition S_FINALIZE_JOIN -> S_POLICY_ENGINE [ input=I_FINALIZED cause=C_FSA_INTERNAL origin=check_join_state ]
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (       fsa.c:610   )   debug: do_state_transition: 	All 2 cluster nodes are eligible to run resources.
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   join_dc.c:634   )   debug: do_dc_join_final: 	Ensuring DC, quorum and node attributes are up-to-date
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (     utils.c:2023  )   debug: attrd_update_delegate: 	Sent update: (null)=(null) for localhost
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (membership.c:317   )   debug: crm_update_quorum: 	Updating quorum status to true (call=54)
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (   tengine.c:150   )   debug: do_te_invoke: 	Cancelling the transition: inactive
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (  te_utils.c:431   )    info: abort_transition_graph: 	do_te_invoke:151 - Triggered transition abort (complete=1) : Peer Cancelled
Jan 15 15:38:46 [30795] bl460g1n6       crmd: (     utils.c:192   )   debug: crm_timer_start: 	Started New Transition Timer (I_PE_CALC:2000ms), src=66
Jan 15 15:38:46 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section nodes: OK (rc=0, origin=local/crmd/52, version=0.7.5)
Jan 15 15:38:46 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/crmd/53, version=0.7.5)
Jan 15 15:38:46 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section cib: OK (rc=0, origin=local/crmd/54, version=0.7.5)
Jan 15 15:38:46 [30790] bl460g1n6        cib: (        io.c:738   )    info: write_cib_contents: 	Archived previous version as /var/lib/pacemaker/cib/cib-2.raw
Jan 15 15:38:46 [30790] bl460g1n6        cib: (        io.c:748   )   debug: write_cib_contents: 	Writing CIB to disk
Jan 15 15:38:47 [30790] bl460g1n6        cib: (        io.c:773   )    info: write_cib_contents: 	Wrote version 0.6.0 of the CIB to disk (digest: 56cebba7f790f44312b7b5bc366984aa)
Jan 15 15:38:47 [30790] bl460g1n6        cib: (        io.c:781   )   debug: write_cib_contents: 	Wrote digest 56cebba7f790f44312b7b5bc366984aa to disk
Jan 15 15:38:47 [30790] bl460g1n6        cib: (        io.c:259   )    info: retrieveCib: 	Reading cluster configuration from: /var/lib/pacemaker/cib/cib.etB44w (digest: /var/lib/pacemaker/cib/cib.YvRWmN)
Jan 15 15:38:47 [30790] bl460g1n6        cib: (        io.c:786   )   debug: write_cib_contents: 	Activating /var/lib/pacemaker/cib/cib.etB44w
Jan 15 15:38:47 [30790] bl460g1n6        cib: (        io.c:738   )    info: write_cib_contents: 	Archived previous version as /var/lib/pacemaker/cib/cib-3.raw
Jan 15 15:38:47 [30790] bl460g1n6        cib: (        io.c:748   )   debug: write_cib_contents: 	Writing CIB to disk
Jan 15 15:38:47 [30790] bl460g1n6        cib: (        io.c:773   )    info: write_cib_contents: 	Wrote version 0.7.0 of the CIB to disk (digest: 5a361672811793a573b8a0a00219f249)
Jan 15 15:38:47 [30790] bl460g1n6        cib: (        io.c:781   )   debug: write_cib_contents: 	Wrote digest 5a361672811793a573b8a0a00219f249 to disk
Jan 15 15:38:47 [30790] bl460g1n6        cib: (        io.c:259   )    info: retrieveCib: 	Reading cluster configuration from: /var/lib/pacemaker/cib/cib.csxy1x (digest: /var/lib/pacemaker/cib/cib.KXaXFO)
Jan 15 15:38:47 [30790] bl460g1n6        cib: (        io.c:786   )   debug: write_cib_contents: 	Activating /var/lib/pacemaker/cib/cib.csxy1x
Jan 15 15:38:48 [30795] bl460g1n6       crmd: (     utils.c:120   )    info: crm_timer_popped: 	New Transition Timer (I_PE_CALC) just popped (2000ms)
Jan 15 15:38:48 [30795] bl460g1n6       crmd: (       fsa.c:193   )   debug: s_crmd_fsa: 	Processing I_PE_CALC: [ state=S_POLICY_ENGINE cause=C_TIMER_POPPED origin=crm_timer_popped ]
Jan 15 15:38:48 [30795] bl460g1n6       crmd: (   pengine.c:231   )   debug: do_pe_invoke: 	Query 55: Requesting the current CIB: S_POLICY_ENGINE
Jan 15 15:38:48 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crmd/55, version=0.7.5)
Jan 15 15:38:48 [30795] bl460g1n6       crmd: (   pengine.c:299   )   debug: do_pe_invoke_callback: 	Invoking the PE: query=55, ref=pe_calc-dc-1389767928-29, seq=8, quorate=1
Jan 15 15:38:48 [30794] bl460g1n6    pengine: (    unpack.c:93    )   debug: unpack_config: 	STONITH timeout: 60000
Jan 15 15:38:48 [30794] bl460g1n6    pengine: (    unpack.c:97    )   debug: unpack_config: 	STONITH of failed nodes is disabled
Jan 15 15:38:48 [30794] bl460g1n6    pengine: (    unpack.c:104   )   debug: unpack_config: 	Stop all active resources: false
Jan 15 15:38:48 [30794] bl460g1n6    pengine: (    unpack.c:108   )   debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Jan 15 15:38:48 [30794] bl460g1n6    pengine: (    unpack.c:113   )   debug: unpack_config: 	Default stickiness: 0
Jan 15 15:38:48 [30794] bl460g1n6    pengine: (    unpack.c:156   )  notice: unpack_config: 	On loss of CCM Quorum: Ignore
Jan 15 15:38:48 [30794] bl460g1n6    pengine: (    unpack.c:196   )   debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Jan 15 15:38:48 [30794] bl460g1n6    pengine: (    unpack.c:504   )   debug: unpack_domains: 	Unpacking domains
Jan 15 15:38:48 [30794] bl460g1n6    pengine: (    unpack.c:1331  )    info: determine_online_status: 	Node bl460g1n6 is online
Jan 15 15:38:48 [30794] bl460g1n6    pengine: (    unpack.c:1331  )    info: determine_online_status: 	Node bl460g1n7 is online
Jan 15 15:38:48 [30794] bl460g1n6    pengine: (    native.c:446   )    info: native_print: 	prmVM2	(ocf::heartbeat:VirtualDomain):	Stopped 
Jan 15 15:38:48 [30794] bl460g1n6    pengine: (     clone.c:417   )    info: clone_print: 	 Clone Set: clnPing [prmPing]
Jan 15 15:38:48 [30794] bl460g1n6    pengine: (     clone.c:311   )    info: short_print: 	     Stopped: [ bl460g1n6 bl460g1n7 ]
Jan 15 15:38:48 [30794] bl460g1n6    pengine: (     utils.c:386   )   debug: native_assign_node: 	Assigning bl460g1n6 to prmPing:0
Jan 15 15:38:48 [30794] bl460g1n6    pengine: (     utils.c:386   )   debug: native_assign_node: 	Assigning bl460g1n7 to prmPing:1
Jan 15 15:38:48 [30794] bl460g1n6    pengine: (     clone.c:625   )   debug: clone_color: 	Allocated 2 clnPing instances of a possible 2
Jan 15 15:38:48 [30794] bl460g1n6    pengine: (     utils.c:339   )   debug: native_assign_node: 	All nodes for resource prmVM2 are unavailable, unclean or shutting down (bl460g1n6: 1, -1000000)
Jan 15 15:38:48 [30794] bl460g1n6    pengine: (     utils.c:356   )   debug: native_assign_node: 	Could not allocate a node for prmVM2
Jan 15 15:38:48 [30794] bl460g1n6    pengine: (    native.c:564   )    info: native_color: 	Resource prmVM2 cannot run anywhere
Jan 15 15:38:48 [30794] bl460g1n6    pengine: (    native.c:2665  )   debug: native_create_probe: 	Probing prmVM2 on bl460g1n6 (Stopped)
Jan 15 15:38:48 [30794] bl460g1n6    pengine: (    native.c:2665  )   debug: native_create_probe: 	Probing prmPing:0 on bl460g1n6 (Stopped)
Jan 15 15:38:48 [30794] bl460g1n6    pengine: (    native.c:2665  )   debug: native_create_probe: 	Probing prmVM2 on bl460g1n7 (Stopped)
Jan 15 15:38:48 [30794] bl460g1n6    pengine: (    native.c:2665  )   debug: native_create_probe: 	Probing prmPing:1 on bl460g1n7 (Stopped)
Jan 15 15:38:48 [30794] bl460g1n6    pengine: (    native.c:790   )    info: RecurringOp: 	 Start recurring monitor (10s) for prmPing:0 on bl460g1n6
Jan 15 15:38:48 [30794] bl460g1n6    pengine: (    native.c:790   )    info: RecurringOp: 	 Start recurring monitor (10s) for prmPing:1 on bl460g1n7
Jan 15 15:38:48 [30794] bl460g1n6    pengine: (    native.c:2083  )    info: LogActions: 	Leave   prmVM2	(Stopped)
Jan 15 15:38:48 [30794] bl460g1n6    pengine: (    native.c:2238  )  notice: LogActions: 	Start   prmPing:0	(bl460g1n6)
Jan 15 15:38:48 [30794] bl460g1n6    pengine: (    native.c:2238  )  notice: LogActions: 	Start   prmPing:1	(bl460g1n7)
Jan 15 15:38:48 [30794] bl460g1n6    pengine: (     graph.c:769   )   debug: action2xml: 	Using anonymous clone name prmPing for prmPing:0 (aka. (null))
Jan 15 15:38:48 [30794] bl460g1n6    pengine: (     graph.c:769   )   debug: action2xml: 	Using anonymous clone name prmPing for prmPing:0 (aka. (null))
Jan 15 15:38:48 [30794] bl460g1n6    pengine: (     graph.c:769   )   debug: action2xml: 	Using anonymous clone name prmPing for prmPing:0 (aka. (null))
Jan 15 15:38:48 [30794] bl460g1n6    pengine: (     graph.c:769   )   debug: action2xml: 	Using anonymous clone name prmPing for prmPing:1 (aka. (null))
Jan 15 15:38:48 [30794] bl460g1n6    pengine: (     graph.c:769   )   debug: action2xml: 	Using anonymous clone name prmPing for prmPing:1 (aka. (null))
Jan 15 15:38:48 [30794] bl460g1n6    pengine: (     graph.c:769   )   debug: action2xml: 	Using anonymous clone name prmPing for prmPing:1 (aka. (null))
Jan 15 15:38:48 [30794] bl460g1n6    pengine: (   pengine.c:178   )  notice: process_pe_message: 	Calculated Transition 2: /var/lib/pacemaker/pengine/pe-input-2.bz2
Jan 15 15:38:48 [30795] bl460g1n6       crmd: (       fsa.c:193   )   debug: s_crmd_fsa: 	Processing I_PE_SUCCESS: [ state=S_POLICY_ENGINE cause=C_IPC_MESSAGE origin=handle_response ]
Jan 15 15:38:48 [30795] bl460g1n6       crmd: (       fsa.c:502   )    info: do_state_transition: 	State transition S_POLICY_ENGINE -> S_TRANSITION_ENGINE [ input=I_PE_SUCCESS cause=C_IPC_MESSAGE origin=handle_response ]
Jan 15 15:38:48 [30795] bl460g1n6       crmd: (    unpack.c:230   )   debug: unpack_graph: 	Unpacked transition 2: 13 actions in 13 synapses
Jan 15 15:38:48 [30795] bl460g1n6       crmd: (   tengine.c:208   )    info: do_te_invoke: 	Processing graph 2 (ref=pe_calc-dc-1389767928-29) derived from /var/lib/pacemaker/pengine/pe-input-2.bz2
Jan 15 15:38:48 [30795] bl460g1n6       crmd: (te_actions.c:416   )  notice: te_rsc_command: 	Initiating action 7: monitor prmVM2_monitor_0 on bl460g1n7
Jan 15 15:38:48 [30795] bl460g1n6       crmd: (te_actions.c:416   )  notice: te_rsc_command: 	Initiating action 4: monitor prmVM2_monitor_0 on bl460g1n6 (local)
Jan 15 15:38:48 [30792] bl460g1n6       lrmd: (      lrmd.c:1072  )    info: process_lrmd_get_rsc_info: 	Resource 'prmVM2' not found (0 active resources)
Jan 15 15:38:48 [30792] bl460g1n6       lrmd: (      lrmd.c:1313  )   debug: process_lrmd_message: 	Processed lrmd_rsc_info operation from ea951299-8a4d-4fd6-8900-d6588e07ac38: rc=0, reply=0, notify=0, exit=4201864
Jan 15 15:38:48 [30792] bl460g1n6       lrmd: (      lrmd.c:1047  )    info: process_lrmd_rsc_register: 	Added 'prmVM2' to the rsc list (1 active resources)
Jan 15 15:38:48 [30792] bl460g1n6       lrmd: (      lrmd.c:1313  )   debug: process_lrmd_message: 	Processed lrmd_rsc_register operation from ea951299-8a4d-4fd6-8900-d6588e07ac38: rc=0, reply=1, notify=1, exit=4201864
Jan 15 15:38:48 [30792] bl460g1n6       lrmd: (      lrmd.c:1313  )   debug: process_lrmd_message: 	Processed lrmd_rsc_info operation from ea951299-8a4d-4fd6-8900-d6588e07ac38: rc=0, reply=0, notify=0, exit=4201864
Jan 15 15:38:48 [30795] bl460g1n6       crmd: (       lrm.c:1784  )    info: do_lrm_rsc_op: 	Performing key=4:2:7:be72ea63-75a9-4de4-a591-e716f960743b op=prmVM2_monitor_0
Jan 15 15:38:48 [30792] bl460g1n6       lrmd: (      lrmd.c:1313  )   debug: process_lrmd_message: 	Processed lrmd_rsc_exec operation from ea951299-8a4d-4fd6-8900-d6588e07ac38: rc=5, reply=1, notify=0, exit=4201864
Jan 15 15:38:48 [30792] bl460g1n6       lrmd: (      lrmd.c:122   )   debug: log_execute: 	executing - rsc:prmVM2 action:monitor call_id:5
Jan 15 15:38:48 [30795] bl460g1n6       crmd: (te_actions.c:416   )  notice: te_rsc_command: 	Initiating action 5: monitor prmPing:0_monitor_0 on bl460g1n6 (local)
Jan 15 15:38:48 [30792] bl460g1n6       lrmd: (      lrmd.c:1072  )    info: process_lrmd_get_rsc_info: 	Resource 'prmPing' not found (1 active resources)
Jan 15 15:38:48 [30792] bl460g1n6       lrmd: (      lrmd.c:1313  )   debug: process_lrmd_message: 	Processed lrmd_rsc_info operation from ea951299-8a4d-4fd6-8900-d6588e07ac38: rc=0, reply=0, notify=0, exit=4201864
Jan 15 15:38:48 [30792] bl460g1n6       lrmd: (      lrmd.c:1072  )    info: process_lrmd_get_rsc_info: 	Resource 'prmPing:0' not found (1 active resources)
Jan 15 15:38:48 [30792] bl460g1n6       lrmd: (      lrmd.c:1313  )   debug: process_lrmd_message: 	Processed lrmd_rsc_info operation from ea951299-8a4d-4fd6-8900-d6588e07ac38: rc=0, reply=0, notify=0, exit=4201864
Jan 15 15:38:48 [30792] bl460g1n6       lrmd: (      lrmd.c:1047  )    info: process_lrmd_rsc_register: 	Added 'prmPing' to the rsc list (2 active resources)
Jan 15 15:38:48 [30792] bl460g1n6       lrmd: (      lrmd.c:1313  )   debug: process_lrmd_message: 	Processed lrmd_rsc_register operation from ea951299-8a4d-4fd6-8900-d6588e07ac38: rc=0, reply=1, notify=1, exit=4201864
Jan 15 15:38:48 [30792] bl460g1n6       lrmd: (      lrmd.c:1313  )   debug: process_lrmd_message: 	Processed lrmd_rsc_info operation from ea951299-8a4d-4fd6-8900-d6588e07ac38: rc=0, reply=0, notify=0, exit=4201864
Jan 15 15:38:48 [30795] bl460g1n6       crmd: (       lrm.c:1784  )    info: do_lrm_rsc_op: 	Performing key=5:2:7:be72ea63-75a9-4de4-a591-e716f960743b op=prmPing_monitor_0
Jan 15 15:38:48 [30792] bl460g1n6       lrmd: (      lrmd.c:1313  )   debug: process_lrmd_message: 	Processed lrmd_rsc_exec operation from ea951299-8a4d-4fd6-8900-d6588e07ac38: rc=10, reply=1, notify=0, exit=4201864
Jan 15 15:38:48 [30792] bl460g1n6       lrmd: (      lrmd.c:122   )   debug: log_execute: 	executing - rsc:prmPing action:monitor call_id:10
Jan 15 15:38:48 [30795] bl460g1n6       crmd: (te_actions.c:416   )  notice: te_rsc_command: 	Initiating action 8: monitor prmPing:1_monitor_0 on bl460g1n7
Jan 15 15:38:48 [30795] bl460g1n6       crmd: (te_actions.c:55    )   debug: te_pseudo_action: 	Pseudo action 13 fired and confirmed
Jan 15 15:38:48 [30795] bl460g1n6       crmd: (     graph.c:336   )   debug: run_graph: 	Transition 2 (Complete=0, Pending=4, Fired=5, Skipped=0, Incomplete=8, Source=/var/lib/pacemaker/pengine/pe-input-2.bz2): In-progress
Jan 15 15:38:48 [30795] bl460g1n6       crmd: (     graph.c:336   )   debug: run_graph: 	Transition 2 (Complete=1, Pending=4, Fired=0, Skipped=0, Incomplete=8, Source=/var/lib/pacemaker/pengine/pe-input-2.bz2): In-progress
Jan 15 15:38:49 [30792] bl460g1n6       lrmd: (services_lin:301   )   debug: operation_finished: 	prmPing_monitor_0:30841 - exited with rc=7
Jan 15 15:38:49 [30792] bl460g1n6       lrmd: (services_lin:306   )   debug: operation_finished: 	prmPing_monitor_0:30841:stderr [ -- empty -- ]
Jan 15 15:38:49 [30792] bl460g1n6       lrmd: (services_lin:310   )   debug: operation_finished: 	prmPing_monitor_0:30841:stdout [ -- empty -- ]
Jan 15 15:38:49 [30792] bl460g1n6       lrmd: (      lrmd.c:104   )   debug: log_finished: 	finished - rsc:prmPing action:monitor call_id:10 pid:30841 exit-code:7 exec-time:20ms queue-time:0ms
Jan 15 15:38:49 [30795] bl460g1n6       crmd: (     utils.c:2126  )   debug: create_operation_update: 	do_update_resource: Updating resource prmPing after monitor op complete (interval=0)
Jan 15 15:38:49 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=bl460g1n7/crmd/13, version=0.7.6)
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:167   )   debug: Config update: 	Diff: --- 0.7.5
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:169   )   debug: Config update: 	Diff: +++ 0.7.6 dd30cf7b530ba2944e8f56b0450b7e5c
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-- <cib num_updates="5"/>
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  <cib epoch="7" num_updates="6" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.8" cib-last-written="Wed Jan 15 15:38:46 2014" update-origin="bl460g1n6" update-client="crmd" have-quorum="1" dc-uuid="3232261592">
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    <status>
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      <node_state id="3232261593" uname="bl460g1n7" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        <lrm id="3232261593">
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+          <lrm_resources>
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++           <lrm_resource id="prmPing" type="ping" class="ocf" provider="pacemaker">
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++             <lrm_rsc_op id="prmPing_last_0" operation_key="prmPing_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.8" transition-key="8:2:7:be72ea63-75a9-4de4-a591-e716f960743b" transition-magic="0:7;8:2:7:be72ea63-75a9-4de4-a591-e716f960743b" call-id="10" rc-code="7" op-status="0" interval="0" last-run="1389767928" last-rc-change="1389767928" exec-time="15" queue-time="
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++           </lrm_resource>
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+          </lrm_resources>
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        </lrm>
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      </node_state>
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    </status>
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  </cib>
Jan 15 15:38:49 [30795] bl460g1n6       crmd: (services_lin:604   )    info: services_os_action_execute: 	Managed ping_meta-data_0 process 30852 exited with rc=0
Jan 15 15:38:49 [30795] bl460g1n6       crmd: (       lrm.c:2101  )  notice: process_lrm_event: 	LRM operation prmPing_monitor_0 (call=10, rc=7, cib-update=56, confirmed=true) not running
Jan 15 15:38:49 [30795] bl460g1n6       crmd: (       lrm.c:122   )   debug: update_history_cache: 	Updating history for 'prmPing' with monitor op
Jan 15 15:38:49 [30795] bl460g1n6       crmd: (te_callbacks:122   )   debug: te_update_diff: 	Processing diff (cib_modify): 0.7.5 -> 0.7.6 (S_TRANSITION_ENGINE)
Jan 15 15:38:49 [30795] bl460g1n6       crmd: ( te_events.c:375   )    info: match_graph_event: 	Action prmPing_monitor_0 (8) confirmed on bl460g1n7 (rc=0)
Jan 15 15:38:49 [30795] bl460g1n6       crmd: (     graph.c:336   )   debug: run_graph: 	Transition 2 (Complete=2, Pending=3, Fired=0, Skipped=0, Incomplete=8, Source=/var/lib/pacemaker/pengine/pe-input-2.bz2): In-progress
Jan 15 15:38:49 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/crmd/56, version=0.7.7)
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:167   )   debug: Config update: 	Diff: --- 0.7.6
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:169   )   debug: Config update: 	Diff: +++ 0.7.7 3078261952dc9a6d62c1dc1732309b3a
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-- <cib num_updates="6"/>
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  <cib epoch="7" num_updates="7" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.8" cib-last-written="Wed Jan 15 15:38:46 2014" update-origin="bl460g1n6" update-client="crmd" have-quorum="1" dc-uuid="3232261592">
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    <status>
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      <node_state id="3232261592" uname="bl460g1n6" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        <lrm id="3232261592">
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+          <lrm_resources>
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++           <lrm_resource id="prmPing" type="ping" class="ocf" provider="pacemaker">
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++             <lrm_rsc_op id="prmPing_last_0" operation_key="prmPing_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.8" transition-key="5:2:7:be72ea63-75a9-4de4-a591-e716f960743b" transition-magic="0:7;5:2:7:be72ea63-75a9-4de4-a591-e716f960743b" call-id="10" rc-code="7" op-status="0" interval="0" last-run="1389767928" last-rc-change="1389767928" exec-time="20" queue-time="
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++           </lrm_resource>
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+          </lrm_resources>
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        </lrm>
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      </node_state>
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    </status>
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  </cib>
Jan 15 15:38:49 [30795] bl460g1n6       crmd: (te_callbacks:122   )   debug: te_update_diff: 	Processing diff (cib_modify): 0.7.6 -> 0.7.7 (S_TRANSITION_ENGINE)
Jan 15 15:38:49 [30795] bl460g1n6       crmd: ( te_events.c:375   )    info: match_graph_event: 	Action prmPing_monitor_0 (5) confirmed on bl460g1n6 (rc=0)
Jan 15 15:38:49 [30795] bl460g1n6       crmd: (     graph.c:336   )   debug: run_graph: 	Transition 2 (Complete=3, Pending=2, Fired=0, Skipped=0, Incomplete=8, Source=/var/lib/pacemaker/pengine/pe-input-2.bz2): In-progress
VirtualDomain(prmVM2)[30840]:	2014/01/15_15:38:49 DEBUG: Virtual domain vm2 is currently error: failed to get domain 'vm2'
error: domain not found: no domain with matching name 'vm2'.
Jan 15 15:38:49 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=bl460g1n7/crmd/14, version=0.7.8)
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:167   )   debug: Config update: 	Diff: --- 0.7.7
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:169   )   debug: Config update: 	Diff: +++ 0.7.8 3a6aa35fde7b1ce804cdb9d387e49af0
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-- <cib num_updates="7"/>
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  <cib epoch="7" num_updates="8" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.8" cib-last-written="Wed Jan 15 15:38:46 2014" update-origin="bl460g1n6" update-client="crmd" have-quorum="1" dc-uuid="3232261592">
Jan 15 15:38:49 [30795] bl460g1n6       crmd: (te_callbacks:122   )   debug: te_update_diff: 	Processing diff (cib_modify): 0.7.7 -> 0.7.8 (S_TRANSITION_ENGINE)
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    <status>
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      <node_state id="3232261593" uname="bl460g1n7" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        <lrm id="3232261593">
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+          <lrm_resources>
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++           <lrm_resource id="prmVM2" type="VirtualDomain" class="ocf" provider="heartbeat">
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++             <lrm_rsc_op id="prmVM2_last_0" operation_key="prmVM2_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.8" transition-key="7:2:7:be72ea63-75a9-4de4-a591-e716f960743b" transition-magic="0:7;7:2:7:be72ea63-75a9-4de4-a591-e716f960743b" call-id="5" rc-code="7" op-status="0" interval="0" last-run="1389767928" last-rc-change="1389767928" exec-time="101" queue-time="0"
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++           </lrm_resource>
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+          </lrm_resources>
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        </lrm>
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      </node_state>
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    </status>
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  </cib>
Jan 15 15:38:49 [30795] bl460g1n6       crmd: ( te_events.c:375   )    info: match_graph_event: 	Action prmVM2_monitor_0 (7) confirmed on bl460g1n7 (rc=0)
Jan 15 15:38:49 [30795] bl460g1n6       crmd: (te_actions.c:416   )  notice: te_rsc_command: 	Initiating action 6: probe_complete probe_complete on bl460g1n7 - no waiting
Jan 15 15:38:49 [30795] bl460g1n6       crmd: (te_actions.c:454   )    info: te_rsc_command: 	Action 6 confirmed - no wait
Jan 15 15:38:49 [30795] bl460g1n6       crmd: (     graph.c:336   )   debug: run_graph: 	Transition 2 (Complete=4, Pending=1, Fired=1, Skipped=0, Incomplete=7, Source=/var/lib/pacemaker/pengine/pe-input-2.bz2): In-progress
Jan 15 15:38:49 [30795] bl460g1n6       crmd: (     graph.c:336   )   debug: run_graph: 	Transition 2 (Complete=5, Pending=1, Fired=0, Skipped=0, Incomplete=7, Source=/var/lib/pacemaker/pengine/pe-input-2.bz2): In-progress
Jan 15 15:38:49 [30792] bl460g1n6       lrmd: (services_lin:301   )   debug: operation_finished: 	prmVM2_monitor_0:30840 - exited with rc=7
Jan 15 15:38:49 [30792] bl460g1n6       lrmd: (services_lin:306   )  notice: operation_finished: 	prmVM2_monitor_0:30840:stderr [ error: failed to get domain 'vm2' ]
Jan 15 15:38:49 [30792] bl460g1n6       lrmd: (services_lin:306   )  notice: operation_finished: 	prmVM2_monitor_0:30840:stderr [ error: Domain not found: no domain with matching name 'vm2' ]
Jan 15 15:38:49 [30792] bl460g1n6       lrmd: (services_lin:306   )  notice: operation_finished: 	prmVM2_monitor_0:30840:stderr [ error: failed to get domain 'vm2' ]
Jan 15 15:38:49 [30792] bl460g1n6       lrmd: (services_lin:306   )  notice: operation_finished: 	prmVM2_monitor_0:30840:stderr [ error: Domain not found: no domain with matching name 'vm2' ]
Jan 15 15:38:49 [30792] bl460g1n6       lrmd: (services_lin:310   )   debug: operation_finished: 	prmVM2_monitor_0:30840:stdout [ -- empty -- ]
Jan 15 15:38:49 [30792] bl460g1n6       lrmd: (      lrmd.c:104   )   debug: log_finished: 	finished - rsc:prmVM2 action:monitor call_id:5 pid:30840 exit-code:7 exec-time:144ms queue-time:0ms
Jan 15 15:38:49 [30795] bl460g1n6       crmd: (     utils.c:2126  )   debug: create_operation_update: 	do_update_resource: Updating resource prmVM2 after monitor op complete (interval=0)
Jan 15 15:38:49 [30795] bl460g1n6       crmd: (services_lin:604   )    info: services_os_action_execute: 	Managed VirtualDomain_meta-data_0 process 30891 exited with rc=0
Jan 15 15:38:49 [30795] bl460g1n6       crmd: (       lrm.c:2101  )  notice: process_lrm_event: 	LRM operation prmVM2_monitor_0 (call=5, rc=7, cib-update=57, confirmed=true) not running
Jan 15 15:38:49 [30795] bl460g1n6       crmd: (       lrm.c:122   )   debug: update_history_cache: 	Updating history for 'prmVM2' with monitor op
Jan 15 15:38:49 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/crmd/57, version=0.7.9)
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:167   )   debug: Config update: 	Diff: --- 0.7.8
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:169   )   debug: Config update: 	Diff: +++ 0.7.9 b068e214b66849c847874c0038fb08d3
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-- <cib num_updates="8"/>
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  <cib epoch="7" num_updates="9" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.8" cib-last-written="Wed Jan 15 15:38:46 2014" update-origin="bl460g1n6" update-client="crmd" have-quorum="1" dc-uuid="3232261592">
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    <status>
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      <node_state id="3232261592" uname="bl460g1n6" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        <lrm id="3232261592">
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+          <lrm_resources>
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++           <lrm_resource id="prmVM2" type="VirtualDomain" class="ocf" provider="heartbeat">
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++             <lrm_rsc_op id="prmVM2_last_0" operation_key="prmVM2_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.8" transition-key="4:2:7:be72ea63-75a9-4de4-a591-e716f960743b" transition-magic="0:7;4:2:7:be72ea63-75a9-4de4-a591-e716f960743b" call-id="5" rc-code="7" op-status="0" interval="0" last-run="1389767928" last-rc-change="1389767928" exec-time="144" queue-time="0"
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++           </lrm_resource>
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+          </lrm_resources>
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        </lrm>
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      </node_state>
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    </status>
Jan 15 15:38:49 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  </cib>
Jan 15 15:38:49 [30795] bl460g1n6       crmd: (te_callbacks:122   )   debug: te_update_diff: 	Processing diff (cib_modify): 0.7.8 -> 0.7.9 (S_TRANSITION_ENGINE)
Jan 15 15:38:49 [30795] bl460g1n6       crmd: ( te_events.c:375   )    info: match_graph_event: 	Action prmVM2_monitor_0 (4) confirmed on bl460g1n6 (rc=0)
Jan 15 15:38:49 [30795] bl460g1n6       crmd: (te_actions.c:416   )  notice: te_rsc_command: 	Initiating action 3: probe_complete probe_complete on bl460g1n6 (local) - no waiting
Jan 15 15:38:49 [30795] bl460g1n6       crmd: (     utils.c:2023  )   debug: attrd_update_delegate: 	Sent update: probe_complete=true for bl460g1n6
Jan 15 15:38:49 [30795] bl460g1n6       crmd: (te_actions.c:454   )    info: te_rsc_command: 	Action 3 confirmed - no wait
Jan 15 15:38:49 [30795] bl460g1n6       crmd: (te_actions.c:55    )   debug: te_pseudo_action: 	Pseudo action 2 fired and confirmed
Jan 15 15:38:49 [30795] bl460g1n6       crmd: (     graph.c:336   )   debug: run_graph: 	Transition 2 (Complete=6, Pending=0, Fired=2, Skipped=0, Incomplete=5, Source=/var/lib/pacemaker/pengine/pe-input-2.bz2): In-progress
Jan 15 15:38:49 [30793] bl460g1n6      attrd: (  commands.c:244   )    info: attrd_client_message: 	Broadcasting probe_complete[bl460g1n6] = true (writer)
Jan 15 15:38:49 [30795] bl460g1n6       crmd: (te_actions.c:416   )  notice: te_rsc_command: 	Initiating action 9: start prmPing:0_start_0 on bl460g1n6 (local)
Jan 15 15:38:49 [30795] bl460g1n6       crmd: (       lrm.c:1780  )   debug: do_lrm_rsc_op: 	Stopped 0 recurring operations in preparation for prmPing_start_0
Jan 15 15:38:49 [30795] bl460g1n6       crmd: (       lrm.c:1784  )    info: do_lrm_rsc_op: 	Performing key=9:2:0:be72ea63-75a9-4de4-a591-e716f960743b op=prmPing_start_0
Jan 15 15:38:49 [30792] bl460g1n6       lrmd: (      lrmd.c:1313  )   debug: process_lrmd_message: 	Processed lrmd_rsc_exec operation from ea951299-8a4d-4fd6-8900-d6588e07ac38: rc=11, reply=1, notify=0, exit=4201864
Jan 15 15:38:49 [30792] bl460g1n6       lrmd: (      lrmd.c:122   )    info: log_execute: 	executing - rsc:prmPing action:start call_id:11
Jan 15 15:38:49 [30795] bl460g1n6       crmd: (te_actions.c:416   )  notice: te_rsc_command: 	Initiating action 11: start prmPing:1_start_0 on bl460g1n7
Jan 15 15:38:49 [30795] bl460g1n6       crmd: (     graph.c:336   )   debug: run_graph: 	Transition 2 (Complete=8, Pending=2, Fired=2, Skipped=0, Incomplete=3, Source=/var/lib/pacemaker/pengine/pe-input-2.bz2): In-progress
Jan 15 15:38:50 [30793] bl460g1n6      attrd: (  commands.c:630   )   debug: write_attributes: 	Skipping unchanged attribute shutdown
Jan 15 15:38:50 [30793] bl460g1n6      attrd: (  commands.c:739   )   debug: write_attribute: 	Update: bl460g1n7[default_ping_set]=100 (3232261593 3232261593 3232261593 bl460g1n7)
Jan 15 15:38:50 [30793] bl460g1n6      attrd: (  commands.c:765   )  notice: write_attribute: 	Sent update 9 with 1 changes for default_ping_set, id=<n/a>, set=(null)
Jan 15 15:38:50 [30793] bl460g1n6      attrd: (  commands.c:630   )   debug: write_attributes: 	Skipping unchanged attribute terminate
Jan 15 15:38:50 [30793] bl460g1n6      attrd: (  commands.c:630   )   debug: write_attributes: 	Skipping unchanged attribute probe_complete
Jan 15 15:38:50 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/attrd/9, version=0.7.10)
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:167   )   debug: Config update: 	Diff: --- 0.7.9
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:169   )   debug: Config update: 	Diff: +++ 0.7.10 550078b1a1268bff75d3a0d7907f101e
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-- <cib num_updates="9"/>
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  <cib epoch="7" num_updates="10" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.8" cib-last-written="Wed Jan 15 15:38:46 2014" update-origin="bl460g1n6" update-client="crmd" have-quorum="1" dc-uuid="3232261592">
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    <status>
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      <node_state id="3232261593" uname="bl460g1n7" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        <transient_attributes id="3232261593">
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+          <instance_attributes id="status-3232261593">
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++           <nvpair id="status-3232261593-default_ping_set" name="default_ping_set" value="100"/>
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+          </instance_attributes>
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        </transient_attributes>
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      </node_state>
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    </status>
Jan 15 15:38:50 [30795] bl460g1n6       crmd: (te_callbacks:122   )   debug: te_update_diff: 	Processing diff (cib_modify): 0.7.9 -> 0.7.10 (S_TRANSITION_ENGINE)
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  </cib>
Jan 15 15:38:50 [30795] bl460g1n6       crmd: (  te_utils.c:413   )    info: abort_transition_graph: 	te_update_diff:172 - Triggered transition abort (complete=0, node=bl460g1n7, tag=nvpair, id=status-3232261593-default_ping_set, name=default_ping_set, value=100, magic=NA, cib=0.7.10) : Transient attribute: update
Jan 15 15:38:50 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause   <nvpair id="status-3232261593-default_ping_set" name="default_ping_set" value="100" __crm_diff_marker__="added:top"/>
Jan 15 15:38:50 [30795] bl460g1n6       crmd: (     utils.c:271   )   debug: update_abort_priority: 	Abort priority upgraded from 0 to 1000000
Jan 15 15:38:50 [30795] bl460g1n6       crmd: (     utils.c:281   )   debug: update_abort_priority: 	Abort action done superceeded by restart
Jan 15 15:38:50 [30795] bl460g1n6       crmd: (     graph.c:336   )   debug: run_graph: 	Transition 2 (Complete=8, Pending=2, Fired=0, Skipped=2, Incomplete=1, Source=/var/lib/pacemaker/pengine/pe-input-2.bz2): In-progress
Jan 15 15:38:50 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=bl460g1n7/crmd/15, version=0.7.11)
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:167   )   debug: Config update: 	Diff: --- 0.7.10
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:169   )   debug: Config update: 	Diff: +++ 0.7.11 39a259cc792d185a0133a8aeed4aaa1d
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-  <cib num_updates="10">
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-    <status>
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-      <node_state id="3232261593">
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-        <lrm id="3232261593">
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-          <lrm_resources>
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-            <lrm_resource id="prmPing">
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	--             <lrm_rsc_op operation_key="prmPing_monitor_0" operation="monitor" transition-key="8:2:7:be72ea63-75a9-4de4-a591-e716f960743b" transition-magic="0:7;8:2:7:be72ea63-75a9-4de4-a591-e716f960743b" call-id="10" rc-code="7" last-run="1389767928" last-rc-change="1389767928" exec-time="15" id="prmPing_last_0"/>
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-            </lrm_resource>
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-          </lrm_resources>
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-        </lrm>
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-      </node_state>
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-    </status>
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-  </cib>
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  <cib epoch="7" num_updates="11" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.8" cib-last-written="Wed Jan 15 15:38:46 2014" update-origin="bl460g1n6" update-client="crmd" have-quorum="1" dc-uuid="3232261592">
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    <status>
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      <node_state id="3232261593" uname="bl460g1n7" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        <lrm id="3232261593">
Jan 15 15:38:50 [30793] bl460g1n6      attrd: (  commands.c:590   )    info: attrd_cib_callback: 	Update 9 for default_ping_set: OK (0)
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+          <lrm_resources>
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+            <lrm_resource id="prmPing" type="ping" class="ocf" provider="pacemaker">
Jan 15 15:38:50 [30793] bl460g1n6      attrd: (  commands.c:594   )  notice: attrd_cib_callback: 	Update 9 for default_ping_set[bl460g1n7]=100: OK (0)
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++             <lrm_rsc_op id="prmPing_last_0" operation_key="prmPing_start_0" operation="start" crm-debug-origin="do_update_resource" crm_feature_set="3.0.8" transition-key="11:2:0:be72ea63-75a9-4de4-a591-e716f960743b" transition-magic="0:0;11:2:0:be72ea63-75a9-4de4-a591-e716f960743b" call-id="11" rc-code="0" op-status="0" interval="0" last-run="1389767929" last-rc-change="1389767929" exec-time="1038" queue-time="
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+            </lrm_resource>
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+          </lrm_resources>
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        </lrm>
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      </node_state>
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    </status>
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  </cib>
Jan 15 15:38:50 [30795] bl460g1n6       crmd: (te_callbacks:122   )   debug: te_update_diff: 	Processing diff (cib_modify): 0.7.10 -> 0.7.11 (S_TRANSITION_ENGINE)
Jan 15 15:38:50 [30914] bl460g1n6 attrd_updater: (     utils.c:1995  )    info: attrd_update_delegate: 	Connecting to cluster... 5 retries remaining
Jan 15 15:38:50 [30795] bl460g1n6       crmd: ( te_events.c:375   )    info: match_graph_event: 	Action prmPing_start_0 (11) confirmed on bl460g1n7 (rc=0)
Jan 15 15:38:50 [30795] bl460g1n6       crmd: (     graph.c:336   )   debug: run_graph: 	Transition 2 (Complete=9, Pending=1, Fired=0, Skipped=2, Incomplete=1, Source=/var/lib/pacemaker/pengine/pe-input-2.bz2): In-progress
Jan 15 15:38:50 [30793] bl460g1n6      attrd: (       ipc.c:334   )    info: crm_client_new: 	Connecting 0x1754c30 for uid=0 gid=0 pid=30914 id=fafc8c1d-9410-4d8d-bfe2-3ed1dd64d2b1
Jan 15 15:38:50 [30793] bl460g1n6      attrd: ( ipc_setup.c:484   )   debug: handle_new_connection: 	IPC credentials authenticated (30793-30914-10)
Jan 15 15:38:50 [30793] bl460g1n6      attrd: (   ipc_shm.c:295   )   debug: qb_ipcs_shm_connect: 	connecting to client [30914]
Jan 15 15:38:50 [30793] bl460g1n6      attrd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Jan 15 15:38:50 [30793] bl460g1n6      attrd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Jan 15 15:38:50 [30793] bl460g1n6      attrd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Jan 15 15:38:50 [30914] bl460g1n6 attrd_updater: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Jan 15 15:38:50 [30914] bl460g1n6 attrd_updater: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Jan 15 15:38:50 [30914] bl460g1n6 attrd_updater: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Jan 15 15:38:50 [30914] bl460g1n6 attrd_updater: (     utils.c:2023  )   debug: attrd_update_delegate: 	Sent update: default_ping_set=100 for localhost
Jan 15 15:38:50 [30914] bl460g1n6 attrd_updater: (       xml.c:2719  )    info: crm_xml_cleanup: 	Cleaning up memory from libxml2
Jan 15 15:38:50 [30793] bl460g1n6      attrd: (  commands.c:244   )    info: attrd_client_message: 	Broadcasting default_ping_set[bl460g1n6] = 100 (writer)
Jan 15 15:38:50 [30793] bl460g1n6      attrd: (      ipcs.c:757   )   debug: qb_ipcs_dispatch_connection_request: 	HUP conn (30793-30914-10)
Jan 15 15:38:50 [30793] bl460g1n6      attrd: (      ipcs.c:605   )   debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(30793-30914-10) state:2
Jan 15 15:38:50 [30793] bl460g1n6      attrd: (       ipc.c:368   )    info: crm_client_destroy: 	Destroying 0 events
Jan 15 15:38:50 [30793] bl460g1n6      attrd: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-attrd-response-30793-30914-10-header
Jan 15 15:38:50 [30793] bl460g1n6      attrd: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-attrd-event-30793-30914-10-header
Jan 15 15:38:50 [30793] bl460g1n6      attrd: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-attrd-request-30793-30914-10-header
Jan 15 15:38:50 [30793] bl460g1n6      attrd: (  commands.c:630   )   debug: write_attributes: 	Skipping unchanged attribute shutdown
Jan 15 15:38:50 [30793] bl460g1n6      attrd: (  commands.c:739   )   debug: write_attribute: 	Update: bl460g1n6[default_ping_set]=100 (3232261592 3232261592 3232261592 bl460g1n6)
Jan 15 15:38:50 [30792] bl460g1n6       lrmd: (services_lin:301   )   debug: operation_finished: 	prmPing_start_0:30897 - exited with rc=0
Jan 15 15:38:50 [30793] bl460g1n6      attrd: (  commands.c:739   )   debug: write_attribute: 	Update: bl460g1n7[default_ping_set]=100 (3232261593 3232261593 3232261593 bl460g1n7)
Jan 15 15:38:50 [30792] bl460g1n6       lrmd: (services_lin:306   )   debug: operation_finished: 	prmPing_start_0:30897:stderr [ -- empty -- ]
Jan 15 15:38:50 [30792] bl460g1n6       lrmd: (services_lin:310   )   debug: operation_finished: 	prmPing_start_0:30897:stdout [ -- empty -- ]
Jan 15 15:38:50 [30792] bl460g1n6       lrmd: (      lrmd.c:104   )    info: log_finished: 	finished - rsc:prmPing action:start call_id:11 pid:30897 exit-code:0 exec-time:1052ms queue-time:0ms
Jan 15 15:38:50 [30793] bl460g1n6      attrd: (  commands.c:765   )  notice: write_attribute: 	Sent update 10 with 2 changes for default_ping_set, id=<n/a>, set=(null)
Jan 15 15:38:50 [30793] bl460g1n6      attrd: (  commands.c:630   )   debug: write_attributes: 	Skipping unchanged attribute terminate
Jan 15 15:38:50 [30793] bl460g1n6      attrd: (  commands.c:630   )   debug: write_attributes: 	Skipping unchanged attribute probe_complete
Jan 15 15:38:50 [30795] bl460g1n6       crmd: (     utils.c:2126  )   debug: create_operation_update: 	do_update_resource: Updating resource prmPing after start op complete (interval=0)
Jan 15 15:38:50 [30795] bl460g1n6       crmd: (       lrm.c:2101  )  notice: process_lrm_event: 	LRM operation prmPing_start_0 (call=11, rc=0, cib-update=58, confirmed=true) ok
Jan 15 15:38:50 [30795] bl460g1n6       crmd: (       lrm.c:122   )   debug: update_history_cache: 	Updating history for 'prmPing' with start op
Jan 15 15:38:50 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/attrd/10, version=0.7.12)
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:167   )   debug: Config update: 	Diff: --- 0.7.11
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:169   )   debug: Config update: 	Diff: +++ 0.7.12 0e4f2858099cfd92a9f49d12d7cb7c5a
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-- <cib num_updates="11"/>
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  <cib epoch="7" num_updates="12" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.8" cib-last-written="Wed Jan 15 15:38:46 2014" update-origin="bl460g1n6" update-client="crmd" have-quorum="1" dc-uuid="3232261592">
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    <status>
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      <node_state id="3232261592" uname="bl460g1n6" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        <transient_attributes id="3232261592">
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+          <instance_attributes id="status-3232261592">
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++           <nvpair id="status-3232261592-default_ping_set" name="default_ping_set" value="100"/>
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+          </instance_attributes>
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        </transient_attributes>
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      </node_state>
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    </status>
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  </cib>
Jan 15 15:38:50 [30795] bl460g1n6       crmd: (te_callbacks:122   )   debug: te_update_diff: 	Processing diff (cib_modify): 0.7.11 -> 0.7.12 (S_TRANSITION_ENGINE)
Jan 15 15:38:50 [30795] bl460g1n6       crmd: (  te_utils.c:413   )    info: abort_transition_graph: 	te_update_diff:172 - Triggered transition abort (complete=0, node=bl460g1n6, tag=nvpair, id=status-3232261592-default_ping_set, name=default_ping_set, value=100, magic=NA, cib=0.7.12) : Transient attribute: update
Jan 15 15:38:50 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause   <nvpair id="status-3232261592-default_ping_set" name="default_ping_set" value="100" __crm_diff_marker__="added:top"/>
Jan 15 15:38:50 [30795] bl460g1n6       crmd: (te_callbacks:122   )   debug: te_update_diff: 	Processing diff (cib_modify): 0.7.12 -> 0.7.13 (S_TRANSITION_ENGINE)
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:167   )   debug: Config update: 	Diff: --- 0.7.12
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:169   )   debug: Config update: 	Diff: +++ 0.7.13 fa919354f78a3b13cd9c4c7905c540d7
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-  <cib num_updates="12">
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-    <status>
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-      <node_state id="3232261592">
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-        <lrm id="3232261592">
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-          <lrm_resources>
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-            <lrm_resource id="prmPing">
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	--             <lrm_rsc_op operation_key="prmPing_monitor_0" operation="monitor" transition-key="5:2:7:be72ea63-75a9-4de4-a591-e716f960743b" transition-magic="0:7;5:2:7:be72ea63-75a9-4de4-a591-e716f960743b" call-id="10" rc-code="7" last-run="1389767928" last-rc-change="1389767928" exec-time="20" id="prmPing_last_0"/>
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-            </lrm_resource>
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-          </lrm_resources>
Jan 15 15:38:50 [30795] bl460g1n6       crmd: ( te_events.c:375   )    info: match_graph_event: 	Action prmPing_start_0 (9) confirmed on bl460g1n6 (rc=0)
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-        </lrm>
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-      </node_state>
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-    </status>
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-  </cib>
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  <cib epoch="7" num_updates="13" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.8" cib-last-written="Wed Jan 15 15:38:46 2014" update-origin="bl460g1n6" update-client="crmd" have-quorum="1" dc-uuid="3232261592">
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    <status>
Jan 15 15:38:50 [30795] bl460g1n6       crmd: (te_actions.c:55    )   debug: te_pseudo_action: 	Pseudo action 14 fired and confirmed
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      <node_state id="3232261592" uname="bl460g1n6" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Jan 15 15:38:50 [30795] bl460g1n6       crmd: (     graph.c:336   )   debug: run_graph: 	Transition 2 (Complete=10, Pending=0, Fired=1, Skipped=2, Incomplete=0, Source=/var/lib/pacemaker/pengine/pe-input-2.bz2): In-progress
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        <lrm id="3232261592">
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+          <lrm_resources>
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+            <lrm_resource id="prmPing" type="ping" class="ocf" provider="pacemaker">
Jan 15 15:38:50 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/crmd/58, version=0.7.13)
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++             <lrm_rsc_op id="prmPing_last_0" operation_key="prmPing_start_0" operation="start" crm-debug-origin="do_update_resource" crm_feature_set="3.0.8" transition-key="9:2:0:be72ea63-75a9-4de4-a591-e716f960743b" transition-magic="0:0;9:2:0:be72ea63-75a9-4de4-a591-e716f960743b" call-id="11" rc-code="0" op-status="0" interval="0" last-run="1389767929" last-rc-change="1389767929" exec-time="1052" queue-time="0"
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+            </lrm_resource>
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+          </lrm_resources>
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        </lrm>
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      </node_state>
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    </status>
Jan 15 15:38:50 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  </cib>
Jan 15 15:38:50 [30795] bl460g1n6       crmd: (     graph.c:336   )  notice: run_graph: 	Transition 2 (Complete=11, Pending=0, Fired=0, Skipped=2, Incomplete=0, Source=/var/lib/pacemaker/pengine/pe-input-2.bz2): Stopped
Jan 15 15:38:50 [30795] bl460g1n6       crmd: (  te_utils.c:355   )   debug: te_graph_trigger: 	Transition 2 is now complete
Jan 15 15:38:50 [30795] bl460g1n6       crmd: (te_actions.c:654   )   debug: notify_crmd: 	Processing transition completion in state S_TRANSITION_ENGINE
Jan 15 15:38:50 [30795] bl460g1n6       crmd: (     utils.c:192   )   debug: crm_timer_start: 	Started New Transition Timer (I_PE_CALC:2000ms), src=77
Jan 15 15:38:50 [30795] bl460g1n6       crmd: (te_actions.c:699   )   debug: notify_crmd: 	Transition 2 status: restart - Transient attribute: update
Jan 15 15:38:50 [30793] bl460g1n6      attrd: (  commands.c:590   )    info: attrd_cib_callback: 	Update 10 for default_ping_set: OK (0)
Jan 15 15:38:50 [30793] bl460g1n6      attrd: (  commands.c:594   )  notice: attrd_cib_callback: 	Update 10 for default_ping_set[bl460g1n6]=100: OK (0)
Jan 15 15:38:50 [30793] bl460g1n6      attrd: (  commands.c:594   )  notice: attrd_cib_callback: 	Update 10 for default_ping_set[bl460g1n7]=100: OK (0)
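
The default_ping_set writes above come from the ping clone (clnPing/prmPing, ocf:pacemaker:ping) that this transition just started: the agent pushes its score through attrd, and attrd commits it to the status section as the transient attribute visible in the 0.7.10 and 0.7.12 diffs. A minimal resource definition consistent with these messages is sketched below; the host_list and multiplier values are illustrative assumptions only (the log shows just the attribute name default_ping_set, its final value 100, and the 10s monitor), not values taken from this cluster:

    <clone id="clnPing">
      <primitive id="prmPing" class="ocf" provider="pacemaker" type="ping">
        <instance_attributes id="prmPing-instance_attributes">
          <nvpair id="prmPing-name" name="name" value="default_ping_set"/>
          <!-- host_list and multiplier below are assumed for illustration -->
          <nvpair id="prmPing-host_list" name="host_list" value="192.168.0.1"/>
          <nvpair id="prmPing-multiplier" name="multiplier" value="100"/>
        </instance_attributes>
        <operations>
          <op id="prmPing-monitor-10s" name="monitor" interval="10s"/>
        </operations>
      </primitive>
    </clone>
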
Jan 15 15:38:52 [30795] bl460g1n6       crmd: (     utils.c:120   )    info: crm_timer_popped: 	New Transition Timer (I_PE_CALC) just popped (2000ms)
Jan 15 15:38:52 [30795] bl460g1n6       crmd: (       fsa.c:193   )   debug: s_crmd_fsa: 	Processing I_PE_CALC: [ state=S_TRANSITION_ENGINE cause=C_TIMER_POPPED origin=crm_timer_popped ]
Jan 15 15:38:52 [30795] bl460g1n6       crmd: (       fsa.c:502   )    info: do_state_transition: 	State transition S_TRANSITION_ENGINE -> S_POLICY_ENGINE [ input=I_PE_CALC cause=C_TIMER_POPPED origin=crm_timer_popped ]
Jan 15 15:38:52 [30795] bl460g1n6       crmd: (       fsa.c:599   )    info: do_state_transition: 	Progressed to state S_POLICY_ENGINE after C_TIMER_POPPED
Jan 15 15:38:52 [30795] bl460g1n6       crmd: (       fsa.c:610   )   debug: do_state_transition: 	All 2 cluster nodes are eligible to run resources.
Jan 15 15:38:52 [30795] bl460g1n6       crmd: (   pengine.c:231   )   debug: do_pe_invoke: 	Query 59: Requesting the current CIB: S_POLICY_ENGINE
Jan 15 15:38:52 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crmd/59, version=0.7.13)
Jan 15 15:38:52 [30795] bl460g1n6       crmd: (   pengine.c:299   )   debug: do_pe_invoke_callback: 	Invoking the PE: query=59, ref=pe_calc-dc-1389767932-38, seq=8, quorate=1
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (    unpack.c:93    )   debug: unpack_config: 	STONITH timeout: 60000
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (    unpack.c:97    )   debug: unpack_config: 	STONITH of failed nodes is disabled
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (    unpack.c:104   )   debug: unpack_config: 	Stop all active resources: false
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (    unpack.c:108   )   debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (    unpack.c:113   )   debug: unpack_config: 	Default stickiness: 0
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (    unpack.c:156   )  notice: unpack_config: 	On loss of CCM Quorum: Ignore
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (    unpack.c:196   )   debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (    unpack.c:504   )   debug: unpack_domains: 	Unpacking domains
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (    unpack.c:1331  )    info: determine_online_status: 	Node bl460g1n6 is online
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (    unpack.c:1331  )    info: determine_online_status: 	Node bl460g1n7 is online
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (    unpack.c:1553  )   debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n6 to prmPing:0
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmPing_start_0/start (call_id=11, status=0, rc=0) on bl460g1n6 (role=Unknown)
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmPing:0/start completed on bl460g1n6
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmPing:0 after start: role=Started
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmVM2_monitor_0/monitor (call_id=5, status=0, rc=7) on bl460g1n6 (role=Unknown)
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmVM2/monitor completed on bl460g1n6
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmVM2 after monitor: role=Stopped
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (    unpack.c:1553  )   debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n7 to prmPing:1
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmPing_start_0/start (call_id=11, status=0, rc=0) on bl460g1n7 (role=Unknown)
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmPing:1/start completed on bl460g1n7
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmPing:1 after start: role=Started
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmVM2_monitor_0/monitor (call_id=5, status=0, rc=7) on bl460g1n7 (role=Unknown)
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmVM2/monitor completed on bl460g1n7
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmVM2 after monitor: role=Stopped
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (    native.c:446   )    info: native_print: 	prmVM2	(ocf::heartbeat:VirtualDomain):	Stopped 
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (     clone.c:417   )    info: clone_print: 	 Clone Set: clnPing [prmPing]
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (    native.c:272   )   debug: native_active: 	Resource prmPing:0 active on bl460g1n6
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (    native.c:272   )   debug: native_active: 	Resource prmPing:0 active on bl460g1n6
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (    native.c:272   )   debug: native_active: 	Resource prmPing:1 active on bl460g1n7
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (    native.c:272   )   debug: native_active: 	Resource prmPing:1 active on bl460g1n7
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (     clone.c:311   )    info: short_print: 	     Started: [ bl460g1n6 bl460g1n7 ]
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (  allocate.c:593   )   debug: common_apply_stickiness: 	Resource prmPing:0: preferring current location (node=bl460g1n6, weight=1000000)
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (  allocate.c:593   )   debug: common_apply_stickiness: 	Resource prmPing:1: preferring current location (node=bl460g1n7, weight=1000000)
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (     utils.c:386   )   debug: native_assign_node: 	Assigning bl460g1n6 to prmPing:0
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (     utils.c:386   )   debug: native_assign_node: 	Assigning bl460g1n7 to prmPing:1
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (     clone.c:625   )   debug: clone_color: 	Allocated 2 clnPing instances of a possible 2
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (     utils.c:386   )   debug: native_assign_node: 	Assigning bl460g1n6 to prmVM2
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (    native.c:790   )    info: RecurringOp: 	 Start recurring monitor (10s) for prmVM2 on bl460g1n6
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (    native.c:790   )    info: RecurringOp: 	 Start recurring monitor (10s) for prmPing:0 on bl460g1n6
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (    native.c:790   )    info: RecurringOp: 	 Start recurring monitor (10s) for prmPing:1 on bl460g1n7
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (    native.c:2238  )  notice: LogActions: 	Start   prmVM2	(bl460g1n6)
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (    native.c:2141  )    info: LogActions: 	Leave   prmPing:0	(Started bl460g1n6)
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (    native.c:2141  )    info: LogActions: 	Leave   prmPing:1	(Started bl460g1n7)
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (     graph.c:769   )   debug: action2xml: 	Using anonymous clone name prmPing for prmPing:0 (aka. prmPing)
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (     graph.c:769   )   debug: action2xml: 	Using anonymous clone name prmPing for prmPing:1 (aka. prmPing)
Jan 15 15:38:52 [30794] bl460g1n6    pengine: (   pengine.c:178   )  notice: process_pe_message: 	Calculated Transition 3: /var/lib/pacemaker/pengine/pe-input-3.bz2
Jan 15 15:38:52 [30795] bl460g1n6       crmd: (       fsa.c:193   )   debug: s_crmd_fsa: 	Processing I_PE_SUCCESS: [ state=S_POLICY_ENGINE cause=C_IPC_MESSAGE origin=handle_response ]
Jan 15 15:38:52 [30795] bl460g1n6       crmd: (       fsa.c:502   )    info: do_state_transition: 	State transition S_POLICY_ENGINE -> S_TRANSITION_ENGINE [ input=I_PE_SUCCESS cause=C_IPC_MESSAGE origin=handle_response ]
Jan 15 15:38:52 [30795] bl460g1n6       crmd: (    unpack.c:230   )   debug: unpack_graph: 	Unpacked transition 3: 4 actions in 4 synapses
Jan 15 15:38:52 [30795] bl460g1n6       crmd: (   tengine.c:208   )    info: do_te_invoke: 	Processing graph 3 (ref=pe_calc-dc-1389767932-38) derived from /var/lib/pacemaker/pengine/pe-input-3.bz2
Jan 15 15:38:52 [30795] bl460g1n6       crmd: (te_actions.c:416   )  notice: te_rsc_command: 	Initiating action 5: start prmVM2_start_0 on bl460g1n6 (local)
Jan 15 15:38:52 [30795] bl460g1n6       crmd: (       lrm.c:1780  )   debug: do_lrm_rsc_op: 	Stopped 0 recurring operations in preparation for prmVM2_start_0
Jan 15 15:38:52 [30795] bl460g1n6       crmd: (       lrm.c:1784  )    info: do_lrm_rsc_op: 	Performing key=5:3:0:be72ea63-75a9-4de4-a591-e716f960743b op=prmVM2_start_0
Jan 15 15:38:52 [30792] bl460g1n6       lrmd: (      lrmd.c:1313  )   debug: process_lrmd_message: 	Processed lrmd_rsc_exec operation from ea951299-8a4d-4fd6-8900-d6588e07ac38: rc=12, reply=1, notify=0, exit=4201864
Jan 15 15:38:52 [30792] bl460g1n6       lrmd: (      lrmd.c:122   )    info: log_execute: 	executing - rsc:prmVM2 action:start call_id:12
Jan 15 15:38:52 [30795] bl460g1n6       crmd: (te_actions.c:416   )  notice: te_rsc_command: 	Initiating action 9: monitor prmPing_monitor_10000 on bl460g1n6 (local)
Jan 15 15:38:52 [30795] bl460g1n6       crmd: (       lrm.c:1784  )    info: do_lrm_rsc_op: 	Performing key=9:3:0:be72ea63-75a9-4de4-a591-e716f960743b op=prmPing_monitor_10000
Jan 15 15:38:52 [30792] bl460g1n6       lrmd: (      lrmd.c:1313  )   debug: process_lrmd_message: 	Processed lrmd_rsc_exec operation from ea951299-8a4d-4fd6-8900-d6588e07ac38: rc=13, reply=1, notify=0, exit=4201864
Jan 15 15:38:52 [30792] bl460g1n6       lrmd: (      lrmd.c:122   )   debug: log_execute: 	executing - rsc:prmPing action:monitor call_id:13
Jan 15 15:38:52 [30795] bl460g1n6       crmd: (te_actions.c:416   )  notice: te_rsc_command: 	Initiating action 12: monitor prmPing_monitor_10000 on bl460g1n7
Jan 15 15:38:52 [30795] bl460g1n6       crmd: (     graph.c:336   )   debug: run_graph: 	Transition 3 (Complete=0, Pending=3, Fired=3, Skipped=0, Incomplete=1, Source=/var/lib/pacemaker/pengine/pe-input-3.bz2): In-progress
VirtualDomain(prmVM2)[30915]:	2014/01/15_15:38:52 DEBUG: Virtual domain vm2 is currently error: failed to get domain 'vm2'
error: domain not found: no domain with matching name 'vm2'.
VirtualDomain(prmVM2)[30915]:	2014/01/15_15:38:53 DEBUG: Virtual domain vm2 is currently running.
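
prmVM2, the domain "vm2" being probed and started here, is an ocf:heartbeat:VirtualDomain primitive with a 10-second recurring monitor (per the RecurringOp line in the pengine output above); the earlier rc-code=7 probes simply mean "not running", and the virsh "Domain not found" messages on stderr are the expected result of probing a guest that is not currently defined or running on that node. A definition consistent with these messages could look roughly like the sketch below; the config path and hypervisor URI are illustrative assumptions, not values taken from this log:

    <primitive id="prmVM2" class="ocf" provider="heartbeat" type="VirtualDomain">
      <instance_attributes id="prmVM2-instance_attributes">
        <!-- config path and hypervisor URI are assumed for illustration -->
        <nvpair id="prmVM2-config" name="config" value="/etc/libvirt/qemu/vm2.xml"/>
        <nvpair id="prmVM2-hypervisor" name="hypervisor" value="qemu:///system"/>
      </instance_attributes>
      <operations>
        <op id="prmVM2-monitor-10s" name="monitor" interval="10s"/>
      </operations>
    </primitive>
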
Jan 15 15:38:53 [30790] bl460g1n6        cib: (       ipc.c:334   )    info: crm_client_new: 	Connecting 0x1a3ddf0 for uid=0 gid=0 pid=31074 id=93d5c794-ee67-457d-af2f-b69a6c57c5ba
Jan 15 15:38:53 [30790] bl460g1n6        cib: ( ipc_setup.c:484   )   debug: handle_new_connection: 	IPC credentials authenticated (30790-31074-14)
Jan 15 15:38:53 [30790] bl460g1n6        cib: (   ipc_shm.c:295   )   debug: qb_ipcs_shm_connect: 	connecting to client [31074]
Jan 15 15:38:53 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:53 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:53 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:53 [31074] bl460g1n6 crm_resource: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:53 [31074] bl460g1n6 crm_resource: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:53 [31074] bl460g1n6 crm_resource: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:53 [31074] bl460g1n6 crm_resource: (cib_native.c:268   )   debug: cib_native_signon_raw: 	Connection to CIB successful
Jan 15 15:38:53 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crm_resource/2, version=0.7.13)
Jan 15 15:38:53 [31074] bl460g1n6 crm_resource: (    unpack.c:93    )   debug: unpack_config: 	STONITH timeout: 60000
Jan 15 15:38:53 [31074] bl460g1n6 crm_resource: (    unpack.c:97    )   debug: unpack_config: 	STONITH of failed nodes is disabled
Jan 15 15:38:53 [31074] bl460g1n6 crm_resource: (    unpack.c:104   )   debug: unpack_config: 	Stop all active resources: false
Jan 15 15:38:53 [31074] bl460g1n6 crm_resource: (    unpack.c:108   )   debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Jan 15 15:38:53 [31074] bl460g1n6 crm_resource: (    unpack.c:113   )   debug: unpack_config: 	Default stickiness: 0
Jan 15 15:38:53 [31074] bl460g1n6 crm_resource: (    unpack.c:156   )  notice: unpack_config: 	On loss of CCM Quorum: Ignore
Jan 15 15:38:53 [31074] bl460g1n6 crm_resource: (    unpack.c:196   )   debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Jan 15 15:38:53 [31074] bl460g1n6 crm_resource: (    unpack.c:413   ) warning: handle_startup_fencing: 	Blind faith: not fencing unseen nodes
Jan 15 15:38:53 [31074] bl460g1n6 crm_resource: (    unpack.c:504   )   debug: unpack_domains: 	Unpacking domains
Jan 15 15:38:53 [31074] bl460g1n6 crm_resource: (    unpack.c:1331  )    info: determine_online_status: 	Node bl460g1n6 is online
Jan 15 15:38:53 [31074] bl460g1n6 crm_resource: (    unpack.c:1331  )    info: determine_online_status: 	Node bl460g1n7 is online
Jan 15 15:38:53 [31074] bl460g1n6 crm_resource: (    unpack.c:1553  )   debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n6 to prmPing:0
Jan 15 15:38:53 [31074] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmPing_start_0/start (call_id=11, status=0, rc=0) on bl460g1n6 (role=Unknown)
Jan 15 15:38:53 [31074] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:38:53 [31074] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmPing:0/start completed on bl460g1n6
Jan 15 15:38:53 [31074] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmPing:0 after start: role=Started
Jan 15 15:38:53 [31074] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmVM2_monitor_0/monitor (call_id=5, status=0, rc=7) on bl460g1n6 (role=Unknown)
Jan 15 15:38:53 [31074] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:38:53 [31074] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmVM2/monitor completed on bl460g1n6
Jan 15 15:38:53 [31074] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmVM2 after monitor: role=Stopped
Jan 15 15:38:53 [31074] bl460g1n6 crm_resource: (    unpack.c:1553  )   debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n7 to prmPing:1
Jan 15 15:38:53 [31074] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmPing_start_0/start (call_id=11, status=0, rc=0) on bl460g1n7 (role=Unknown)
Jan 15 15:38:53 [31074] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:38:53 [31074] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmPing:1/start completed on bl460g1n7
Jan 15 15:38:53 [31074] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmPing:1 after start: role=Started
Jan 15 15:38:53 [31074] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmVM2_monitor_0/monitor (call_id=5, status=0, rc=7) on bl460g1n7 (role=Unknown)
Jan 15 15:38:53 [31074] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:38:53 [31074] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmVM2/monitor completed on bl460g1n7
Jan 15 15:38:53 [31074] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmVM2 after monitor: role=Stopped
Jan 15 15:38:53 [31074] bl460g1n6 crm_resource: (crm_resource:382   )   debug: dump_resource_attr: 	Looking up cpu in prmVM2
Jan 15 15:38:53 [31074] bl460g1n6 crm_resource: (cib_native.c:282   )   debug: cib_native_signoff: 	Signing out of the CIB Service
Jan 15 15:38:53 [31074] bl460g1n6 crm_resource: (      ipcc.c:378   )   debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Jan 15 15:38:53 [30790] bl460g1n6        cib: (      ipcs.c:757   )   debug: qb_ipcs_dispatch_connection_request: 	HUP conn (30790-31074-14)
Jan 15 15:38:53 [31074] bl460g1n6 crm_resource: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-request-30790-31074-14-header
Jan 15 15:38:53 [30790] bl460g1n6        cib: (      ipcs.c:605   )   debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(30790-31074-14) state:2
Jan 15 15:38:53 [31074] bl460g1n6 crm_resource: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-response-30790-31074-14-header
Jan 15 15:38:53 [31074] bl460g1n6 crm_resource: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-event-30790-31074-14-header
Jan 15 15:38:53 [30790] bl460g1n6        cib: (       ipc.c:368   )    info: crm_client_destroy: 	Destroying 0 events
Jan 15 15:38:53 [31074] bl460g1n6 crm_resource: (crm_resource:2228  ) warning: main: 	Error performing operation: No such device or address
Jan 15 15:38:53 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-30790-31074-14-header
Jan 15 15:38:53 [31074] bl460g1n6 crm_resource: (       xml.c:2719  )    info: crm_xml_cleanup: 	Cleaning up memory from libxml2
Jan 15 15:38:53 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-30790-31074-14-header
Jan 15 15:38:53 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-30790-31074-14-header
Jan 15 15:38:53 [30790] bl460g1n6        cib: (       ipc.c:334   )    info: crm_client_new: 	Connecting 0x1a3ddf0 for uid=0 gid=0 pid=31076 id=7c36c693-1648-466a-b5fa-14acdb1b3450
Jan 15 15:38:53 [30790] bl460g1n6        cib: ( ipc_setup.c:484   )   debug: handle_new_connection: 	IPC credentials authenticated (30790-31076-14)
Jan 15 15:38:53 [30790] bl460g1n6        cib: (   ipc_shm.c:295   )   debug: qb_ipcs_shm_connect: 	connecting to client [31076]
Jan 15 15:38:53 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:53 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:53 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:53 [31076] bl460g1n6 crm_resource: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:53 [31076] bl460g1n6 crm_resource: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:53 [31076] bl460g1n6 crm_resource: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:53 [31076] bl460g1n6 crm_resource: (cib_native.c:268   )   debug: cib_native_signon_raw: 	Connection to CIB successful
Jan 15 15:38:53 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crm_resource/2, version=0.7.13)
Jan 15 15:38:53 [31076] bl460g1n6 crm_resource: (    unpack.c:93    )   debug: unpack_config: 	STONITH timeout: 60000
Jan 15 15:38:53 [31076] bl460g1n6 crm_resource: (    unpack.c:97    )   debug: unpack_config: 	STONITH of failed nodes is disabled
Jan 15 15:38:53 [31076] bl460g1n6 crm_resource: (    unpack.c:104   )   debug: unpack_config: 	Stop all active resources: false
Jan 15 15:38:53 [31076] bl460g1n6 crm_resource: (    unpack.c:108   )   debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Jan 15 15:38:53 [31076] bl460g1n6 crm_resource: (    unpack.c:113   )   debug: unpack_config: 	Default stickiness: 0
Jan 15 15:38:53 [31076] bl460g1n6 crm_resource: (    unpack.c:156   )  notice: unpack_config: 	On loss of CCM Quorum: Ignore
Jan 15 15:38:53 [31076] bl460g1n6 crm_resource: (    unpack.c:196   )   debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Jan 15 15:38:53 [31076] bl460g1n6 crm_resource: (    unpack.c:413   ) warning: handle_startup_fencing: 	Blind faith: not fencing unseen nodes
Jan 15 15:38:53 [31076] bl460g1n6 crm_resource: (    unpack.c:504   )   debug: unpack_domains: 	Unpacking domains
Jan 15 15:38:53 [31076] bl460g1n6 crm_resource: (    unpack.c:1331  )    info: determine_online_status: 	Node bl460g1n6 is online
Jan 15 15:38:53 [31076] bl460g1n6 crm_resource: (    unpack.c:1331  )    info: determine_online_status: 	Node bl460g1n7 is online
Jan 15 15:38:53 [31076] bl460g1n6 crm_resource: (    unpack.c:1553  )   debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n6 to prmPing:0
Jan 15 15:38:53 [31076] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmPing_start_0/start (call_id=11, status=0, rc=0) on bl460g1n6 (role=Unknown)
Jan 15 15:38:53 [31076] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:38:53 [31076] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmPing:0/start completed on bl460g1n6
Jan 15 15:38:53 [31076] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmPing:0 after start: role=Started
Jan 15 15:38:53 [31076] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmVM2_monitor_0/monitor (call_id=5, status=0, rc=7) on bl460g1n6 (role=Unknown)
Jan 15 15:38:53 [31076] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:38:53 [31076] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmVM2/monitor completed on bl460g1n6
Jan 15 15:38:53 [31076] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmVM2 after monitor: role=Stopped
Jan 15 15:38:53 [31076] bl460g1n6 crm_resource: (    unpack.c:1553  )   debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n7 to prmPing:1
Jan 15 15:38:53 [31076] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmPing_start_0/start (call_id=11, status=0, rc=0) on bl460g1n7 (role=Unknown)
Jan 15 15:38:53 [31076] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:38:53 [31076] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmPing:1/start completed on bl460g1n7
Jan 15 15:38:53 [31076] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmPing:1 after start: role=Started
Jan 15 15:38:53 [31076] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmVM2_monitor_0/monitor (call_id=5, status=0, rc=7) on bl460g1n7 (role=Unknown)
Jan 15 15:38:53 [31076] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:38:53 [31076] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmVM2/monitor completed on bl460g1n7
Jan 15 15:38:53 [31076] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmVM2 after monitor: role=Stopped
Jan 15 15:38:53 [30790] bl460g1n6        cib: (   cib_ops.c:905   )   debug: cib_process_xpath: 	cib_query: //cib/configuration/resources//*[@id="prmVM2"]/utilization//nvpair[@name="cpu"] does not exist
Jan 15 15:38:53 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section //cib/configuration/resources//*[@id="prmVM2"]/utilization//nvpair[@name="cpu"]: No such device or address (rc=-6, origin=local/crm_resource/3, version=0.7.13)
Jan 15 15:38:53 [30790] bl460g1n6        cib: (   cib_ops.c:923   )   debug: cib_process_xpath: 	Processing cib_query op for /cib (/cib)
Jan 15 15:38:53 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section /cib: OK (rc=0, origin=local/crm_resource/4, version=0.7.13)
Jan 15 15:38:53 [31076] bl460g1n6 crm_resource: (crm_resource:561   )   debug: set_resource_attr: 	Update   <primitive id="prmVM2">
Jan 15 15:38:53 [31076] bl460g1n6 crm_resource: (crm_resource:561   )   debug: set_resource_attr: 	Update     <utilization id="prmVM2-utilization">
Jan 15 15:38:53 [31076] bl460g1n6 crm_resource: (crm_resource:561   )   debug: set_resource_attr: 	Update       <nvpair id="prmVM2-utilization-cpu" name="cpu" value="1"/>
Jan 15 15:38:53 [31076] bl460g1n6 crm_resource: (crm_resource:561   )   debug: set_resource_attr: 	Update     </utilization>
Jan 15 15:38:53 [31076] bl460g1n6 crm_resource: (crm_resource:561   )   debug: set_resource_attr: 	Update   </primitive>
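The five set_resource_attr lines above show crm_resource (pid 31076) adding a utilization attribute cpu=1 to the prmVM2 primitive. The command that produced them is not captured in the log; assuming the crm_resource options of this Pacemaker 1.1 generation, it was something along the lines of:

    # assumed invocation (not recorded in the log): write a utilization attribute on prmVM2
    crm_resource --resource prmVM2 --utilization --set-parameter cpu --parameter-value 1

crm_resource first checks whether the nvpair already exists (the "does not exist" cib_query above), then submits a cib_modify for the resources section; the resulting 0.7.13 -> 0.8.1 diff is replicated to crmd and stonith-ng in the lines that follow.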
Jan 15 15:38:53 [30790] bl460g1n6        cib: (        io.c:596   )   debug: activateCibXml: 	Triggering CIB write for cib_modify op
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (te_callbacks:122   )   debug: te_update_diff: 	Processing diff (cib_modify): 0.7.13 -> 0.8.1 (S_TRANSITION_ENGINE)
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (  te_utils.c:419   )    info: abort_transition_graph: 	te_update_diff:126 - Triggered transition abort (complete=0, node=, tag=diff, id=(null), magic=NA, cib=0.8.1) : Non-status change
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause   <diff crm_feature_set="3.0.8" digest="e48690591e91d21f853e91166a9db940">
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause     <diff-removed admin_epoch="0" epoch="7" num_updates="13">
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause       <cib admin_epoch="0" epoch="7" num_updates="13"/>
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause     </diff-removed>
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause     <diff-added>
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause       <cib epoch="8" num_updates="1" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.8" cib-last-written="Wed Jan 15 15:38:53 2014" update-origin="bl460g1n6" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause         <configuration>
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause           <resources>
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause             <primitive id="prmVM2" class="ocf" provider="heartbeat" type="VirtualDomain">
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause               <utilization id="prmVM2-utilization" __crm_diff_marker__="added:top">
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause                 <nvpair id="prmVM2-utilization-cpu" name="cpu" value="1"/>
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause               </utilization>
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause             </primitive>
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause           </resources>
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause         </configuration>
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause       </cib>
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause     </diff-added>
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause   </diff>
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (     utils.c:271   )   debug: update_abort_priority: 	Abort priority upgraded from 0 to 1000000
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (     utils.c:281   )   debug: update_abort_priority: 	Abort action done superceeded by restart
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (     graph.c:336   )   debug: run_graph: 	Transition 3 (Complete=0, Pending=3, Fired=0, Skipped=1, Incomplete=0, Source=/var/lib/pacemaker/pengine/pe-input-3.bz2): In-progress
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:167   )   debug: Config update: 	Diff: --- 0.7.13
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:169   )   debug: Config update: 	Diff: +++ 0.8.1 e48690591e91d21f853e91166a9db940
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-- <cib admin_epoch="0" epoch="7" num_updates="13"/>
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  <cib epoch="8" num_updates="1" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.8" cib-last-written="Wed Jan 15 15:38:53 2014" update-origin="bl460g1n6" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    <configuration>
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      <resources>
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        <primitive id="prmVM2" class="ocf" provider="heartbeat" type="VirtualDomain">
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++         <utilization id="prmVM2-utilization">
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++           <nvpair id="prmVM2-utilization-cpu" name="cpu" value="1"/>
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++         </utilization>
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        </primitive>
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      </resources>
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    </configuration>
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  </cib>
Jan 15 15:38:53 [30790] bl460g1n6        cib: ( cib_utils.c:167   )  notice: cib:diff: 	Diff: --- 0.7.13
Jan 15 15:38:53 [30790] bl460g1n6        cib: ( cib_utils.c:169   )  notice: cib:diff: 	Diff: +++ 0.8.1 e48690591e91d21f853e91166a9db940
Jan 15 15:38:53 [30790] bl460g1n6        cib: (       xml.c:1496  )  notice: cib:diff: 	-- <cib admin_epoch="0" epoch="7" num_updates="13"/>
Jan 15 15:38:53 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++         <utilization id="prmVM2-utilization">
Jan 15 15:38:53 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++           <nvpair id="prmVM2-utilization-cpu" name="cpu" value="1"/>
Jan 15 15:38:53 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++         </utilization>
Jan 15 15:38:53 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section resources: OK (rc=0, origin=local/crm_resource/5, version=0.8.1)
Jan 15 15:38:53 [31076] bl460g1n6 crm_resource: (cib_native.c:282   )   debug: cib_native_signoff: 	Signing out of the CIB Service
Jan 15 15:38:53 [31076] bl460g1n6 crm_resource: (      ipcc.c:378   )   debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Jan 15 15:38:53 [31076] bl460g1n6 crm_resource: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-request-30790-31076-14-header
Jan 15 15:38:53 [30790] bl460g1n6        cib: (      ipcs.c:757   )   debug: qb_ipcs_dispatch_connection_request: 	HUP conn (30790-31076-14)
Jan 15 15:38:53 [30790] bl460g1n6        cib: (      ipcs.c:605   )   debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(30790-31076-14) state:2
Jan 15 15:38:53 [31076] bl460g1n6 crm_resource: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-response-30790-31076-14-header
Jan 15 15:38:53 [31076] bl460g1n6 crm_resource: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-event-30790-31076-14-header
Jan 15 15:38:53 [30790] bl460g1n6        cib: (       ipc.c:368   )    info: crm_client_destroy: 	Destroying 0 events
Jan 15 15:38:53 [31076] bl460g1n6 crm_resource: (       xml.c:2719  )    info: crm_xml_cleanup: 	Cleaning up memory from libxml2
Jan 15 15:38:53 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-30790-31076-14-header
Jan 15 15:38:53 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-30790-31076-14-header
Jan 15 15:38:53 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-30790-31076-14-header
Jan 15 15:38:53 [30790] bl460g1n6        cib: (        io.c:738   )    info: write_cib_contents: 	Archived previous version as /var/lib/pacemaker/cib/cib-4.raw
Jan 15 15:38:53 [30790] bl460g1n6        cib: (        io.c:748   )   debug: write_cib_contents: 	Writing CIB to disk
Jan 15 15:38:53 [30790] bl460g1n6        cib: (       ipc.c:334   )    info: crm_client_new: 	Connecting 0x19816b0 for uid=0 gid=0 pid=31083 id=b4c9bb55-fa30-481f-8902-e87e0aae1b9c
Jan 15 15:38:53 [30790] bl460g1n6        cib: ( ipc_setup.c:484   )   debug: handle_new_connection: 	IPC credentials authenticated (30790-31083-14)
Jan 15 15:38:53 [30790] bl460g1n6        cib: (   ipc_shm.c:295   )   debug: qb_ipcs_shm_connect: 	connecting to client [31083]
Jan 15 15:38:53 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:53 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:53 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:53 [31083] bl460g1n6 crm_resource: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:53 [31083] bl460g1n6 crm_resource: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:53 [31083] bl460g1n6 crm_resource: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:53 [31083] bl460g1n6 crm_resource: (cib_native.c:268   )   debug: cib_native_signon_raw: 	Connection to CIB successful
Jan 15 15:38:53 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crm_resource/2, version=0.8.1)
Jan 15 15:38:53 [31083] bl460g1n6 crm_resource: (    unpack.c:93    )   debug: unpack_config: 	STONITH timeout: 60000
Jan 15 15:38:53 [31083] bl460g1n6 crm_resource: (    unpack.c:97    )   debug: unpack_config: 	STONITH of failed nodes is disabled
Jan 15 15:38:53 [31083] bl460g1n6 crm_resource: (    unpack.c:104   )   debug: unpack_config: 	Stop all active resources: false
Jan 15 15:38:53 [31083] bl460g1n6 crm_resource: (    unpack.c:108   )   debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Jan 15 15:38:53 [31083] bl460g1n6 crm_resource: (    unpack.c:113   )   debug: unpack_config: 	Default stickiness: 0
Jan 15 15:38:53 [31083] bl460g1n6 crm_resource: (    unpack.c:156   )  notice: unpack_config: 	On loss of CCM Quorum: Ignore
Jan 15 15:38:53 [31083] bl460g1n6 crm_resource: (    unpack.c:196   )   debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Jan 15 15:38:53 [31083] bl460g1n6 crm_resource: (    unpack.c:413   ) warning: handle_startup_fencing: 	Blind faith: not fencing unseen nodes
Jan 15 15:38:53 [31083] bl460g1n6 crm_resource: (    unpack.c:504   )   debug: unpack_domains: 	Unpacking domains
Jan 15 15:38:53 [31083] bl460g1n6 crm_resource: (    unpack.c:1331  )    info: determine_online_status: 	Node bl460g1n6 is online
Jan 15 15:38:53 [31083] bl460g1n6 crm_resource: (    unpack.c:1331  )    info: determine_online_status: 	Node bl460g1n7 is online
Jan 15 15:38:53 [31083] bl460g1n6 crm_resource: (    unpack.c:1553  )   debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n6 to prmPing:0
Jan 15 15:38:53 [31083] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmPing_start_0/start (call_id=11, status=0, rc=0) on bl460g1n6 (role=Unknown)
Jan 15 15:38:53 [31083] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:38:53 [31083] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmPing:0/start completed on bl460g1n6
Jan 15 15:38:53 [31083] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmPing:0 after start: role=Started
Jan 15 15:38:53 [31083] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmVM2_monitor_0/monitor (call_id=5, status=0, rc=7) on bl460g1n6 (role=Unknown)
Jan 15 15:38:53 [31083] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:38:53 [31083] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmVM2/monitor completed on bl460g1n6
Jan 15 15:38:53 [31083] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmVM2 after monitor: role=Stopped
Jan 15 15:38:53 [31083] bl460g1n6 crm_resource: (    unpack.c:1553  )   debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n7 to prmPing:1
Jan 15 15:38:53 [31083] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmPing_start_0/start (call_id=11, status=0, rc=0) on bl460g1n7 (role=Unknown)
Jan 15 15:38:53 [31083] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:38:53 [31083] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmPing:1/start completed on bl460g1n7
Jan 15 15:38:53 [31083] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmPing:1 after start: role=Started
Jan 15 15:38:53 [31083] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmVM2_monitor_0/monitor (call_id=5, status=0, rc=7) on bl460g1n7 (role=Unknown)
Jan 15 15:38:53 [31083] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:38:53 [31083] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmVM2/monitor completed on bl460g1n7
Jan 15 15:38:53 [31083] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmVM2 after monitor: role=Stopped
Jan 15 15:38:53 [31083] bl460g1n6 crm_resource: (crm_resource:382   )   debug: dump_resource_attr: 	Looking up hv_memory in prmVM2
Jan 15 15:38:53 [31083] bl460g1n6 crm_resource: (cib_native.c:282   )   debug: cib_native_signoff: 	Signing out of the CIB Service
Jan 15 15:38:53 [31083] bl460g1n6 crm_resource: (      ipcc.c:378   )   debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Jan 15 15:38:53 [30790] bl460g1n6        cib: (      ipcs.c:757   )   debug: qb_ipcs_dispatch_connection_request: 	HUP conn (30790-31083-14)
Jan 15 15:38:53 [31083] bl460g1n6 crm_resource: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-request-30790-31083-14-header
Jan 15 15:38:53 [30790] bl460g1n6        cib: (      ipcs.c:605   )   debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(30790-31083-14) state:2
Jan 15 15:38:53 [31083] bl460g1n6 crm_resource: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-response-30790-31083-14-header
Jan 15 15:38:53 [31083] bl460g1n6 crm_resource: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-event-30790-31083-14-header
Jan 15 15:38:53 [30790] bl460g1n6        cib: (       ipc.c:368   )    info: crm_client_destroy: 	Destroying 0 events
Jan 15 15:38:53 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-30790-31083-14-header
Jan 15 15:38:53 [31083] bl460g1n6 crm_resource: (crm_resource:2228  ) warning: main: 	Error performing operation: No such device or address
Jan 15 15:38:53 [31083] bl460g1n6 crm_resource: (       xml.c:2719  )    info: crm_xml_cleanup: 	Cleaning up memory from libxml2
Jan 15 15:38:53 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-30790-31083-14-header
Jan 15 15:38:53 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-30790-31083-14-header
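The "No such device or address" warning above is expected rather than a cluster failure: this crm_resource run (pid 31083) only read the hv_memory utilization attribute (dump_resource_attr), and that nvpair is not yet present in the freshly written 0.8.1 CIB, so the lookup returns rc=-6 (ENXIO). Assuming the same option set as above, such a read would look roughly like:

    # assumed read; fails with "No such device or address" until the attribute is written
    crm_resource --resource prmVM2 --utilization --get-parameter hv_memory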
Jan 15 15:38:53 [30790] bl460g1n6        cib: (        io.c:773   )    info: write_cib_contents: 	Wrote version 0.8.0 of the CIB to disk (digest: caa5d622b396eb492d1acba1234836ed)
Jan 15 15:38:53 [30790] bl460g1n6        cib: (       ipc.c:334   )    info: crm_client_new: 	Connecting 0x19816b0 for uid=0 gid=0 pid=31085 id=545fd781-bbb9-4b4e-887e-b016a2cd6c74
Jan 15 15:38:53 [30790] bl460g1n6        cib: ( ipc_setup.c:484   )   debug: handle_new_connection: 	IPC credentials authenticated (30790-31085-14)
Jan 15 15:38:53 [30790] bl460g1n6        cib: (   ipc_shm.c:295   )   debug: qb_ipcs_shm_connect: 	connecting to client [31085]
Jan 15 15:38:53 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:53 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:53 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:53 [31085] bl460g1n6 crm_resource: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:53 [31085] bl460g1n6 crm_resource: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:53 [31085] bl460g1n6 crm_resource: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:53 [31085] bl460g1n6 crm_resource: (cib_native.c:268   )   debug: cib_native_signon_raw: 	Connection to CIB successful
Jan 15 15:38:53 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crm_resource/2, version=0.8.1)
Jan 15 15:38:53 [31085] bl460g1n6 crm_resource: (    unpack.c:93    )   debug: unpack_config: 	STONITH timeout: 60000
Jan 15 15:38:53 [31085] bl460g1n6 crm_resource: (    unpack.c:97    )   debug: unpack_config: 	STONITH of failed nodes is disabled
Jan 15 15:38:53 [31085] bl460g1n6 crm_resource: (    unpack.c:104   )   debug: unpack_config: 	Stop all active resources: false
Jan 15 15:38:53 [31085] bl460g1n6 crm_resource: (    unpack.c:108   )   debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Jan 15 15:38:53 [31085] bl460g1n6 crm_resource: (    unpack.c:113   )   debug: unpack_config: 	Default stickiness: 0
Jan 15 15:38:53 [31085] bl460g1n6 crm_resource: (    unpack.c:156   )  notice: unpack_config: 	On loss of CCM Quorum: Ignore
Jan 15 15:38:53 [31085] bl460g1n6 crm_resource: (    unpack.c:196   )   debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Jan 15 15:38:53 [31085] bl460g1n6 crm_resource: (    unpack.c:413   ) warning: handle_startup_fencing: 	Blind faith: not fencing unseen nodes
Jan 15 15:38:53 [31085] bl460g1n6 crm_resource: (    unpack.c:504   )   debug: unpack_domains: 	Unpacking domains
Jan 15 15:38:53 [31085] bl460g1n6 crm_resource: (    unpack.c:1331  )    info: determine_online_status: 	Node bl460g1n6 is online
Jan 15 15:38:53 [31085] bl460g1n6 crm_resource: (    unpack.c:1331  )    info: determine_online_status: 	Node bl460g1n7 is online
Jan 15 15:38:53 [31085] bl460g1n6 crm_resource: (    unpack.c:1553  )   debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n6 to prmPing:0
Jan 15 15:38:53 [31085] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmPing_start_0/start (call_id=11, status=0, rc=0) on bl460g1n6 (role=Unknown)
Jan 15 15:38:53 [31085] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:38:53 [31085] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmPing:0/start completed on bl460g1n6
Jan 15 15:38:53 [31085] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmPing:0 after start: role=Started
Jan 15 15:38:53 [31085] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmVM2_monitor_0/monitor (call_id=5, status=0, rc=7) on bl460g1n6 (role=Unknown)
Jan 15 15:38:53 [31085] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:38:53 [31085] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmVM2/monitor completed on bl460g1n6
Jan 15 15:38:53 [31085] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmVM2 after monitor: role=Stopped
Jan 15 15:38:53 [31085] bl460g1n6 crm_resource: (    unpack.c:1553  )   debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n7 to prmPing:1
Jan 15 15:38:53 [31085] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmPing_start_0/start (call_id=11, status=0, rc=0) on bl460g1n7 (role=Unknown)
Jan 15 15:38:53 [31085] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:38:53 [31085] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmPing:1/start completed on bl460g1n7
Jan 15 15:38:53 [31085] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmPing:1 after start: role=Started
Jan 15 15:38:53 [31085] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmVM2_monitor_0/monitor (call_id=5, status=0, rc=7) on bl460g1n7 (role=Unknown)
Jan 15 15:38:53 [31085] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:38:53 [31085] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmVM2/monitor completed on bl460g1n7
Jan 15 15:38:53 [31085] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmVM2 after monitor: role=Stopped
Jan 15 15:38:53 [30790] bl460g1n6        cib: (   cib_ops.c:905   )   debug: cib_process_xpath: 	cib_query: //cib/configuration/resources//*[@id="prmVM2"]/utilization//nvpair[@name="hv_memory"] does not exist
Jan 15 15:38:53 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section //cib/configuration/resources//*[@id="prmVM2"]/utilization//nvpair[@name="hv_memory"]: No such device or address (rc=-6, origin=local/crm_resource/3, version=0.8.1)
Jan 15 15:38:53 [30790] bl460g1n6        cib: (   cib_ops.c:923   )   debug: cib_process_xpath: 	Processing cib_query op for /cib (/cib)
Jan 15 15:38:53 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section /cib: OK (rc=0, origin=local/crm_resource/4, version=0.8.1)
Jan 15 15:38:53 [31085] bl460g1n6 crm_resource: (crm_resource:561   )   debug: set_resource_attr: 	Update   <primitive id="prmVM2">
Jan 15 15:38:53 [31085] bl460g1n6 crm_resource: (crm_resource:561   )   debug: set_resource_attr: 	Update     <utilization id="prmVM2-utilization">
Jan 15 15:38:53 [31085] bl460g1n6 crm_resource: (crm_resource:561   )   debug: set_resource_attr: 	Update       <nvpair id="prmVM2-utilization-hv_memory" name="hv_memory" value="2048"/>
Jan 15 15:38:53 [31085] bl460g1n6 crm_resource: (crm_resource:561   )   debug: set_resource_attr: 	Update     </utilization>
Jan 15 15:38:53 [31085] bl460g1n6 crm_resource: (crm_resource:561   )   debug: set_resource_attr: 	Update   </primitive>
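A third crm_resource run (pid 31085) now writes the second utilization attribute, hv_memory=2048, using the same query-then-modify pattern as the cpu update. The assumed invocation, mirroring the earlier one:

    # assumed invocation (not recorded in the log)
    crm_resource --resource prmVM2 --utilization --set-parameter hv_memory --parameter-value 2048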
Jan 15 15:38:53 [30790] bl460g1n6        cib: (        io.c:781   )   debug: write_cib_contents: 	Wrote digest caa5d622b396eb492d1acba1234836ed to disk
Jan 15 15:38:53 [30790] bl460g1n6        cib: (        io.c:259   )    info: retrieveCib: 	Reading cluster configuration from: /var/lib/pacemaker/cib/cib.56U7aR (digest: /var/lib/pacemaker/cib/cib.xY9hko)
Jan 15 15:38:53 [30790] bl460g1n6        cib: (        io.c:596   )   debug: activateCibXml: 	Triggering CIB write for cib_modify op
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (te_callbacks:122   )   debug: te_update_diff: 	Processing diff (cib_modify): 0.8.1 -> 0.9.1 (S_TRANSITION_ENGINE)
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (  te_utils.c:419   )    info: abort_transition_graph: 	te_update_diff:126 - Triggered transition abort (complete=0, node=, tag=diff, id=(null), magic=NA, cib=0.9.1) : Non-status change
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause   <diff crm_feature_set="3.0.8" digest="58bcfbb96735089e77d68ed9262a1728">
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause     <diff-removed admin_epoch="0" epoch="8" num_updates="1">
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause       <cib admin_epoch="0" epoch="8" num_updates="1"/>
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause     </diff-removed>
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause     <diff-added>
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause       <cib epoch="9" num_updates="1" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.8" cib-last-written="Wed Jan 15 15:38:53 2014" update-origin="bl460g1n6" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause         <configuration>
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause           <resources>
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause             <primitive id="prmVM2" class="ocf" provider="heartbeat" type="VirtualDomain">
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause               <utilization id="prmVM2-utilization">
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause                 <nvpair id="prmVM2-utilization-hv_memory" name="hv_memory" value="2048" __crm_diff_marker__="added:top"/>
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause               </utilization>
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause             </primitive>
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause           </resources>
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause         </configuration>
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause       </cib>
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause     </diff-added>
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause   </diff>
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (     graph.c:336   )   debug: run_graph: 	Transition 3 (Complete=0, Pending=3, Fired=0, Skipped=1, Incomplete=0, Source=/var/lib/pacemaker/pengine/pe-input-3.bz2): In-progress
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:174   )   debug: log_cib_diff: 	Config update: Local-only Change: 0.9.1
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-- <cib admin_epoch="0" epoch="8" num_updates="1"/>
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  <cib epoch="9" num_updates="1" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.8" cib-last-written="Wed Jan 15 15:38:53 2014" update-origin="bl460g1n6" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    <configuration>
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      <resources>
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        <primitive id="prmVM2" class="ocf" provider="heartbeat" type="VirtualDomain">
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+          <utilization id="prmVM2-utilization">
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++           <nvpair id="prmVM2-utilization-hv_memory" name="hv_memory" value="2048"/>
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+          </utilization>
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        </primitive>
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      </resources>
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    </configuration>
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  </cib>
Jan 15 15:38:53 [30790] bl460g1n6        cib: ( cib_utils.c:174   )  notice: log_cib_diff: 	cib:diff: Local-only Change: 0.9.1
Jan 15 15:38:53 [30790] bl460g1n6        cib: (       xml.c:1496  )  notice: cib:diff: 	-- <cib admin_epoch="0" epoch="8" num_updates="1"/>
Jan 15 15:38:53 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++           <nvpair id="prmVM2-utilization-hv_memory" name="hv_memory" value="2048"/>
Jan 15 15:38:53 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section resources: OK (rc=0, origin=local/crm_resource/5, version=0.9.1)
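After this cib_modify completes (CIB 0.9.1), the prmVM2 primitive carries both utilization values. Reassembled from the two diffs logged above, its utilization block in the configuration now reads:

    <utilization id="prmVM2-utilization">
      <nvpair id="prmVM2-utilization-cpu" name="cpu" value="1"/>
      <nvpair id="prmVM2-utilization-hv_memory" name="hv_memory" value="2048"/>
    </utilization>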
Jan 15 15:38:53 [31085] bl460g1n6 crm_resource: (cib_native.c:282   )   debug: cib_native_signoff: 	Signing out of the CIB Service
Jan 15 15:38:53 [31085] bl460g1n6 crm_resource: (      ipcc.c:378   )   debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Jan 15 15:38:53 [30790] bl460g1n6        cib: (      ipcs.c:757   )   debug: qb_ipcs_dispatch_connection_request: 	HUP conn (30790-31085-14)
Jan 15 15:38:53 [31085] bl460g1n6 crm_resource: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-request-30790-31085-14-header
Jan 15 15:38:53 [30790] bl460g1n6        cib: (      ipcs.c:605   )   debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(30790-31085-14) state:2
Jan 15 15:38:53 [31085] bl460g1n6 crm_resource: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-response-30790-31085-14-header
Jan 15 15:38:53 [31085] bl460g1n6 crm_resource: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-event-30790-31085-14-header
Jan 15 15:38:53 [30790] bl460g1n6        cib: (       ipc.c:368   )    info: crm_client_destroy: 	Destroying 0 events
Jan 15 15:38:53 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-30790-31085-14-header
Jan 15 15:38:53 [31085] bl460g1n6 crm_resource: (       xml.c:2719  )    info: crm_xml_cleanup: 	Cleaning up memory from libxml2
Jan 15 15:38:53 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-30790-31085-14-header
Jan 15 15:38:53 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-30790-31085-14-header
Jan 15 15:38:53 [30790] bl460g1n6        cib: (        io.c:786   )   debug: write_cib_contents: 	Activating /var/lib/pacemaker/cib/cib.56U7aR
Jan 15 15:38:53 [30792] bl460g1n6       lrmd: (services_lin:301   )   debug: operation_finished: 	prmVM2_start_0:30915 - exited with rc=0
Jan 15 15:38:53 [30792] bl460g1n6       lrmd: (services_lin:306   )   debug: operation_finished: 	prmVM2_start_0:30915:stderr [ -- empty -- ]
Jan 15 15:38:53 [30792] bl460g1n6       lrmd: (services_lin:310   )   debug: operation_finished: 	prmVM2_start_0:30915:stdout [ Domain vm2 created from /migrate_test/config/vm2.xml ]
Jan 15 15:38:53 [30792] bl460g1n6       lrmd: (      lrmd.c:104   )    info: log_finished: 	finished - rsc:prmVM2 action:start call_id:12 pid:30915 exit-code:0 exec-time:1030ms queue-time:0ms
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (     utils.c:2126  )   debug: create_operation_update: 	do_update_resource: Updating resource prmVM2 after start op complete (interval=0)
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (       lrm.c:2101  )  notice: process_lrm_event: 	LRM operation prmVM2_start_0 (call=12, rc=0, cib-update=60, confirmed=true) ok
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (       lrm.c:2116  )   debug: process_lrm_event: 	bl460g1n6-prmVM2_start_0:12 [ Domain vm2 created from /migrate_test/config/vm2.xml\n ]
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (       lrm.c:122   )   debug: update_history_cache: 	Updating history for 'prmVM2' with start op
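Interleaved with the attribute updates, lrmd reports that the prmVM2 start operation (call_id 12) finished with rc=0 after roughly one second, and crmd records it against transition 3. The stdout line "Domain vm2 created from /migrate_test/config/vm2.xml" is the usual output of the libvirt call made by the ocf:heartbeat:VirtualDomain agent when starting a non-persistent guest, i.e. approximately:

    # assumed underlying call by the VirtualDomain resource agent (not logged directly)
    virsh create /migrate_test/config/vm2.xml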
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (te_callbacks:122   )   debug: te_update_diff: 	Processing diff (cib_modify): 0.9.1 -> 0.9.2 (S_TRANSITION_ENGINE)
Jan 15 15:38:53 [30795] bl460g1n6       crmd: ( te_events.c:375   )    info: match_graph_event: 	Action prmVM2_start_0 (5) confirmed on bl460g1n6 (rc=0)
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (     graph.c:336   )   debug: run_graph: 	Transition 3 (Complete=1, Pending=2, Fired=0, Skipped=1, Incomplete=0, Source=/var/lib/pacemaker/pengine/pe-input-3.bz2): In-progress
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:167   )   debug: Config update: 	Diff: --- 0.9.1
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:169   )   debug: Config update: 	Diff: +++ 0.9.2 33203ca529bcd1ee280b1ffc3ed152d0
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-  <cib num_updates="1">
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-    <status>
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-      <node_state id="3232261592">
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-        <lrm id="3232261592">
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-          <lrm_resources>
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-            <lrm_resource id="prmVM2">
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	--             <lrm_rsc_op operation_key="prmVM2_monitor_0" operation="monitor" transition-key="4:2:7:be72ea63-75a9-4de4-a591-e716f960743b" transition-magic="0:7;4:2:7:be72ea63-75a9-4de4-a591-e716f960743b" call-id="5" rc-code="7" last-run="1389767928" last-rc-change="1389767928" exec-time="144" id="prmVM2_last_0"/>
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-            </lrm_resource>
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-          </lrm_resources>
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-        </lrm>
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-      </node_state>
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-    </status>
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-  </cib>
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  <cib epoch="9" num_updates="2" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.8" cib-last-written="Wed Jan 15 15:38:53 2014" update-origin="bl460g1n6" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    <status>
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      <node_state id="3232261592" uname="bl460g1n6" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        <lrm id="3232261592">
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+          <lrm_resources>
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+            <lrm_resource id="prmVM2" type="VirtualDomain" class="ocf" provider="heartbeat">
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++             <lrm_rsc_op id="prmVM2_last_0" operation_key="prmVM2_start_0" operation="start" crm-debug-origin="do_update_resource" crm_feature_set="3.0.8" transition-key="5:3:0:be72ea63-75a9-4de4-a591-e716f960743b" transition-magic="0:0;5:3:0:be72ea63-75a9-4de4-a591-e716f960743b" call-id="12" rc-code="0" op-status="0" interval="0" last-run="1389767932" last-rc-change="1389767932" exec-time="1030" queue-time="0" o
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+            </lrm_resource>
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+          </lrm_resources>
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        </lrm>
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      </node_state>
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    </status>
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  </cib>
Jan 15 15:38:53 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/crmd/60, version=0.9.2)
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (te_callbacks:122   )   debug: te_update_diff: 	Processing diff (cib_modify): 0.9.2 -> 0.9.3 (S_TRANSITION_ENGINE)
Jan 15 15:38:53 [30795] bl460g1n6       crmd: ( te_events.c:375   )    info: match_graph_event: 	Action prmPing_monitor_10000 (12) confirmed on bl460g1n7 (rc=0)
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (     graph.c:336   )   debug: run_graph: 	Transition 3 (Complete=2, Pending=1, Fired=0, Skipped=1, Incomplete=0, Source=/var/lib/pacemaker/pengine/pe-input-3.bz2): In-progress
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:167   )   debug: Config update: 	Diff: --- 0.9.2
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:169   )   debug: Config update: 	Diff: +++ 0.9.3 e7d6882121ad935dceafc4d17f2262e3
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-- <cib num_updates="2"/>
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  <cib epoch="9" num_updates="3" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.8" cib-last-written="Wed Jan 15 15:38:53 2014" update-origin="bl460g1n6" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    <status>
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      <node_state id="3232261593" uname="bl460g1n7" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        <lrm id="3232261593">
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+          <lrm_resources>
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+            <lrm_resource id="prmPing" type="ping" class="ocf" provider="pacemaker">
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++             <lrm_rsc_op id="prmPing_monitor_10000" operation_key="prmPing_monitor_10000" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.8" transition-key="12:3:0:be72ea63-75a9-4de4-a591-e716f960743b" transition-magic="0:0;12:3:0:be72ea63-75a9-4de4-a591-e716f960743b" call-id="12" rc-code="0" op-status="0" interval="10000" last-rc-change="1389767932" exec-time="1033" queue-time="0" 
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+            </lrm_resource>
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+          </lrm_resources>
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        </lrm>
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      </node_state>
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    </status>
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  </cib>
Jan 15 15:38:53 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=bl460g1n7/crmd/16, version=0.9.3)
Jan 15 15:38:53 [31088] bl460g1n6 attrd_updater: (     utils.c:1995  )    info: attrd_update_delegate: 	Connecting to cluster... 5 retries remaining
Jan 15 15:38:53 [30793] bl460g1n6      attrd: (       ipc.c:334   )    info: crm_client_new: 	Connecting 0x1754d60 for uid=0 gid=0 pid=31088 id=452d916e-1d5e-476c-8c74-5efe538560a7
Jan 15 15:38:53 [30793] bl460g1n6      attrd: ( ipc_setup.c:484   )   debug: handle_new_connection: 	IPC credentials authenticated (30793-31088-10)
Jan 15 15:38:53 [30793] bl460g1n6      attrd: (   ipc_shm.c:295   )   debug: qb_ipcs_shm_connect: 	connecting to client [31088]
Jan 15 15:38:53 [30793] bl460g1n6      attrd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Jan 15 15:38:53 [30793] bl460g1n6      attrd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Jan 15 15:38:53 [30793] bl460g1n6      attrd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Jan 15 15:38:53 [31088] bl460g1n6 attrd_updater: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Jan 15 15:38:53 [31088] bl460g1n6 attrd_updater: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Jan 15 15:38:53 [31088] bl460g1n6 attrd_updater: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Jan 15 15:38:53 [31088] bl460g1n6 attrd_updater: (     utils.c:2023  )   debug: attrd_update_delegate: 	Sent update: default_ping_set=100 for localhost
Jan 15 15:38:53 [30793] bl460g1n6      attrd: (  commands.c:244   )    info: attrd_client_message: 	Broadcasting default_ping_set[bl460g1n6] = 100 (writer)
Jan 15 15:38:53 [31088] bl460g1n6 attrd_updater: (       xml.c:2719  )    info: crm_xml_cleanup: 	Cleaning up memory from libxml2
Jan 15 15:38:53 [30793] bl460g1n6      attrd: (      ipcs.c:757   )   debug: qb_ipcs_dispatch_connection_request: 	HUP conn (30793-31088-10)
Jan 15 15:38:53 [30793] bl460g1n6      attrd: (      ipcs.c:605   )   debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(30793-31088-10) state:2
Jan 15 15:38:53 [30793] bl460g1n6      attrd: (       ipc.c:368   )    info: crm_client_destroy: 	Destroying 0 events
Jan 15 15:38:53 [30793] bl460g1n6      attrd: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-attrd-response-30793-31088-10-header
Jan 15 15:38:53 [30793] bl460g1n6      attrd: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-attrd-event-30793-31088-10-header
Jan 15 15:38:53 [30793] bl460g1n6      attrd: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-attrd-request-30793-31088-10-header
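Here a short-lived attrd_updater client (pid 31088) pushes default_ping_set=100 for the local node, and attrd, acting as writer, broadcasts the value to the cluster. Since the prmPing (ocf:pacemaker:ping) monitor on this node completes immediately afterwards, this is presumably the ping agent publishing its connectivity score; a stand-alone equivalent would be roughly:

    # assumed equivalent of what the ping agent runs during its monitor
    attrd_updater --name default_ping_set --update 100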
Jan 15 15:38:53 [30792] bl460g1n6       lrmd: (services_lin:301   )   debug: operation_finished: 	prmPing_monitor_10000:30916 - exited with rc=0
Jan 15 15:38:53 [30792] bl460g1n6       lrmd: (services_lin:306   )   debug: operation_finished: 	prmPing_monitor_10000:30916:stderr [ -- empty -- ]
Jan 15 15:38:53 [30792] bl460g1n6       lrmd: (services_lin:310   )   debug: operation_finished: 	prmPing_monitor_10000:30916:stdout [ -- empty -- ]
Jan 15 15:38:53 [30792] bl460g1n6       lrmd: (      lrmd.c:104   )   debug: log_finished: 	finished - rsc:prmPing action:monitor call_id:13 pid:30916 exit-code:0 exec-time:1067ms queue-time:0ms
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (     utils.c:2126  )   debug: create_operation_update: 	do_update_resource: Updating resource prmPing after monitor op complete (interval=10000)
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (       lrm.c:2101  )  notice: process_lrm_event: 	LRM operation prmPing_monitor_10000 (call=13, rc=0, cib-update=61, confirmed=false) ok
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (       lrm.c:122   )   debug: update_history_cache: 	Updating history for 'prmPing' with monitor op
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (te_callbacks:122   )   debug: te_update_diff: 	Processing diff (cib_modify): 0.9.3 -> 0.9.4 (S_TRANSITION_ENGINE)
Jan 15 15:38:53 [30795] bl460g1n6       crmd: ( te_events.c:375   )    info: match_graph_event: 	Action prmPing_monitor_10000 (9) confirmed on bl460g1n6 (rc=0)
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (     graph.c:336   )  notice: run_graph: 	Transition 3 (Complete=3, Pending=0, Fired=0, Skipped=1, Incomplete=0, Source=/var/lib/pacemaker/pengine/pe-input-3.bz2): Stopped
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (  te_utils.c:355   )   debug: te_graph_trigger: 	Transition 3 is now complete
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (te_actions.c:654   )   debug: notify_crmd: 	Processing transition completion in state S_TRANSITION_ENGINE
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (     utils.c:192   )   debug: crm_timer_start: 	Started New Transition Timer (I_PE_CALC:2000ms), src=84
Jan 15 15:38:53 [30795] bl460g1n6       crmd: (te_actions.c:699   )   debug: notify_crmd: 	Transition 3 status: restart - Non-status change
Jan 15 15:38:53 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/crmd/61, version=0.9.4)
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:167   )   debug: Config update: 	Diff: --- 0.9.3
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:169   )   debug: Config update: 	Diff: +++ 0.9.4 d98f1c2e25685160f1f97fb496ba8638
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-- <cib num_updates="3"/>
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  <cib epoch="9" num_updates="4" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.8" cib-last-written="Wed Jan 15 15:38:53 2014" update-origin="bl460g1n6" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    <status>
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      <node_state id="3232261592" uname="bl460g1n6" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        <lrm id="3232261592">
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+          <lrm_resources>
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+            <lrm_resource id="prmPing" type="ping" class="ocf" provider="pacemaker">
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++             <lrm_rsc_op id="prmPing_monitor_10000" operation_key="prmPing_monitor_10000" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.8" transition-key="9:3:0:be72ea63-75a9-4de4-a591-e716f960743b" transition-magic="0:0;9:3:0:be72ea63-75a9-4de4-a591-e716f960743b" call-id="13" rc-code="0" op-status="0" interval="10000" last-rc-change="1389767932" exec-time="1067" queue-time="0" op
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+            </lrm_resource>
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+          </lrm_resources>
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        </lrm>
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      </node_state>
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    </status>
Jan 15 15:38:53 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  </cib>
Jan 15 15:38:53 [30790] bl460g1n6        cib: (        io.c:738   )    info: write_cib_contents: 	Archived previous version as /var/lib/pacemaker/cib/cib-5.raw
Jan 15 15:38:53 [30790] bl460g1n6        cib: (        io.c:748   )   debug: write_cib_contents: 	Writing CIB to disk
Jan 15 15:38:53 [30790] bl460g1n6        cib: (        io.c:773   )    info: write_cib_contents: 	Wrote version 0.9.0 of the CIB to disk (digest: aca48e4d4ed7a4f0d998bcd77e84d6e9)
Jan 15 15:38:53 [30790] bl460g1n6        cib: (        io.c:781   )   debug: write_cib_contents: 	Wrote digest aca48e4d4ed7a4f0d998bcd77e84d6e9 to disk
Jan 15 15:38:53 [30790] bl460g1n6        cib: (        io.c:259   )    info: retrieveCib: 	Reading cluster configuration from: /var/lib/pacemaker/cib/cib.PQhCb1 (digest: /var/lib/pacemaker/cib/cib.hNGoGy)
Jan 15 15:38:53 [30790] bl460g1n6        cib: (        io.c:786   )   debug: write_cib_contents: 	Activating /var/lib/pacemaker/cib/cib.PQhCb1
Jan 15 15:38:55 [30795] bl460g1n6       crmd: (     utils.c:120   )    info: crm_timer_popped: 	New Transition Timer (I_PE_CALC) just popped (2000ms)
Jan 15 15:38:55 [30795] bl460g1n6       crmd: (       fsa.c:193   )   debug: s_crmd_fsa: 	Processing I_PE_CALC: [ state=S_TRANSITION_ENGINE cause=C_TIMER_POPPED origin=crm_timer_popped ]
Jan 15 15:38:55 [30795] bl460g1n6       crmd: (       fsa.c:502   )    info: do_state_transition: 	State transition S_TRANSITION_ENGINE -> S_POLICY_ENGINE [ input=I_PE_CALC cause=C_TIMER_POPPED origin=crm_timer_popped ]
Jan 15 15:38:55 [30795] bl460g1n6       crmd: (       fsa.c:599   )    info: do_state_transition: 	Progressed to state S_POLICY_ENGINE after C_TIMER_POPPED
Jan 15 15:38:55 [30795] bl460g1n6       crmd: (       fsa.c:610   )   debug: do_state_transition: 	All 2 cluster nodes are eligible to run resources.
Jan 15 15:38:55 [30795] bl460g1n6       crmd: (   pengine.c:231   )   debug: do_pe_invoke: 	Query 62: Requesting the current CIB: S_POLICY_ENGINE
Jan 15 15:38:55 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crmd/62, version=0.9.4)
Jan 15 15:38:55 [30795] bl460g1n6       crmd: (   pengine.c:299   )   debug: do_pe_invoke_callback: 	Invoking the PE: query=62, ref=pe_calc-dc-1389767935-42, seq=8, quorate=1
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (    unpack.c:93    )   debug: unpack_config: 	STONITH timeout: 60000
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (    unpack.c:97    )   debug: unpack_config: 	STONITH of failed nodes is disabled
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (    unpack.c:104   )   debug: unpack_config: 	Stop all active resources: false
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (    unpack.c:108   )   debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (    unpack.c:113   )   debug: unpack_config: 	Default stickiness: 0
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (    unpack.c:156   )  notice: unpack_config: 	On loss of CCM Quorum: Ignore
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (    unpack.c:196   )   debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (    unpack.c:504   )   debug: unpack_domains: 	Unpacking domains
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (    unpack.c:1331  )    info: determine_online_status: 	Node bl460g1n6 is online
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (    unpack.c:1331  )    info: determine_online_status: 	Node bl460g1n7 is online
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (    unpack.c:1553  )   debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n6 to prmPing:0
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmPing_start_0/start (call_id=11, status=0, rc=0) on bl460g1n6 (role=Unknown)
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmPing:0/start completed on bl460g1n6
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmPing:0 after start: role=Started
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmPing_monitor_10000/monitor (call_id=13, status=0, rc=0) on bl460g1n6 (role=Started)
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmPing:0/monitor completed on bl460g1n6
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmPing:0 after monitor: role=Started
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmVM2_start_0/start (call_id=12, status=0, rc=0) on bl460g1n6 (role=Unknown)
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmVM2/start completed on bl460g1n6
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmVM2 after start: role=Started
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (    unpack.c:1553  )   debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n7 to prmPing:1
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmPing_start_0/start (call_id=11, status=0, rc=0) on bl460g1n7 (role=Unknown)
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmPing:1/start completed on bl460g1n7
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmPing:1 after start: role=Started
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmPing_monitor_10000/monitor (call_id=12, status=0, rc=0) on bl460g1n7 (role=Started)
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmPing:1/monitor completed on bl460g1n7
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmPing:1 after monitor: role=Started
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmVM2_monitor_0/monitor (call_id=5, status=0, rc=7) on bl460g1n7 (role=Unknown)
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmVM2/monitor completed on bl460g1n7
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmVM2 after monitor: role=Stopped
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (    native.c:446   )    info: native_print: 	prmVM2	(ocf::heartbeat:VirtualDomain):	Started bl460g1n6 
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (     clone.c:417   )    info: clone_print: 	 Clone Set: clnPing [prmPing]
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (    native.c:272   )   debug: native_active: 	Resource prmPing:0 active on bl460g1n6
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (    native.c:272   )   debug: native_active: 	Resource prmPing:0 active on bl460g1n6
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (    native.c:272   )   debug: native_active: 	Resource prmPing:1 active on bl460g1n7
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (    native.c:272   )   debug: native_active: 	Resource prmPing:1 active on bl460g1n7
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (     clone.c:311   )    info: short_print: 	     Started: [ bl460g1n6 bl460g1n7 ]
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (  allocate.c:593   )   debug: common_apply_stickiness: 	Resource prmVM2: preferring current location (node=bl460g1n6, weight=1000000)
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (  allocate.c:593   )   debug: common_apply_stickiness: 	Resource prmPing:0: preferring current location (node=bl460g1n6, weight=1000000)
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (  allocate.c:593   )   debug: common_apply_stickiness: 	Resource prmPing:1: preferring current location (node=bl460g1n7, weight=1000000)
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (     utils.c:386   )   debug: native_assign_node: 	Assigning bl460g1n6 to prmPing:0
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (     utils.c:386   )   debug: native_assign_node: 	Assigning bl460g1n7 to prmPing:1
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (     clone.c:625   )   debug: clone_color: 	Allocated 2 clnPing instances of a possible 2
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (     utils.c:386   )   debug: native_assign_node: 	Assigning bl460g1n6 to prmVM2
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (    native.c:790   )    info: RecurringOp: 	 Start recurring monitor (10s) for prmVM2 on bl460g1n6
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (    native.c:2141  )    info: LogActions: 	Leave   prmVM2	(Started bl460g1n6)
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (    native.c:2141  )    info: LogActions: 	Leave   prmPing:0	(Started bl460g1n6)
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (    native.c:2141  )    info: LogActions: 	Leave   prmPing:1	(Started bl460g1n7)
Jan 15 15:38:55 [30794] bl460g1n6    pengine: (   pengine.c:178   )  notice: process_pe_message: 	Calculated Transition 4: /var/lib/pacemaker/pengine/pe-input-4.bz2
Jan 15 15:38:55 [30795] bl460g1n6       crmd: (       fsa.c:193   )   debug: s_crmd_fsa: 	Processing I_PE_SUCCESS: [ state=S_POLICY_ENGINE cause=C_IPC_MESSAGE origin=handle_response ]
Jan 15 15:38:55 [30795] bl460g1n6       crmd: (       fsa.c:502   )    info: do_state_transition: 	State transition S_POLICY_ENGINE -> S_TRANSITION_ENGINE [ input=I_PE_SUCCESS cause=C_IPC_MESSAGE origin=handle_response ]
Jan 15 15:38:55 [30795] bl460g1n6       crmd: (    unpack.c:230   )   debug: unpack_graph: 	Unpacked transition 4: 1 actions in 1 synapses
Jan 15 15:38:55 [30795] bl460g1n6       crmd: (   tengine.c:208   )    info: do_te_invoke: 	Processing graph 4 (ref=pe_calc-dc-1389767935-42) derived from /var/lib/pacemaker/pengine/pe-input-4.bz2
Jan 15 15:38:55 [30795] bl460g1n6       crmd: (te_actions.c:416   )  notice: te_rsc_command: 	Initiating action 9: monitor prmVM2_monitor_10000 on bl460g1n6 (local)
Jan 15 15:38:55 [30795] bl460g1n6       crmd: (       lrm.c:1784  )    info: do_lrm_rsc_op: 	Performing key=9:4:0:be72ea63-75a9-4de4-a591-e716f960743b op=prmVM2_monitor_10000
Jan 15 15:38:55 [30792] bl460g1n6       lrmd: (      lrmd.c:1313  )   debug: process_lrmd_message: 	Processed lrmd_rsc_exec operation from ea951299-8a4d-4fd6-8900-d6588e07ac38: rc=14, reply=1, notify=0, exit=4201864
Jan 15 15:38:55 [30792] bl460g1n6       lrmd: (      lrmd.c:122   )   debug: log_execute: 	executing - rsc:prmVM2 action:monitor call_id:14
Jan 15 15:38:55 [30795] bl460g1n6       crmd: (     graph.c:336   )   debug: run_graph: 	Transition 4 (Complete=0, Pending=1, Fired=1, Skipped=0, Incomplete=0, Source=/var/lib/pacemaker/pengine/pe-input-4.bz2): In-progress
VirtualDomain(prmVM2)[31118]:	2014/01/15_15:38:55 DEBUG: Virtual domain vm2 is currently running.
Jan 15 15:38:55 [30790] bl460g1n6        cib: (       ipc.c:334   )    info: crm_client_new: 	Connecting 0x19820d0 for uid=0 gid=0 pid=31151 id=628e786e-208a-4c39-bf4b-746314fde2bf
Jan 15 15:38:55 [30790] bl460g1n6        cib: ( ipc_setup.c:484   )   debug: handle_new_connection: 	IPC credentials authenticated (30790-31151-14)
Jan 15 15:38:55 [30790] bl460g1n6        cib: (   ipc_shm.c:295   )   debug: qb_ipcs_shm_connect: 	connecting to client [31151]
Jan 15 15:38:55 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:55 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:55 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:55 [31151] bl460g1n6 crm_resource: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:55 [31151] bl460g1n6 crm_resource: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:55 [31151] bl460g1n6 crm_resource: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:55 [31151] bl460g1n6 crm_resource: (cib_native.c:268   )   debug: cib_native_signon_raw: 	Connection to CIB successful
Jan 15 15:38:55 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crm_resource/2, version=0.9.4)
Jan 15 15:38:55 [31151] bl460g1n6 crm_resource: (    unpack.c:93    )   debug: unpack_config: 	STONITH timeout: 60000
Jan 15 15:38:55 [31151] bl460g1n6 crm_resource: (    unpack.c:97    )   debug: unpack_config: 	STONITH of failed nodes is disabled
Jan 15 15:38:55 [31151] bl460g1n6 crm_resource: (    unpack.c:104   )   debug: unpack_config: 	Stop all active resources: false
Jan 15 15:38:55 [31151] bl460g1n6 crm_resource: (    unpack.c:108   )   debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Jan 15 15:38:55 [31151] bl460g1n6 crm_resource: (    unpack.c:113   )   debug: unpack_config: 	Default stickiness: 0
Jan 15 15:38:55 [31151] bl460g1n6 crm_resource: (    unpack.c:156   )  notice: unpack_config: 	On loss of CCM Quorum: Ignore
Jan 15 15:38:55 [31151] bl460g1n6 crm_resource: (    unpack.c:196   )   debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Jan 15 15:38:55 [31151] bl460g1n6 crm_resource: (    unpack.c:413   ) warning: handle_startup_fencing: 	Blind faith: not fencing unseen nodes
Jan 15 15:38:55 [31151] bl460g1n6 crm_resource: (    unpack.c:504   )   debug: unpack_domains: 	Unpacking domains
Jan 15 15:38:55 [31151] bl460g1n6 crm_resource: (    unpack.c:1331  )    info: determine_online_status: 	Node bl460g1n6 is online
Jan 15 15:38:55 [31151] bl460g1n6 crm_resource: (    unpack.c:1331  )    info: determine_online_status: 	Node bl460g1n7 is online
Jan 15 15:38:55 [31151] bl460g1n6 crm_resource: (    unpack.c:1553  )   debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n6 to prmPing:0
Jan 15 15:38:55 [31151] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmPing_start_0/start (call_id=11, status=0, rc=0) on bl460g1n6 (role=Unknown)
Jan 15 15:38:55 [31151] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:38:55 [31151] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmPing:0/start completed on bl460g1n6
Jan 15 15:38:55 [31151] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmPing:0 after start: role=Started
Jan 15 15:38:55 [31151] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmPing_monitor_10000/monitor (call_id=13, status=0, rc=0) on bl460g1n6 (role=Started)
Jan 15 15:38:55 [31151] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:38:55 [31151] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmPing:0/monitor completed on bl460g1n6
Jan 15 15:38:55 [31151] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmPing:0 after monitor: role=Started
Jan 15 15:38:55 [31151] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmVM2_start_0/start (call_id=12, status=0, rc=0) on bl460g1n6 (role=Unknown)
Jan 15 15:38:55 [31151] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:38:55 [31151] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmVM2/start completed on bl460g1n6
Jan 15 15:38:55 [31151] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmVM2 after start: role=Started
Jan 15 15:38:55 [31151] bl460g1n6 crm_resource: (    unpack.c:1553  )   debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n7 to prmPing:1
Jan 15 15:38:55 [31151] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmPing_start_0/start (call_id=11, status=0, rc=0) on bl460g1n7 (role=Unknown)
Jan 15 15:38:55 [31151] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:38:55 [31151] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmPing:1/start completed on bl460g1n7
Jan 15 15:38:55 [31151] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmPing:1 after start: role=Started
Jan 15 15:38:55 [31151] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmPing_monitor_10000/monitor (call_id=12, status=0, rc=0) on bl460g1n7 (role=Started)
Jan 15 15:38:55 [31151] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:38:55 [31151] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmPing:1/monitor completed on bl460g1n7
Jan 15 15:38:55 [31151] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmPing:1 after monitor: role=Started
Jan 15 15:38:55 [31151] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmVM2_monitor_0/monitor (call_id=5, status=0, rc=7) on bl460g1n7 (role=Unknown)
Jan 15 15:38:55 [31151] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:38:55 [31151] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmVM2/monitor completed on bl460g1n7
Jan 15 15:38:55 [31151] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmVM2 after monitor: role=Stopped
Jan 15 15:38:55 [31151] bl460g1n6 crm_resource: (crm_resource:382   )   debug: dump_resource_attr: 	Looking up cpu in prmVM2
Jan 15 15:38:55 [31151] bl460g1n6 crm_resource: (cib_native.c:282   )   debug: cib_native_signoff: 	Signing out of the CIB Service
Jan 15 15:38:55 [31151] bl460g1n6 crm_resource: (      ipcc.c:378   )   debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Jan 15 15:38:55 [30790] bl460g1n6        cib: (      ipcs.c:757   )   debug: qb_ipcs_dispatch_connection_request: 	HUP conn (30790-31151-14)
Jan 15 15:38:55 [31151] bl460g1n6 crm_resource: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-request-30790-31151-14-header
Jan 15 15:38:55 [30790] bl460g1n6        cib: (      ipcs.c:605   )   debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(30790-31151-14) state:2
Jan 15 15:38:55 [31151] bl460g1n6 crm_resource: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-response-30790-31151-14-header
Jan 15 15:38:55 [31151] bl460g1n6 crm_resource: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-event-30790-31151-14-header
Jan 15 15:38:55 [30790] bl460g1n6        cib: (       ipc.c:368   )    info: crm_client_destroy: 	Destroying 0 events
Jan 15 15:38:55 [31151] bl460g1n6 crm_resource: (       xml.c:2719  )    info: crm_xml_cleanup: 	Cleaning up memory from libxml2
Jan 15 15:38:55 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-30790-31151-14-header
Jan 15 15:38:55 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-30790-31151-14-header
Jan 15 15:38:55 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-30790-31151-14-header
Jan 15 15:38:55 [30790] bl460g1n6        cib: (       ipc.c:334   )    info: crm_client_new: 	Connecting 0x19820d0 for uid=0 gid=0 pid=31157 id=4d34e30c-838b-4fd1-b838-44e010549376
Jan 15 15:38:55 [30790] bl460g1n6        cib: ( ipc_setup.c:484   )   debug: handle_new_connection: 	IPC credentials authenticated (30790-31157-14)
Jan 15 15:38:55 [30790] bl460g1n6        cib: (   ipc_shm.c:295   )   debug: qb_ipcs_shm_connect: 	connecting to client [31157]
Jan 15 15:38:55 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:55 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:55 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:55 [31157] bl460g1n6 crm_resource: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:55 [31157] bl460g1n6 crm_resource: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:55 [31157] bl460g1n6 crm_resource: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:38:55 [31157] bl460g1n6 crm_resource: (cib_native.c:268   )   debug: cib_native_signon_raw: 	Connection to CIB successful
Jan 15 15:38:55 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crm_resource/2, version=0.9.4)
Jan 15 15:38:55 [31157] bl460g1n6 crm_resource: (    unpack.c:93    )   debug: unpack_config: 	STONITH timeout: 60000
Jan 15 15:38:55 [31157] bl460g1n6 crm_resource: (    unpack.c:97    )   debug: unpack_config: 	STONITH of failed nodes is disabled
Jan 15 15:38:55 [31157] bl460g1n6 crm_resource: (    unpack.c:104   )   debug: unpack_config: 	Stop all active resources: false
Jan 15 15:38:55 [31157] bl460g1n6 crm_resource: (    unpack.c:108   )   debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Jan 15 15:38:55 [31157] bl460g1n6 crm_resource: (    unpack.c:113   )   debug: unpack_config: 	Default stickiness: 0
Jan 15 15:38:55 [31157] bl460g1n6 crm_resource: (    unpack.c:156   )  notice: unpack_config: 	On loss of CCM Quorum: Ignore
Jan 15 15:38:55 [31157] bl460g1n6 crm_resource: (    unpack.c:196   )   debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Jan 15 15:38:55 [31157] bl460g1n6 crm_resource: (    unpack.c:413   ) warning: handle_startup_fencing: 	Blind faith: not fencing unseen nodes
Jan 15 15:38:55 [31157] bl460g1n6 crm_resource: (    unpack.c:504   )   debug: unpack_domains: 	Unpacking domains
Jan 15 15:38:55 [31157] bl460g1n6 crm_resource: (    unpack.c:1331  )    info: determine_online_status: 	Node bl460g1n6 is online
Jan 15 15:38:55 [31157] bl460g1n6 crm_resource: (    unpack.c:1331  )    info: determine_online_status: 	Node bl460g1n7 is online
Jan 15 15:38:55 [31157] bl460g1n6 crm_resource: (    unpack.c:1553  )   debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n6 to prmPing:0
Jan 15 15:38:55 [31157] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmPing_start_0/start (call_id=11, status=0, rc=0) on bl460g1n6 (role=Unknown)
Jan 15 15:38:55 [31157] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:38:55 [31157] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmPing:0/start completed on bl460g1n6
Jan 15 15:38:55 [31157] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmPing:0 after start: role=Started
Jan 15 15:38:55 [31157] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmPing_monitor_10000/monitor (call_id=13, status=0, rc=0) on bl460g1n6 (role=Started)
Jan 15 15:38:55 [31157] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:38:55 [31157] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmPing:0/monitor completed on bl460g1n6
Jan 15 15:38:55 [31157] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmPing:0 after monitor: role=Started
Jan 15 15:38:55 [31157] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmVM2_start_0/start (call_id=12, status=0, rc=0) on bl460g1n6 (role=Unknown)
Jan 15 15:38:55 [31157] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:38:55 [31157] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmVM2/start completed on bl460g1n6
Jan 15 15:38:55 [31157] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmVM2 after start: role=Started
Jan 15 15:38:55 [31157] bl460g1n6 crm_resource: (    unpack.c:1553  )   debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n7 to prmPing:1
Jan 15 15:38:55 [31157] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmPing_start_0/start (call_id=11, status=0, rc=0) on bl460g1n7 (role=Unknown)
Jan 15 15:38:55 [31157] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:38:55 [31157] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmPing:1/start completed on bl460g1n7
Jan 15 15:38:55 [31157] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmPing:1 after start: role=Started
Jan 15 15:38:55 [31157] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmPing_monitor_10000/monitor (call_id=12, status=0, rc=0) on bl460g1n7 (role=Started)
Jan 15 15:38:55 [31157] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:38:55 [31157] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmPing:1/monitor completed on bl460g1n7
Jan 15 15:38:55 [31157] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmPing:1 after monitor: role=Started
Jan 15 15:38:55 [31157] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmVM2_monitor_0/monitor (call_id=5, status=0, rc=7) on bl460g1n7 (role=Unknown)
Jan 15 15:38:55 [31157] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:38:55 [31157] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmVM2/monitor completed on bl460g1n7
Jan 15 15:38:55 [31157] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmVM2 after monitor: role=Stopped
Jan 15 15:38:55 [31157] bl460g1n6 crm_resource: (crm_resource:382   )   debug: dump_resource_attr: 	Looking up hv_memory in prmVM2
Jan 15 15:38:55 [31157] bl460g1n6 crm_resource: (cib_native.c:282   )   debug: cib_native_signoff: 	Signing out of the CIB Service
Jan 15 15:38:55 [31157] bl460g1n6 crm_resource: (      ipcc.c:378   )   debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Jan 15 15:38:55 [30790] bl460g1n6        cib: (      ipcs.c:757   )   debug: qb_ipcs_dispatch_connection_request: 	HUP conn (30790-31157-14)
Jan 15 15:38:55 [31157] bl460g1n6 crm_resource: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-request-30790-31157-14-header
Jan 15 15:38:55 [30790] bl460g1n6        cib: (      ipcs.c:605   )   debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(30790-31157-14) state:2
Jan 15 15:38:55 [31157] bl460g1n6 crm_resource: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-response-30790-31157-14-header
Jan 15 15:38:55 [31157] bl460g1n6 crm_resource: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-event-30790-31157-14-header
Jan 15 15:38:55 [30790] bl460g1n6        cib: (       ipc.c:368   )    info: crm_client_destroy: 	Destroying 0 events
Jan 15 15:38:55 [31157] bl460g1n6 crm_resource: (       xml.c:2719  )    info: crm_xml_cleanup: 	Cleaning up memory from libxml2
Jan 15 15:38:55 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-30790-31157-14-header
Jan 15 15:38:55 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-30790-31157-14-header
Jan 15 15:38:55 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-30790-31157-14-header
Jan 15 15:38:55 [30792] bl460g1n6       lrmd: (services_lin:301   )   debug: operation_finished: 	prmVM2_monitor_10000:31118 - exited with rc=0
Jan 15 15:38:55 [30792] bl460g1n6       lrmd: (services_lin:306   )   debug: operation_finished: 	prmVM2_monitor_10000:31118:stderr [ -- empty -- ]
Jan 15 15:38:55 [30792] bl460g1n6       lrmd: (services_lin:310   )   debug: operation_finished: 	prmVM2_monitor_10000:31118:stdout [ -- empty -- ]
Jan 15 15:38:55 [30792] bl460g1n6       lrmd: (      lrmd.c:104   )   debug: log_finished: 	finished - rsc:prmVM2 action:monitor call_id:14 pid:31118 exit-code:0 exec-time:180ms queue-time:0ms
Jan 15 15:38:55 [30795] bl460g1n6       crmd: (     utils.c:2126  )   debug: create_operation_update: 	do_update_resource: Updating resource prmVM2 after monitor op complete (interval=10000)
Jan 15 15:38:55 [30795] bl460g1n6       crmd: (       lrm.c:2101  )  notice: process_lrm_event: 	LRM operation prmVM2_monitor_10000 (call=14, rc=0, cib-update=63, confirmed=false) ok
Jan 15 15:38:55 [30795] bl460g1n6       crmd: (       lrm.c:122   )   debug: update_history_cache: 	Updating history for 'prmVM2' with monitor op
Jan 15 15:38:55 [30795] bl460g1n6       crmd: (te_callbacks:122   )   debug: te_update_diff: 	Processing diff (cib_modify): 0.9.4 -> 0.9.5 (S_TRANSITION_ENGINE)
Jan 15 15:38:55 [30795] bl460g1n6       crmd: ( te_events.c:375   )    info: match_graph_event: 	Action prmVM2_monitor_10000 (9) confirmed on bl460g1n6 (rc=0)
Jan 15 15:38:55 [30795] bl460g1n6       crmd: (     graph.c:336   )  notice: run_graph: 	Transition 4 (Complete=1, Pending=0, Fired=0, Skipped=0, Incomplete=0, Source=/var/lib/pacemaker/pengine/pe-input-4.bz2): Complete
Jan 15 15:38:55 [30795] bl460g1n6       crmd: (  te_utils.c:355   )   debug: te_graph_trigger: 	Transition 4 is now complete
Jan 15 15:38:55 [30795] bl460g1n6       crmd: (te_actions.c:654   )   debug: notify_crmd: 	Processing transition completion in state S_TRANSITION_ENGINE
Jan 15 15:38:55 [30795] bl460g1n6       crmd: (te_actions.c:699   )   debug: notify_crmd: 	Transition 4 status: done - <null>
Jan 15 15:38:55 [30795] bl460g1n6       crmd: (       fsa.c:193   )   debug: s_crmd_fsa: 	Processing I_TE_SUCCESS: [ state=S_TRANSITION_ENGINE cause=C_FSA_INTERNAL origin=notify_crmd ]
Jan 15 15:38:55 [30795] bl460g1n6       crmd: (      misc.c:47    )    info: do_log: 	FSA: Input I_TE_SUCCESS from notify_crmd() received in state S_TRANSITION_ENGINE
Jan 15 15:38:55 [30795] bl460g1n6       crmd: (       fsa.c:502   )  notice: do_state_transition: 	State transition S_TRANSITION_ENGINE -> S_IDLE [ input=I_TE_SUCCESS cause=C_FSA_INTERNAL origin=notify_crmd ]
Jan 15 15:38:55 [30795] bl460g1n6       crmd: (       fsa.c:645   )   debug: do_state_transition: 	Starting PEngine Recheck Timer
Jan 15 15:38:55 [30795] bl460g1n6       crmd: (     utils.c:192   )   debug: crm_timer_start: 	Started PEngine Recheck Timer (I_PE_CALC:900000ms), src=88
Jan 15 15:38:55 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:167   )   debug: Config update: 	Diff: --- 0.9.4
Jan 15 15:38:55 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:169   )   debug: Config update: 	Diff: +++ 0.9.5 2443576706ba7aac8a646a787bca4760
Jan 15 15:38:55 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-- <cib num_updates="4"/>
Jan 15 15:38:55 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  <cib epoch="9" num_updates="5" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.8" cib-last-written="Wed Jan 15 15:38:53 2014" update-origin="bl460g1n6" update-client="crm_resource" have-quorum="1" dc-uuid="3232261592">
Jan 15 15:38:55 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    <status>
Jan 15 15:38:55 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      <node_state id="3232261592" uname="bl460g1n6" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Jan 15 15:38:55 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        <lrm id="3232261592">
Jan 15 15:38:55 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+          <lrm_resources>
Jan 15 15:38:55 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+            <lrm_resource id="prmVM2" type="VirtualDomain" class="ocf" provider="heartbeat">
Jan 15 15:38:55 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++             <lrm_rsc_op id="prmVM2_monitor_10000" operation_key="prmVM2_monitor_10000" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.8" transition-key="9:4:0:be72ea63-75a9-4de4-a591-e716f960743b" transition-magic="0:0;9:4:0:be72ea63-75a9-4de4-a591-e716f960743b" call-id="14" rc-code="0" op-status="0" interval="10000" last-rc-change="1389767935" exec-time="180" queue-time="0" op-di
Jan 15 15:38:55 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+            </lrm_resource>
Jan 15 15:38:55 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+          </lrm_resources>
Jan 15 15:38:55 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        </lrm>
Jan 15 15:38:55 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      </node_state>
Jan 15 15:38:55 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    </status>
Jan 15 15:38:55 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  </cib>
Jan 15 15:38:55 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/crmd/63, version=0.9.5)
Jan 15 15:39:03 [30792] bl460g1n6       lrmd: (services_lin:217   )   debug: recurring_action_timer: 	Scheduling another invocation of prmPing_monitor_10000
Jan 15 15:39:04 [31185] bl460g1n6 attrd_updater: (     utils.c:1995  )    info: attrd_update_delegate: 	Connecting to cluster... 5 retries remaining
Jan 15 15:39:04 [30793] bl460g1n6      attrd: (       ipc.c:334   )    info: crm_client_new: 	Connecting 0x1754d60 for uid=0 gid=0 pid=31185 id=022cc79f-7da2-4ebd-a6c8-88cd7f05de8d
Jan 15 15:39:04 [30793] bl460g1n6      attrd: ( ipc_setup.c:484   )   debug: handle_new_connection: 	IPC credentials authenticated (30793-31185-10)
Jan 15 15:39:04 [30793] bl460g1n6      attrd: (   ipc_shm.c:295   )   debug: qb_ipcs_shm_connect: 	connecting to client [31185]
Jan 15 15:39:04 [30793] bl460g1n6      attrd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Jan 15 15:39:04 [30793] bl460g1n6      attrd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Jan 15 15:39:04 [30793] bl460g1n6      attrd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Jan 15 15:39:04 [31185] bl460g1n6 attrd_updater: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Jan 15 15:39:04 [31185] bl460g1n6 attrd_updater: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Jan 15 15:39:04 [31185] bl460g1n6 attrd_updater: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Jan 15 15:39:04 [31185] bl460g1n6 attrd_updater: (     utils.c:2023  )   debug: attrd_update_delegate: 	Sent update: default_ping_set=100 for localhost
Jan 15 15:39:04 [30793] bl460g1n6      attrd: (  commands.c:244   )    info: attrd_client_message: 	Broadcasting default_ping_set[bl460g1n6] = 100 (writer)
Jan 15 15:39:04 [31185] bl460g1n6 attrd_updater: (       xml.c:2719  )    info: crm_xml_cleanup: 	Cleaning up memory from libxml2
Jan 15 15:39:04 [30793] bl460g1n6      attrd: (      ipcs.c:757   )   debug: qb_ipcs_dispatch_connection_request: 	HUP conn (30793-31185-10)
Jan 15 15:39:04 [30793] bl460g1n6      attrd: (      ipcs.c:605   )   debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(30793-31185-10) state:2
Jan 15 15:39:04 [30793] bl460g1n6      attrd: (       ipc.c:368   )    info: crm_client_destroy: 	Destroying 0 events
Jan 15 15:39:04 [30793] bl460g1n6      attrd: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-attrd-response-30793-31185-10-header
Jan 15 15:39:04 [30793] bl460g1n6      attrd: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-attrd-event-30793-31185-10-header
Jan 15 15:39:04 [30793] bl460g1n6      attrd: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-attrd-request-30793-31185-10-header
Jan 15 15:39:04 [30792] bl460g1n6       lrmd: (services_lin:301   )   debug: operation_finished: 	prmPing_monitor_10000:31168 - exited with rc=0
Jan 15 15:39:04 [30792] bl460g1n6       lrmd: (services_lin:306   )   debug: operation_finished: 	prmPing_monitor_10000:31168:stderr [ -- empty -- ]
Jan 15 15:39:04 [30792] bl460g1n6       lrmd: (services_lin:310   )   debug: operation_finished: 	prmPing_monitor_10000:31168:stdout [ -- empty -- ]
Jan 15 15:39:04 [30792] bl460g1n6       lrmd: (      lrmd.c:104   )   debug: log_finished: 	finished - rsc:prmPing action:monitor call_id:13 pid:31168 exit-code:0 exec-time:0ms queue-time:0ms
Jan 15 15:39:05 [30792] bl460g1n6       lrmd: (services_lin:217   )   debug: recurring_action_timer: 	Scheduling another invocation of prmVM2_monitor_10000
VirtualDomain(prmVM2)[31187]:	2014/01/15_15:39:05 DEBUG: Virtual domain vm2 is currently running.
Jan 15 15:39:05 [30790] bl460g1n6        cib: (       ipc.c:334   )    info: crm_client_new: 	Connecting 0x19816b0 for uid=0 gid=0 pid=31220 id=15b6d7a0-a588-4a39-b2d9-199fce76df86
Jan 15 15:39:05 [30790] bl460g1n6        cib: ( ipc_setup.c:484   )   debug: handle_new_connection: 	IPC credentials authenticated (30790-31220-14)
Jan 15 15:39:05 [30790] bl460g1n6        cib: (   ipc_shm.c:295   )   debug: qb_ipcs_shm_connect: 	connecting to client [31220]
Jan 15 15:39:05 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:39:05 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:39:05 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (cib_native.c:268   )   debug: cib_native_signon_raw: 	Connection to CIB successful
Jan 15 15:39:05 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crm_resource/2, version=0.9.5)
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (    unpack.c:93    )   debug: unpack_config: 	STONITH timeout: 60000
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (    unpack.c:97    )   debug: unpack_config: 	STONITH of failed nodes is disabled
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (    unpack.c:104   )   debug: unpack_config: 	Stop all active resources: false
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (    unpack.c:108   )   debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (    unpack.c:113   )   debug: unpack_config: 	Default stickiness: 0
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (    unpack.c:156   )  notice: unpack_config: 	On loss of CCM Quorum: Ignore
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (    unpack.c:196   )   debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (    unpack.c:413   ) warning: handle_startup_fencing: 	Blind faith: not fencing unseen nodes
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (    unpack.c:504   )   debug: unpack_domains: 	Unpacking domains
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (    unpack.c:1331  )    info: determine_online_status: 	Node bl460g1n6 is online
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (    unpack.c:1331  )    info: determine_online_status: 	Node bl460g1n7 is online
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (    unpack.c:1553  )   debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n6 to prmPing:0
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmPing_start_0/start (call_id=11, status=0, rc=0) on bl460g1n6 (role=Unknown)
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmPing:0/start completed on bl460g1n6
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmPing:0 after start: role=Started
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmPing_monitor_10000/monitor (call_id=13, status=0, rc=0) on bl460g1n6 (role=Started)
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmPing:0/monitor completed on bl460g1n6
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmPing:0 after monitor: role=Started
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmVM2_start_0/start (call_id=12, status=0, rc=0) on bl460g1n6 (role=Unknown)
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmVM2/start completed on bl460g1n6
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmVM2 after start: role=Started
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmVM2_monitor_10000/monitor (call_id=14, status=0, rc=0) on bl460g1n6 (role=Started)
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmVM2/monitor completed on bl460g1n6
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmVM2 after monitor: role=Started
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (    unpack.c:1553  )   debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n7 to prmPing:1
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmPing_start_0/start (call_id=11, status=0, rc=0) on bl460g1n7 (role=Unknown)
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmPing:1/start completed on bl460g1n7
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmPing:1 after start: role=Started
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmPing_monitor_10000/monitor (call_id=12, status=0, rc=0) on bl460g1n7 (role=Started)
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmPing:1/monitor completed on bl460g1n7
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmPing:1 after monitor: role=Started
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmVM2_monitor_0/monitor (call_id=5, status=0, rc=7) on bl460g1n7 (role=Unknown)
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmVM2/monitor completed on bl460g1n7
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmVM2 after monitor: role=Stopped
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (crm_resource:382   )   debug: dump_resource_attr: 	Looking up cpu in prmVM2
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (cib_native.c:282   )   debug: cib_native_signoff: 	Signing out of the CIB Service
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (      ipcc.c:378   )   debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Jan 15 15:39:05 [30790] bl460g1n6        cib: (      ipcs.c:757   )   debug: qb_ipcs_dispatch_connection_request: 	HUP conn (30790-31220-14)
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-request-30790-31220-14-header
Jan 15 15:39:05 [30790] bl460g1n6        cib: (      ipcs.c:605   )   debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(30790-31220-14) state:2
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-response-30790-31220-14-header
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-event-30790-31220-14-header
Jan 15 15:39:05 [31220] bl460g1n6 crm_resource: (       xml.c:2719  )    info: crm_xml_cleanup: 	Cleaning up memory from libxml2
Jan 15 15:39:05 [30790] bl460g1n6        cib: (       ipc.c:368   )    info: crm_client_destroy: 	Destroying 0 events
Jan 15 15:39:05 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-30790-31220-14-header
Jan 15 15:39:05 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-30790-31220-14-header
Jan 15 15:39:05 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-30790-31220-14-header
Jan 15 15:39:05 [30790] bl460g1n6        cib: (       ipc.c:334   )    info: crm_client_new: 	Connecting 0x19816b0 for uid=0 gid=0 pid=31226 id=e06bf24e-2bdd-42c0-863c-b4cc09fda7b1
Jan 15 15:39:05 [30790] bl460g1n6        cib: ( ipc_setup.c:484   )   debug: handle_new_connection: 	IPC credentials authenticated (30790-31226-14)
Jan 15 15:39:05 [30790] bl460g1n6        cib: (   ipc_shm.c:295   )   debug: qb_ipcs_shm_connect: 	connecting to client [31226]
Jan 15 15:39:05 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:39:05 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:39:05 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (cib_native.c:268   )   debug: cib_native_signon_raw: 	Connection to CIB successful
Jan 15 15:39:05 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crm_resource/2, version=0.9.5)
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (    unpack.c:93    )   debug: unpack_config: 	STONITH timeout: 60000
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (    unpack.c:97    )   debug: unpack_config: 	STONITH of failed nodes is disabled
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (    unpack.c:104   )   debug: unpack_config: 	Stop all active resources: false
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (    unpack.c:108   )   debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (    unpack.c:113   )   debug: unpack_config: 	Default stickiness: 0
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (    unpack.c:156   )  notice: unpack_config: 	On loss of CCM Quorum: Ignore
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (    unpack.c:196   )   debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (    unpack.c:413   ) warning: handle_startup_fencing: 	Blind faith: not fencing unseen nodes
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (    unpack.c:504   )   debug: unpack_domains: 	Unpacking domains
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (    unpack.c:1331  )    info: determine_online_status: 	Node bl460g1n6 is online
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (    unpack.c:1331  )    info: determine_online_status: 	Node bl460g1n7 is online
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (    unpack.c:1553  )   debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n6 to prmPing:0
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmPing_start_0/start (call_id=11, status=0, rc=0) on bl460g1n6 (role=Unknown)
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmPing:0/start completed on bl460g1n6
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmPing:0 after start: role=Started
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmPing_monitor_10000/monitor (call_id=13, status=0, rc=0) on bl460g1n6 (role=Started)
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmPing:0/monitor completed on bl460g1n6
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmPing:0 after monitor: role=Started
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmVM2_start_0/start (call_id=12, status=0, rc=0) on bl460g1n6 (role=Unknown)
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmVM2/start completed on bl460g1n6
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmVM2 after start: role=Started
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmVM2_monitor_10000/monitor (call_id=14, status=0, rc=0) on bl460g1n6 (role=Started)
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmVM2/monitor completed on bl460g1n6
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmVM2 after monitor: role=Started
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (    unpack.c:1553  )   debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n7 to prmPing:1
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmPing_start_0/start (call_id=11, status=0, rc=0) on bl460g1n7 (role=Unknown)
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmPing:1/start completed on bl460g1n7
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmPing:1 after start: role=Started
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmPing_monitor_10000/monitor (call_id=12, status=0, rc=0) on bl460g1n7 (role=Started)
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmPing:1/monitor completed on bl460g1n7
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmPing:1 after monitor: role=Started
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmVM2_monitor_0/monitor (call_id=5, status=0, rc=7) on bl460g1n7 (role=Unknown)
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmVM2/monitor completed on bl460g1n7
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmVM2 after monitor: role=Stopped
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (crm_resource:382   )   debug: dump_resource_attr: 	Looking up hv_memory in prmVM2
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (cib_native.c:282   )   debug: cib_native_signoff: 	Signing out of the CIB Service
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (      ipcc.c:378   )   debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Jan 15 15:39:05 [30790] bl460g1n6        cib: (      ipcs.c:757   )   debug: qb_ipcs_dispatch_connection_request: 	HUP conn (30790-31226-14)
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-request-30790-31226-14-header
Jan 15 15:39:05 [30790] bl460g1n6        cib: (      ipcs.c:605   )   debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(30790-31226-14) state:2
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-response-30790-31226-14-header
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-event-30790-31226-14-header
Jan 15 15:39:05 [30790] bl460g1n6        cib: (       ipc.c:368   )    info: crm_client_destroy: 	Destroying 0 events
Jan 15 15:39:05 [31226] bl460g1n6 crm_resource: (       xml.c:2719  )    info: crm_xml_cleanup: 	Cleaning up memory from libxml2
Jan 15 15:39:05 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-30790-31226-14-header
Jan 15 15:39:05 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-30790-31226-14-header
Jan 15 15:39:05 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-30790-31226-14-header
Jan 15 15:39:05 [30792] bl460g1n6       lrmd: (services_lin:301   )   debug: operation_finished: 	prmVM2_monitor_10000:31187 - exited with rc=0
Jan 15 15:39:05 [30792] bl460g1n6       lrmd: (services_lin:306   )   debug: operation_finished: 	prmVM2_monitor_10000:31187:stderr [ -- empty -- ]
Jan 15 15:39:05 [30792] bl460g1n6       lrmd: (services_lin:310   )   debug: operation_finished: 	prmVM2_monitor_10000:31187:stdout [ -- empty -- ]
Jan 15 15:39:05 [30792] bl460g1n6       lrmd: (      lrmd.c:104   )   debug: log_finished: 	finished - rsc:prmVM2 action:monitor call_id:14 pid:31187 exit-code:0 exec-time:0ms queue-time:0ms
Jan 15 15:39:09 [30795] bl460g1n6       crmd: (  throttle.c:260   )   debug: throttle_cib_load: 	cib load: 0.008667 (26 ticks in 30s)
Jan 15 15:39:09 [30795] bl460g1n6       crmd: (  throttle.c:302   )   debug: throttle_load_avg: 	Current load is 0.060000 (full: 0.06 0.03 0.00 1/408 31228)
Jan 15 15:39:09 [30795] bl460g1n6       crmd: (  throttle.c:382   )   debug: throttle_io_load: 	Current IO load is 0.000000
Jan 15 15:39:14 [30792] bl460g1n6       lrmd: (services_lin:217   )   debug: recurring_action_timer: 	Scheduling another invocation of prmPing_monitor_10000
Jan 15 15:39:15 [31245] bl460g1n6 attrd_updater: (     utils.c:1995  )    info: attrd_update_delegate: 	Connecting to cluster... 5 retries remaining
Jan 15 15:39:15 [30793] bl460g1n6      attrd: (       ipc.c:334   )    info: crm_client_new: 	Connecting 0x1754d60 for uid=0 gid=0 pid=31245 id=bdfb287f-b935-40e7-a009-9d8a68363dbb
Jan 15 15:39:15 [30793] bl460g1n6      attrd: ( ipc_setup.c:484   )   debug: handle_new_connection: 	IPC credentials authenticated (30793-31245-10)
Jan 15 15:39:15 [30793] bl460g1n6      attrd: (   ipc_shm.c:295   )   debug: qb_ipcs_shm_connect: 	connecting to client [31245]
Jan 15 15:39:15 [30793] bl460g1n6      attrd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Jan 15 15:39:15 [30793] bl460g1n6      attrd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Jan 15 15:39:15 [30793] bl460g1n6      attrd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Jan 15 15:39:15 [31245] bl460g1n6 attrd_updater: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Jan 15 15:39:15 [31245] bl460g1n6 attrd_updater: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Jan 15 15:39:15 [31245] bl460g1n6 attrd_updater: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Jan 15 15:39:15 [30793] bl460g1n6      attrd: (  commands.c:244   )    info: attrd_client_message: 	Broadcasting default_ping_set[bl460g1n6] = 100 (writer)
Jan 15 15:39:15 [31245] bl460g1n6 attrd_updater: (     utils.c:2023  )   debug: attrd_update_delegate: 	Sent update: default_ping_set=100 for localhost
Jan 15 15:39:15 [31245] bl460g1n6 attrd_updater: (       xml.c:2719  )    info: crm_xml_cleanup: 	Cleaning up memory from libxml2
Jan 15 15:39:15 [30793] bl460g1n6      attrd: (      ipcs.c:757   )   debug: qb_ipcs_dispatch_connection_request: 	HUP conn (30793-31245-10)
Jan 15 15:39:15 [30793] bl460g1n6      attrd: (      ipcs.c:605   )   debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(30793-31245-10) state:2
Jan 15 15:39:15 [30793] bl460g1n6      attrd: (       ipc.c:368   )    info: crm_client_destroy: 	Destroying 0 events
Jan 15 15:39:15 [30793] bl460g1n6      attrd: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-attrd-response-30793-31245-10-header
Jan 15 15:39:15 [30793] bl460g1n6      attrd: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-attrd-event-30793-31245-10-header
Jan 15 15:39:15 [30793] bl460g1n6      attrd: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-attrd-request-30793-31245-10-header
Jan 15 15:39:15 [30792] bl460g1n6       lrmd: (services_lin:301   )   debug: operation_finished: 	prmPing_monitor_10000:31229 - exited with rc=0
Jan 15 15:39:15 [30792] bl460g1n6       lrmd: (services_lin:306   )   debug: operation_finished: 	prmPing_monitor_10000:31229:stderr [ -- empty -- ]
Jan 15 15:39:15 [30792] bl460g1n6       lrmd: (services_lin:310   )   debug: operation_finished: 	prmPing_monitor_10000:31229:stdout [ -- empty -- ]
Jan 15 15:39:15 [30792] bl460g1n6       lrmd: (      lrmd.c:104   )   debug: log_finished: 	finished - rsc:prmPing action:monitor call_id:13 pid:31229 exit-code:0 exec-time:0ms queue-time:0ms
Jan 15 15:39:15 [30792] bl460g1n6       lrmd: (services_lin:217   )   debug: recurring_action_timer: 	Scheduling another invocation of prmVM2_monitor_10000
VirtualDomain(prmVM2)[31246]:	2014/01/15_15:39:15 DEBUG: Virtual domain vm2 is currently running.
Jan 15 15:39:15 [30790] bl460g1n6        cib: (       ipc.c:334   )    info: crm_client_new: 	Connecting 0x19816b0 for uid=0 gid=0 pid=31279 id=4ddf7869-0769-4285-ba4e-bbf9070ea0e7
Jan 15 15:39:15 [30790] bl460g1n6        cib: ( ipc_setup.c:484   )   debug: handle_new_connection: 	IPC credentials authenticated (30790-31279-14)
Jan 15 15:39:15 [30790] bl460g1n6        cib: (   ipc_shm.c:295   )   debug: qb_ipcs_shm_connect: 	connecting to client [31279]
Jan 15 15:39:15 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:39:15 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:39:15 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (cib_native.c:268   )   debug: cib_native_signon_raw: 	Connection to CIB successful
Jan 15 15:39:15 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crm_resource/2, version=0.9.5)
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (    unpack.c:93    )   debug: unpack_config: 	STONITH timeout: 60000
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (    unpack.c:97    )   debug: unpack_config: 	STONITH of failed nodes is disabled
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (    unpack.c:104   )   debug: unpack_config: 	Stop all active resources: false
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (    unpack.c:108   )   debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (    unpack.c:113   )   debug: unpack_config: 	Default stickiness: 0
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (    unpack.c:156   )  notice: unpack_config: 	On loss of CCM Quorum: Ignore
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (    unpack.c:196   )   debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (    unpack.c:413   ) warning: handle_startup_fencing: 	Blind faith: not fencing unseen nodes
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (    unpack.c:504   )   debug: unpack_domains: 	Unpacking domains
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (    unpack.c:1331  )    info: determine_online_status: 	Node bl460g1n6 is online
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (    unpack.c:1331  )    info: determine_online_status: 	Node bl460g1n7 is online
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (    unpack.c:1553  )   debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n6 to prmPing:0
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmPing_start_0/start (call_id=11, status=0, rc=0) on bl460g1n6 (role=Unknown)
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmPing:0/start completed on bl460g1n6
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmPing:0 after start: role=Started
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmPing_monitor_10000/monitor (call_id=13, status=0, rc=0) on bl460g1n6 (role=Started)
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmPing:0/monitor completed on bl460g1n6
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmPing:0 after monitor: role=Started
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmVM2_start_0/start (call_id=12, status=0, rc=0) on bl460g1n6 (role=Unknown)
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmVM2/start completed on bl460g1n6
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmVM2 after start: role=Started
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmVM2_monitor_10000/monitor (call_id=14, status=0, rc=0) on bl460g1n6 (role=Started)
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmVM2/monitor completed on bl460g1n6
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmVM2 after monitor: role=Started
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (    unpack.c:1553  )   debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n7 to prmPing:1
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmPing_start_0/start (call_id=11, status=0, rc=0) on bl460g1n7 (role=Unknown)
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmPing:1/start completed on bl460g1n7
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmPing:1 after start: role=Started
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmPing_monitor_10000/monitor (call_id=12, status=0, rc=0) on bl460g1n7 (role=Started)
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmPing:1/monitor completed on bl460g1n7
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmPing:1 after monitor: role=Started
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmVM2_monitor_0/monitor (call_id=5, status=0, rc=7) on bl460g1n7 (role=Unknown)
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmVM2/monitor completed on bl460g1n7
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmVM2 after monitor: role=Stopped
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (crm_resource:382   )   debug: dump_resource_attr: 	Looking up cpu in prmVM2
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (cib_native.c:282   )   debug: cib_native_signoff: 	Signing out of the CIB Service
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (      ipcc.c:378   )   debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Jan 15 15:39:15 [30790] bl460g1n6        cib: (      ipcs.c:757   )   debug: qb_ipcs_dispatch_connection_request: 	HUP conn (30790-31279-14)
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-request-30790-31279-14-header
Jan 15 15:39:15 [30790] bl460g1n6        cib: (      ipcs.c:605   )   debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(30790-31279-14) state:2
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-response-30790-31279-14-header
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-event-30790-31279-14-header
Jan 15 15:39:15 [30790] bl460g1n6        cib: (       ipc.c:368   )    info: crm_client_destroy: 	Destroying 0 events
Jan 15 15:39:15 [31279] bl460g1n6 crm_resource: (       xml.c:2719  )    info: crm_xml_cleanup: 	Cleaning up memory from libxml2
Jan 15 15:39:15 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-30790-31279-14-header
Jan 15 15:39:15 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-30790-31279-14-header
Jan 15 15:39:15 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-30790-31279-14-header
Jan 15 15:39:15 [30790] bl460g1n6        cib: (       ipc.c:334   )    info: crm_client_new: 	Connecting 0x19816b0 for uid=0 gid=0 pid=31285 id=43add664-b741-4bf6-880a-4d52b4d263be
Jan 15 15:39:15 [30790] bl460g1n6        cib: ( ipc_setup.c:484   )   debug: handle_new_connection: 	IPC credentials authenticated (30790-31285-14)
Jan 15 15:39:15 [30790] bl460g1n6        cib: (   ipc_shm.c:295   )   debug: qb_ipcs_shm_connect: 	connecting to client [31285]
Jan 15 15:39:15 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:39:15 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:39:15 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (cib_native.c:268   )   debug: cib_native_signon_raw: 	Connection to CIB successful
Jan 15 15:39:15 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crm_resource/2, version=0.9.5)
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (    unpack.c:93    )   debug: unpack_config: 	STONITH timeout: 60000
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (    unpack.c:97    )   debug: unpack_config: 	STONITH of failed nodes is disabled
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (    unpack.c:104   )   debug: unpack_config: 	Stop all active resources: false
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (    unpack.c:108   )   debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (    unpack.c:113   )   debug: unpack_config: 	Default stickiness: 0
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (    unpack.c:156   )  notice: unpack_config: 	On loss of CCM Quorum: Ignore
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (    unpack.c:196   )   debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (    unpack.c:413   ) warning: handle_startup_fencing: 	Blind faith: not fencing unseen nodes
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (    unpack.c:504   )   debug: unpack_domains: 	Unpacking domains
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (    unpack.c:1331  )    info: determine_online_status: 	Node bl460g1n6 is online
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (    unpack.c:1331  )    info: determine_online_status: 	Node bl460g1n7 is online
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (    unpack.c:1553  )   debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n6 to prmPing:0
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmPing_start_0/start (call_id=11, status=0, rc=0) on bl460g1n6 (role=Unknown)
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmPing:0/start completed on bl460g1n6
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmPing:0 after start: role=Started
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmPing_monitor_10000/monitor (call_id=13, status=0, rc=0) on bl460g1n6 (role=Started)
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmPing:0/monitor completed on bl460g1n6
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmPing:0 after monitor: role=Started
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmVM2_start_0/start (call_id=12, status=0, rc=0) on bl460g1n6 (role=Unknown)
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmVM2/start completed on bl460g1n6
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmVM2 after start: role=Started
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmVM2_monitor_10000/monitor (call_id=14, status=0, rc=0) on bl460g1n6 (role=Started)
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmVM2/monitor completed on bl460g1n6
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmVM2 after monitor: role=Started
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (    unpack.c:1553  )   debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n7 to prmPing:1
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmPing_start_0/start (call_id=11, status=0, rc=0) on bl460g1n7 (role=Unknown)
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmPing:1/start completed on bl460g1n7
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmPing:1 after start: role=Started
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmPing_monitor_10000/monitor (call_id=12, status=0, rc=0) on bl460g1n7 (role=Started)
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmPing:1/monitor completed on bl460g1n7
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmPing:1 after monitor: role=Started
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmVM2_monitor_0/monitor (call_id=5, status=0, rc=7) on bl460g1n7 (role=Unknown)
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmVM2/monitor completed on bl460g1n7
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmVM2 after monitor: role=Stopped
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (crm_resource:382   )   debug: dump_resource_attr: 	Looking up hv_memory in prmVM2
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (cib_native.c:282   )   debug: cib_native_signoff: 	Signing out of the CIB Service
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (      ipcc.c:378   )   debug: qb_ipcc_disconnect: 	qb_ipcc_disconnect()
Jan 15 15:39:15 [30790] bl460g1n6        cib: (      ipcs.c:757   )   debug: qb_ipcs_dispatch_connection_request: 	HUP conn (30790-31285-14)
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-request-30790-31285-14-header
Jan 15 15:39:15 [30790] bl460g1n6        cib: (      ipcs.c:605   )   debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(30790-31285-14) state:2
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-response-30790-31285-14-header
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (ringbuffer.c:302   )   debug: qb_rb_close: 	Closing ringbuffer: /dev/shm/qb-cib_rw-event-30790-31285-14-header
Jan 15 15:39:15 [30790] bl460g1n6        cib: (       ipc.c:368   )    info: crm_client_destroy: 	Destroying 0 events
Jan 15 15:39:15 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-30790-31285-14-header
Jan 15 15:39:15 [31285] bl460g1n6 crm_resource: (       xml.c:2719  )    info: crm_xml_cleanup: 	Cleaning up memory from libxml2
Jan 15 15:39:15 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-30790-31285-14-header
Jan 15 15:39:15 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-30790-31285-14-header
Jan 15 15:39:15 [30792] bl460g1n6       lrmd: (services_lin:301   )   debug: operation_finished: 	prmVM2_monitor_10000:31246 - exited with rc=0
Jan 15 15:39:15 [30792] bl460g1n6       lrmd: (services_lin:306   )   debug: operation_finished: 	prmVM2_monitor_10000:31246:stderr [ -- empty -- ]
Jan 15 15:39:15 [30792] bl460g1n6       lrmd: (services_lin:310   )   debug: operation_finished: 	prmVM2_monitor_10000:31246:stdout [ -- empty -- ]
Jan 15 15:39:15 [30792] bl460g1n6       lrmd: (      lrmd.c:104   )   debug: log_finished: 	finished - rsc:prmVM2 action:monitor call_id:14 pid:31246 exit-code:0 exec-time:0ms queue-time:0ms
Jan 15 15:39:20 [30790] bl460g1n6        cib: (       ipc.c:334   )    info: crm_client_new: 	Connecting 0x19816b0 for uid=0 gid=0 pid=31291 id=80ed5c61-5602-49d1-9840-0af08b3b3ed5
Jan 15 15:39:20 [30790] bl460g1n6        cib: ( ipc_setup.c:484   )   debug: handle_new_connection: 	IPC credentials authenticated (30790-31291-14)
Jan 15 15:39:20 [30790] bl460g1n6        cib: (   ipc_shm.c:295   )   debug: qb_ipcs_shm_connect: 	connecting to client [31291]
Jan 15 15:39:20 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:39:20 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:39:20 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:39:20 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section nodes: OK (rc=0, origin=local/cibadmin/2, version=0.9.5)
Jan 15 15:39:20 [30790] bl460g1n6        cib: (      ipcs.c:757   )   debug: qb_ipcs_dispatch_connection_request: 	HUP conn (30790-31291-14)
Jan 15 15:39:20 [30790] bl460g1n6        cib: (      ipcs.c:605   )   debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(30790-31291-14) state:2
Jan 15 15:39:20 [30790] bl460g1n6        cib: (       ipc.c:368   )    info: crm_client_destroy: 	Destroying 0 events
Jan 15 15:39:20 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-30790-31291-14-header
Jan 15 15:39:20 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-30790-31291-14-header
Jan 15 15:39:20 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-30790-31291-14-header
Jan 15 15:39:20 [30790] bl460g1n6        cib: (       ipc.c:334   )    info: crm_client_new: 	Connecting 0x19816b0 for uid=0 gid=0 pid=31292 id=8511aab6-7471-48ed-a5dd-f9e9b9338051
Jan 15 15:39:20 [30790] bl460g1n6        cib: ( ipc_setup.c:484   )   debug: handle_new_connection: 	IPC credentials authenticated (30790-31292-14)
Jan 15 15:39:20 [30790] bl460g1n6        cib: (   ipc_shm.c:295   )   debug: qb_ipcs_shm_connect: 	connecting to client [31292]
Jan 15 15:39:20 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:39:20 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:39:20 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:39:20 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section nodes: OK (rc=0, origin=local/crm_attribute/2, version=0.9.5)
Jan 15 15:39:20 [30790] bl460g1n6        cib: (   cib_ops.c:905   )   debug: cib_process_xpath: 	cib_query: //cib/configuration/nodes//node[@id='3232261592']//instance_attributes//nvpair[@name='standby'] does not exist
Jan 15 15:39:20 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section //cib/configuration/nodes//node[@id='3232261592']//instance_attributes//nvpair[@name='standby']: No such device or address (rc=-6, origin=local/crm_attribute/3, version=0.9.5)
Jan 15 15:39:20 [30790] bl460g1n6        cib: (        io.c:596   )   debug: activateCibXml: 	Triggering CIB write for cib_modify op
Jan 15 15:39:20 [30795] bl460g1n6       crmd: (te_callbacks:122   )   debug: te_update_diff: 	Processing diff (cib_modify): 0.9.5 -> 0.10.1 (S_IDLE)
Jan 15 15:39:20 [30795] bl460g1n6       crmd: (  te_utils.c:419   )    info: abort_transition_graph: 	te_update_diff:126 - Triggered transition abort (complete=1, node=, tag=diff, id=(null), magic=NA, cib=0.10.1) : Non-status change
Jan 15 15:39:20 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause   <diff crm_feature_set="3.0.8" digest="c07edf894987b1e8aae55344b6a7804a">
Jan 15 15:39:20 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause     <diff-removed admin_epoch="0" epoch="9" num_updates="5">
Jan 15 15:39:20 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause       <cib admin_epoch="0" epoch="9" num_updates="5"/>
Jan 15 15:39:20 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause     </diff-removed>
Jan 15 15:39:20 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause     <diff-added>
Jan 15 15:39:20 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause       <cib epoch="10" num_updates="1" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.8" cib-last-written="Wed Jan 15 15:39:20 2014" update-origin="bl460g1n6" update-client="crm_attribute" have-quorum="1" dc-uuid="3232261592">
Jan 15 15:39:20 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause         <configuration>
Jan 15 15:39:20 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause           <nodes>
Jan 15 15:39:20 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause             <node id="3232261592" uname="bl460g1n6">
Jan 15 15:39:20 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause               <instance_attributes id="nodes-3232261592" __crm_diff_marker__="added:top">
Jan 15 15:39:20 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause                 <nvpair id="nodes-3232261592-standby" name="standby" value="on"/>
Jan 15 15:39:20 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause               </instance_attributes>
Jan 15 15:39:20 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause             </node>
Jan 15 15:39:20 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause           </nodes>
Jan 15 15:39:20 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause         </configuration>
Jan 15 15:39:20 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause       </cib>
Jan 15 15:39:20 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause     </diff-added>
Jan 15 15:39:20 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause   </diff>
Jan 15 15:39:20 [30795] bl460g1n6       crmd: (     utils.c:192   )   debug: crm_timer_start: 	Started New Transition Timer (I_PE_CALC:2000ms), src=89
Jan 15 15:39:20 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:167   )   debug: Config update: 	Diff: --- 0.9.5
Jan 15 15:39:20 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:169   )   debug: Config update: 	Diff: +++ 0.10.1 c07edf894987b1e8aae55344b6a7804a
Jan 15 15:39:20 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-- <cib admin_epoch="0" epoch="9" num_updates="5"/>
Jan 15 15:39:20 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  <cib epoch="10" num_updates="1" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.8" cib-last-written="Wed Jan 15 15:39:20 2014" update-origin="bl460g1n6" update-client="crm_attribute" have-quorum="1" dc-uuid="3232261592">
Jan 15 15:39:20 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    <configuration>
Jan 15 15:39:20 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      <nodes>
Jan 15 15:39:20 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        <node id="3232261592" uname="bl460g1n6">
Jan 15 15:39:20 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++         <instance_attributes id="nodes-3232261592">
Jan 15 15:39:20 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++           <nvpair id="nodes-3232261592-standby" name="standby" value="on"/>
Jan 15 15:39:20 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++         </instance_attributes>
Jan 15 15:39:20 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        </node>
Jan 15 15:39:20 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      </nodes>
Jan 15 15:39:20 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    </configuration>
Jan 15 15:39:20 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  </cib>
Jan 15 15:39:20 [30790] bl460g1n6        cib: ( cib_utils.c:167   )  notice: cib:diff: 	Diff: --- 0.9.5
Jan 15 15:39:20 [30790] bl460g1n6        cib: ( cib_utils.c:169   )  notice: cib:diff: 	Diff: +++ 0.10.1 c07edf894987b1e8aae55344b6a7804a
Jan 15 15:39:20 [30790] bl460g1n6        cib: (       xml.c:1496  )  notice: cib:diff: 	-- <cib admin_epoch="0" epoch="9" num_updates="5"/>
Jan 15 15:39:20 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++         <instance_attributes id="nodes-3232261592">
Jan 15 15:39:20 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++           <nvpair id="nodes-3232261592-standby" name="standby" value="on"/>
Jan 15 15:39:20 [30790] bl460g1n6        cib: (       xml.c:1507  )  notice: cib:diff: 	++         </instance_attributes>
Jan 15 15:39:20 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section nodes: OK (rc=0, origin=local/crm_attribute/4, version=0.10.1)
Jan 15 15:39:20 [30790] bl460g1n6        cib: (      ipcs.c:757   )   debug: qb_ipcs_dispatch_connection_request: 	HUP conn (30790-31292-14)
Jan 15 15:39:20 [30790] bl460g1n6        cib: (      ipcs.c:605   )   debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(30790-31292-14) state:2
Jan 15 15:39:20 [30790] bl460g1n6        cib: (       ipc.c:368   )    info: crm_client_destroy: 	Destroying 0 events
Jan 15 15:39:20 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-30790-31292-14-header
Jan 15 15:39:20 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-30790-31292-14-header
Jan 15 15:39:20 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-30790-31292-14-header
Jan 15 15:39:20 [30790] bl460g1n6        cib: (        io.c:738   )    info: write_cib_contents: 	Archived previous version as /var/lib/pacemaker/cib/cib-6.raw
Jan 15 15:39:20 [30790] bl460g1n6        cib: (        io.c:748   )   debug: write_cib_contents: 	Writing CIB to disk
Jan 15 15:39:20 [30790] bl460g1n6        cib: (        io.c:773   )    info: write_cib_contents: 	Wrote version 0.10.0 of the CIB to disk (digest: c3a684b15ebe3cc70d3e0b780cde1564)
Jan 15 15:39:20 [30790] bl460g1n6        cib: (        io.c:781   )   debug: write_cib_contents: 	Wrote digest c3a684b15ebe3cc70d3e0b780cde1564 to disk
Jan 15 15:39:20 [30790] bl460g1n6        cib: (        io.c:259   )    info: retrieveCib: 	Reading cluster configuration from: /var/lib/pacemaker/cib/cib.5ZU1nr (digest: /var/lib/pacemaker/cib/cib.1qM46a)
Jan 15 15:39:20 [30790] bl460g1n6        cib: (        io.c:786   )   debug: write_cib_contents: 	Activating /var/lib/pacemaker/cib/cib.5ZU1nr
Jan 15 15:39:22 [30795] bl460g1n6       crmd: (     utils.c:120   )    info: crm_timer_popped: 	New Transition Timer (I_PE_CALC) just popped (2000ms)
Jan 15 15:39:22 [30795] bl460g1n6       crmd: (       fsa.c:193   )   debug: s_crmd_fsa: 	Processing I_PE_CALC: [ state=S_IDLE cause=C_TIMER_POPPED origin=crm_timer_popped ]
Jan 15 15:39:22 [30795] bl460g1n6       crmd: (       fsa.c:502   )  notice: do_state_transition: 	State transition S_IDLE -> S_POLICY_ENGINE [ input=I_PE_CALC cause=C_TIMER_POPPED origin=crm_timer_popped ]
Jan 15 15:39:22 [30795] bl460g1n6       crmd: (       fsa.c:599   )    info: do_state_transition: 	Progressed to state S_POLICY_ENGINE after C_TIMER_POPPED
Jan 15 15:39:22 [30795] bl460g1n6       crmd: (       fsa.c:610   )   debug: do_state_transition: 	All 2 cluster nodes are eligible to run resources.
Jan 15 15:39:22 [30795] bl460g1n6       crmd: (   pengine.c:231   )   debug: do_pe_invoke: 	Query 64: Requesting the current CIB: S_POLICY_ENGINE
Jan 15 15:39:22 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crmd/64, version=0.10.1)
Jan 15 15:39:22 [30795] bl460g1n6       crmd: (   pengine.c:299   )   debug: do_pe_invoke_callback: 	Invoking the PE: query=64, ref=pe_calc-dc-1389767962-44, seq=8, quorate=1
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    unpack.c:93    )   debug: unpack_config: 	STONITH timeout: 60000
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    unpack.c:97    )   debug: unpack_config: 	STONITH of failed nodes is disabled
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    unpack.c:104   )   debug: unpack_config: 	Stop all active resources: false
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    unpack.c:108   )   debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    unpack.c:113   )   debug: unpack_config: 	Default stickiness: 0
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    unpack.c:156   )  notice: unpack_config: 	On loss of CCM Quorum: Ignore
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    unpack.c:196   )   debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    unpack.c:504   )   debug: unpack_domains: 	Unpacking domains
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    unpack.c:969   )    info: unpack_status: 	Node bl460g1n6 is in standby-mode
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    unpack.c:1331  )    info: determine_online_status: 	Node bl460g1n6 is standby
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    unpack.c:1331  )    info: determine_online_status: 	Node bl460g1n7 is online
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    unpack.c:1553  )   debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n6 to prmPing:0
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmPing_start_0/start (call_id=11, status=0, rc=0) on bl460g1n6 (role=Unknown)
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmPing:0/start completed on bl460g1n6
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmPing:0 after start: role=Started
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmPing_monitor_10000/monitor (call_id=13, status=0, rc=0) on bl460g1n6 (role=Started)
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmPing:0/monitor completed on bl460g1n6
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmPing:0 after monitor: role=Started
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmVM2_start_0/start (call_id=12, status=0, rc=0) on bl460g1n6 (role=Unknown)
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmVM2/start completed on bl460g1n6
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmVM2 after start: role=Started
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmVM2_monitor_10000/monitor (call_id=14, status=0, rc=0) on bl460g1n6 (role=Started)
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmVM2/monitor completed on bl460g1n6
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmVM2 after monitor: role=Started
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    unpack.c:1553  )   debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n7 to prmPing:1
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmPing_start_0/start (call_id=11, status=0, rc=0) on bl460g1n7 (role=Unknown)
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmPing:1/start completed on bl460g1n7
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmPing:1 after start: role=Started
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmPing_monitor_10000/monitor (call_id=12, status=0, rc=0) on bl460g1n7 (role=Started)
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmPing:1/monitor completed on bl460g1n7
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmPing:1 after monitor: role=Started
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmVM2_monitor_0/monitor (call_id=5, status=0, rc=7) on bl460g1n7 (role=Unknown)
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmVM2/monitor completed on bl460g1n7
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmVM2 after monitor: role=Stopped
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    native.c:446   )    info: native_print: 	prmVM2	(ocf::heartbeat:VirtualDomain):	Started bl460g1n6 
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (     clone.c:417   )    info: clone_print: 	 Clone Set: clnPing [prmPing]
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    native.c:272   )   debug: native_active: 	Resource prmPing:0 active on bl460g1n6
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    native.c:272   )   debug: native_active: 	Resource prmPing:0 active on bl460g1n6
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    native.c:272   )   debug: native_active: 	Resource prmPing:1 active on bl460g1n7
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    native.c:272   )   debug: native_active: 	Resource prmPing:1 active on bl460g1n7
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (     clone.c:311   )    info: short_print: 	     Started: [ bl460g1n6 bl460g1n7 ]
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (  allocate.c:593   )   debug: common_apply_stickiness: 	Resource prmVM2: preferring current location (node=bl460g1n6, weight=1000000)
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (  allocate.c:593   )   debug: common_apply_stickiness: 	Resource prmPing:0: preferring current location (node=bl460g1n6, weight=1000000)
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (  allocate.c:593   )   debug: common_apply_stickiness: 	Resource prmPing:1: preferring current location (node=bl460g1n7, weight=1000000)
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (     utils.c:386   )   debug: native_assign_node: 	Assigning bl460g1n7 to prmPing:1
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (     utils.c:339   )   debug: native_assign_node: 	All nodes for resource prmPing:0 are unavailable, unclean or shutting down (bl460g1n6: 0, -1000000)
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (     utils.c:356   )   debug: native_assign_node: 	Could not allocate a node for prmPing:0
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    native.c:564   )    info: native_color: 	Resource prmPing:0 cannot run anywhere
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (     clone.c:625   )   debug: clone_color: 	Allocated 1 clnPing instances of a possible 2
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (     utils.c:386   )   debug: native_assign_node: 	Assigning bl460g1n7 to prmVM2
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    native.c:790   )    info: RecurringOp: 	 Start recurring monitor (10s) for prmVM2 on bl460g1n7
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    native.c:2133  )  notice: LogActions: 	Migrate prmVM2	(Started bl460g1n6 -> bl460g1n7)
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    native.c:2216  )  notice: LogActions: 	Stop    prmPing:0	(bl460g1n6)
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (    native.c:2141  )    info: LogActions: 	Leave   prmPing:1	(Started bl460g1n7)
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (     graph.c:769   )   debug: action2xml: 	Using anonymous clone name prmPing for prmPing:0 (aka. prmPing)
Jan 15 15:39:22 [30794] bl460g1n6    pengine: (   pengine.c:178   )  notice: process_pe_message: 	Calculated Transition 5: /var/lib/pacemaker/pengine/pe-input-5.bz2
Jan 15 15:39:22 [30795] bl460g1n6       crmd: (       fsa.c:193   )   debug: s_crmd_fsa: 	Processing I_PE_SUCCESS: [ state=S_POLICY_ENGINE cause=C_IPC_MESSAGE origin=handle_response ]
Jan 15 15:39:22 [30795] bl460g1n6       crmd: (       fsa.c:502   )    info: do_state_transition: 	State transition S_POLICY_ENGINE -> S_TRANSITION_ENGINE [ input=I_PE_SUCCESS cause=C_IPC_MESSAGE origin=handle_response ]
Jan 15 15:39:22 [30795] bl460g1n6       crmd: (    unpack.c:230   )   debug: unpack_graph: 	Unpacked transition 5: 9 actions in 9 synapses
Jan 15 15:39:22 [30795] bl460g1n6       crmd: (   tengine.c:208   )    info: do_te_invoke: 	Processing graph 5 (ref=pe_calc-dc-1389767962-44) derived from /var/lib/pacemaker/pengine/pe-input-5.bz2
Jan 15 15:39:22 [30795] bl460g1n6       crmd: (te_actions.c:416   )  notice: te_rsc_command: 	Initiating action 11: migrate_to prmVM2_migrate_to_0 on bl460g1n6 (local)
Jan 15 15:39:22 [30795] bl460g1n6       crmd: (       lrm.c:1721  )   debug: stop_recurring_action_by_rsc: 	Cancelling op 14 for prmVM2 (prmVM2:14)
Jan 15 15:39:22 [30795] bl460g1n6       crmd: (       lrm.c:1063  )   debug: cancel_op: 	Cancelling op 14 for prmVM2 (prmVM2:14)
Jan 15 15:39:22 [30792] bl460g1n6       lrmd: (  services.c:352   )    info: cancel_recurring_action: 	Cancelling operation prmVM2_monitor_10000
Jan 15 15:39:22 [30792] bl460g1n6       lrmd: (      lrmd.c:104   )   debug: log_finished: 	finished - rsc:prmVM2 action:monitor call_id:14  exit-code:0 exec-time:0ms queue-time:0ms
Jan 15 15:39:22 [30792] bl460g1n6       lrmd: (      lrmd.c:1313  )   debug: process_lrmd_message: 	Processed lrmd_rsc_cancel operation from ea951299-8a4d-4fd6-8900-d6588e07ac38: rc=0, reply=1, notify=0, exit=4201864
Jan 15 15:39:22 [30795] bl460g1n6       crmd: (       lrm.c:1068  )   debug: cancel_op: 	Op 14 for prmVM2 (prmVM2:14): cancelled
Jan 15 15:39:22 [30795] bl460g1n6       crmd: (       lrm.c:1780  )   debug: do_lrm_rsc_op: 	Stopped 0 recurring operations in preparation for prmVM2_migrate_to_0
Jan 15 15:39:22 [30795] bl460g1n6       crmd: (       lrm.c:1784  )    info: do_lrm_rsc_op: 	Performing key=11:5:0:be72ea63-75a9-4de4-a591-e716f960743b op=prmVM2_migrate_to_0
Jan 15 15:39:22 [30792] bl460g1n6       lrmd: (      lrmd.c:1313  )   debug: process_lrmd_message: 	Processed lrmd_rsc_exec operation from ea951299-8a4d-4fd6-8900-d6588e07ac38: rc=16, reply=1, notify=0, exit=4201864
Jan 15 15:39:22 [30792] bl460g1n6       lrmd: (      lrmd.c:122   )    info: log_execute: 	executing - rsc:prmVM2 action:migrate_to call_id:16
Jan 15 15:39:22 [30795] bl460g1n6       crmd: (te_actions.c:55    )   debug: te_pseudo_action: 	Pseudo action 18 fired and confirmed
Jan 15 15:39:22 [30795] bl460g1n6       crmd: (     graph.c:336   )   debug: run_graph: 	Transition 5 (Complete=0, Pending=1, Fired=2, Skipped=0, Incomplete=7, Source=/var/lib/pacemaker/pengine/pe-input-5.bz2): In-progress
Jan 15 15:39:22 [30795] bl460g1n6       crmd: (       lrm.c:2106  )    info: process_lrm_event: 	LRM operation prmVM2_monitor_10000 (call=14, status=1, cib-update=0, confirmed=true) Cancelled
Jan 15 15:39:22 [30795] bl460g1n6       crmd: (       lrm.c:122   )   debug: update_history_cache: 	Updating history for 'prmVM2' with monitor op
Jan 15 15:39:22 [30795] bl460g1n6       crmd: (te_actions.c:416   )  notice: te_rsc_command: 	Initiating action 13: stop prmPing_stop_0 on bl460g1n6 (local)
Jan 15 15:39:22 [30795] bl460g1n6       crmd: (       lrm.c:1721  )   debug: stop_recurring_action_by_rsc: 	Cancelling op 13 for prmPing (prmPing:13)
Jan 15 15:39:22 [30795] bl460g1n6       crmd: (       lrm.c:1063  )   debug: cancel_op: 	Cancelling op 13 for prmPing (prmPing:13)
Jan 15 15:39:22 [30792] bl460g1n6       lrmd: (  services.c:352   )    info: cancel_recurring_action: 	Cancelling operation prmPing_monitor_10000
Jan 15 15:39:22 [30792] bl460g1n6       lrmd: (      lrmd.c:104   )   debug: log_finished: 	finished - rsc:prmPing action:monitor call_id:13  exit-code:0 exec-time:0ms queue-time:0ms
Jan 15 15:39:22 [30792] bl460g1n6       lrmd: (      lrmd.c:1313  )   debug: process_lrmd_message: 	Processed lrmd_rsc_cancel operation from ea951299-8a4d-4fd6-8900-d6588e07ac38: rc=0, reply=1, notify=0, exit=4201864
Jan 15 15:39:22 [30795] bl460g1n6       crmd: (       lrm.c:1068  )   debug: cancel_op: 	Op 13 for prmPing (prmPing:13): cancelled
Jan 15 15:39:22 [30795] bl460g1n6       crmd: (       lrm.c:1780  )   debug: do_lrm_rsc_op: 	Stopped 0 recurring operations in preparation for prmPing_stop_0
Jan 15 15:39:22 [30795] bl460g1n6       crmd: (       lrm.c:1784  )    info: do_lrm_rsc_op: 	Performing key=13:5:0:be72ea63-75a9-4de4-a591-e716f960743b op=prmPing_stop_0
Jan 15 15:39:22 [30792] bl460g1n6       lrmd: (      lrmd.c:1313  )   debug: process_lrmd_message: 	Processed lrmd_rsc_exec operation from ea951299-8a4d-4fd6-8900-d6588e07ac38: rc=18, reply=1, notify=0, exit=4201864
Jan 15 15:39:22 [30792] bl460g1n6       lrmd: (      lrmd.c:122   )    info: log_execute: 	executing - rsc:prmPing action:stop call_id:18
Jan 15 15:39:22 [30795] bl460g1n6       crmd: (     graph.c:336   )   debug: run_graph: 	Transition 5 (Complete=1, Pending=2, Fired=1, Skipped=0, Incomplete=6, Source=/var/lib/pacemaker/pengine/pe-input-5.bz2): In-progress
Jan 15 15:39:22 [30795] bl460g1n6       crmd: (       lrm.c:2106  )    info: process_lrm_event: 	LRM operation prmPing_monitor_10000 (call=13, status=1, cib-update=0, confirmed=true) Cancelled
Jan 15 15:39:22 [30795] bl460g1n6       crmd: (       lrm.c:122   )   debug: update_history_cache: 	Updating history for 'prmPing' with monitor op
Jan 15 15:39:22 [31308] bl460g1n6 attrd_updater: (     utils.c:1995  )    info: attrd_update_delegate: 	Connecting to cluster... 5 retries remaining
Jan 15 15:39:22 [30793] bl460g1n6      attrd: (       ipc.c:334   )    info: crm_client_new: 	Connecting 0x1754d60 for uid=0 gid=0 pid=31308 id=bbb10658-f6e9-49e6-ba24-438891fb2f7e
Jan 15 15:39:22 [30793] bl460g1n6      attrd: ( ipc_setup.c:484   )   debug: handle_new_connection: 	IPC credentials authenticated (30793-31308-10)
Jan 15 15:39:22 [30793] bl460g1n6      attrd: (   ipc_shm.c:295   )   debug: qb_ipcs_shm_connect: 	connecting to client [31308]
Jan 15 15:39:22 [30793] bl460g1n6      attrd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Jan 15 15:39:22 [30793] bl460g1n6      attrd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Jan 15 15:39:22 [30793] bl460g1n6      attrd: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Jan 15 15:39:22 [31308] bl460g1n6 attrd_updater: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Jan 15 15:39:22 [31308] bl460g1n6 attrd_updater: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Jan 15 15:39:22 [31308] bl460g1n6 attrd_updater: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:131085; real_size:135168; rb->word_size:33792
Jan 15 15:39:22 [31308] bl460g1n6 attrd_updater: (     utils.c:2023  )   debug: attrd_update_delegate: 	Sent update: default_ping_set=(null) for localhost
Jan 15 15:39:22 [30793] bl460g1n6      attrd: (  commands.c:244   )    info: attrd_client_message: 	Broadcasting default_ping_set[bl460g1n6] = (null) (writer)
Jan 15 15:39:22 [31308] bl460g1n6 attrd_updater: (       xml.c:2719  )    info: crm_xml_cleanup: 	Cleaning up memory from libxml2
Jan 15 15:39:22 [30793] bl460g1n6      attrd: (      ipcs.c:757   )   debug: qb_ipcs_dispatch_connection_request: 	HUP conn (30793-31308-10)
Jan 15 15:39:22 [30793] bl460g1n6      attrd: (      ipcs.c:605   )   debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(30793-31308-10) state:2
Jan 15 15:39:22 [30793] bl460g1n6      attrd: (       ipc.c:368   )    info: crm_client_destroy: 	Destroying 0 events
Jan 15 15:39:22 [30793] bl460g1n6      attrd: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-attrd-response-30793-31308-10-header
Jan 15 15:39:22 [30793] bl460g1n6      attrd: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-attrd-event-30793-31308-10-header
Jan 15 15:39:22 [30793] bl460g1n6      attrd: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-attrd-request-30793-31308-10-header
Jan 15 15:39:22 [30792] bl460g1n6       lrmd: (services_lin:301   )   debug: operation_finished: 	prmPing_stop_0:31296 - exited with rc=0
Jan 15 15:39:22 [30792] bl460g1n6       lrmd: (services_lin:306   )   debug: operation_finished: 	prmPing_stop_0:31296:stderr [ -- empty -- ]
Jan 15 15:39:22 [30792] bl460g1n6       lrmd: (services_lin:310   )   debug: operation_finished: 	prmPing_stop_0:31296:stdout [ -- empty -- ]
Jan 15 15:39:22 [30792] bl460g1n6       lrmd: (      lrmd.c:104   )    info: log_finished: 	finished - rsc:prmPing action:stop call_id:18 pid:31296 exit-code:0 exec-time:34ms queue-time:0ms
Jan 15 15:39:22 [30793] bl460g1n6      attrd: (  commands.c:453   )    info: attrd_peer_update: 	Setting default_ping_set[bl460g1n6]: 100 -> (null) from bl460g1n6
Jan 15 15:39:22 [30795] bl460g1n6       crmd: (     utils.c:2126  )   debug: create_operation_update: 	do_update_resource: Updating resource prmPing after stop op complete (interval=0)
Jan 15 15:39:22 [30795] bl460g1n6       crmd: (       lrm.c:2101  )  notice: process_lrm_event: 	LRM operation prmPing_stop_0 (call=18, rc=0, cib-update=65, confirmed=true) ok
Jan 15 15:39:22 [30795] bl460g1n6       crmd: (       lrm.c:122   )   debug: update_history_cache: 	Updating history for 'prmPing' with stop op
Jan 15 15:39:22 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/crmd/65, version=0.10.2)
Jan 15 15:39:22 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:167   )   debug: Config update: 	Diff: --- 0.10.1
Jan 15 15:39:22 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:169   )   debug: Config update: 	Diff: +++ 0.10.2 b31e613a879f0cd27bb4458c2c58b44d
Jan 15 15:39:22 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-  <cib num_updates="1">
Jan 15 15:39:22 [30795] bl460g1n6       crmd: (te_callbacks:122   )   debug: te_update_diff: 	Processing diff (cib_modify): 0.10.1 -> 0.10.2 (S_TRANSITION_ENGINE)
Jan 15 15:39:22 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-    <status>
Jan 15 15:39:22 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-      <node_state id="3232261592">
Jan 15 15:39:22 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-        <lrm id="3232261592">
Jan 15 15:39:22 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-          <lrm_resources>
Jan 15 15:39:22 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-            <lrm_resource id="prmPing">
Jan 15 15:39:22 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	--             <lrm_rsc_op operation_key="prmPing_start_0" operation="start" transition-key="9:2:0:be72ea63-75a9-4de4-a591-e716f960743b" transition-magic="0:0;9:2:0:be72ea63-75a9-4de4-a591-e716f960743b" call-id="11" last-run="1389767929" last-rc-change="1389767929" exec-time="1052" id="prmPing_last_0"/>
Jan 15 15:39:22 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-            </lrm_resource>
Jan 15 15:39:22 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-          </lrm_resources>
Jan 15 15:39:22 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-        </lrm>
Jan 15 15:39:22 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-      </node_state>
Jan 15 15:39:22 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-    </status>
Jan 15 15:39:22 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-  </cib>
Jan 15 15:39:22 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  <cib epoch="10" num_updates="2" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.8" cib-last-written="Wed Jan 15 15:39:20 2014" update-origin="bl460g1n6" update-client="crm_attribute" have-quorum="1" dc-uuid="3232261592">
Jan 15 15:39:22 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    <status>
Jan 15 15:39:22 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      <node_state id="3232261592" uname="bl460g1n6" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Jan 15 15:39:22 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        <lrm id="3232261592">
Jan 15 15:39:22 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+          <lrm_resources>
Jan 15 15:39:22 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+            <lrm_resource id="prmPing" type="ping" class="ocf" provider="pacemaker">
Jan 15 15:39:22 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++             <lrm_rsc_op id="prmPing_last_0" operation_key="prmPing_stop_0" operation="stop" crm-debug-origin="do_update_resource" crm_feature_set="3.0.8" transition-key="13:5:0:be72ea63-75a9-4de4-a591-e716f960743b" transition-magic="0:0;13:5:0:be72ea63-75a9-4de4-a591-e716f960743b" call-id="18" rc-code="0" op-status="0" interval="0" last-run="1389767962" last-rc-change="1389767962" exec-time="34" queue-time="0" o
Jan 15 15:39:22 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+            </lrm_resource>
Jan 15 15:39:22 [30795] bl460g1n6       crmd: ( te_events.c:375   )    info: match_graph_event: 	Action prmPing_stop_0 (13) confirmed on bl460g1n6 (rc=0)
Jan 15 15:39:22 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+          </lrm_resources>
Jan 15 15:39:22 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        </lrm>
Jan 15 15:39:22 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      </node_state>
Jan 15 15:39:22 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    </status>
Jan 15 15:39:22 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  </cib>
Jan 15 15:39:22 [30795] bl460g1n6       crmd: (te_actions.c:55    )   debug: te_pseudo_action: 	Pseudo action 19 fired and confirmed
Jan 15 15:39:22 [30795] bl460g1n6       crmd: (     graph.c:336   )   debug: run_graph: 	Transition 5 (Complete=2, Pending=1, Fired=1, Skipped=0, Incomplete=5, Source=/var/lib/pacemaker/pengine/pe-input-5.bz2): In-progress
Jan 15 15:39:22 [30795] bl460g1n6       crmd: (     graph.c:336   )   debug: run_graph: 	Transition 5 (Complete=3, Pending=1, Fired=0, Skipped=0, Incomplete=5, Source=/var/lib/pacemaker/pengine/pe-input-5.bz2): In-progress
VirtualDomain(prmVM2)[31295]:	2014/01/15_15:39:22 DEBUG: Virtual domain vm2 is currently running.
VirtualDomain(prmVM2)[31295]:	2014/01/15_15:39:22 INFO: vm2: Starting live migration to bl460g1n7 (using remote hypervisor URI qemu+ssh://bl460g1n7/system ).
Jan 15 15:39:27 [30793] bl460g1n6      attrd: (  commands.c:739   )   debug: write_attribute: 	Update: bl460g1n6[default_ping_set]=(null) (3232261592 3232261592 3232261592 bl460g1n6)
Jan 15 15:39:27 [30793] bl460g1n6      attrd: (  commands.c:739   )   debug: write_attribute: 	Update: bl460g1n7[default_ping_set]=100 (3232261593 3232261593 3232261593 bl460g1n7)
Jan 15 15:39:27 [30793] bl460g1n6      attrd: (  commands.c:765   )  notice: write_attribute: 	Sent update 11 with 2 changes for default_ping_set, id=<n/a>, set=(null)
Jan 15 15:39:27 [30790] bl460g1n6        cib: (   cib_ops.c:368   )   debug: cib_process_modify: 	Destroying /cib/status/node_state[1]/transient_attributes/instance_attributes/nvpair[3]
Jan 15 15:39:27 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/attrd/11, version=0.10.3)
Jan 15 15:39:27 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:167   )   debug: Config update: 	Diff: --- 0.10.2
Jan 15 15:39:27 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:169   )   debug: Config update: 	Diff: +++ 0.10.3 04d079fccf3d56707a176a3c76f8ab0a
Jan 15 15:39:27 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-  <cib num_updates="2">
Jan 15 15:39:27 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-    <status>
Jan 15 15:39:27 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-      <node_state id="3232261592">
Jan 15 15:39:27 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-        <transient_attributes id="3232261592">
Jan 15 15:39:27 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-          <instance_attributes id="status-3232261592">
Jan 15 15:39:27 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	--           <nvpair id="status-3232261592-default_ping_set" name="default_ping_set" value="100"/>
Jan 15 15:39:27 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-          </instance_attributes>
Jan 15 15:39:27 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-        </transient_attributes>
Jan 15 15:39:27 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-      </node_state>
Jan 15 15:39:27 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-    </status>
Jan 15 15:39:27 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-  </cib>
Jan 15 15:39:27 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++ <cib epoch="10" num_updates="3" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.8" cib-last-written="Wed Jan 15 15:39:20 2014" update-origin="bl460g1n6" update-client="crm_attribute" have-quorum="1" dc-uuid="3232261592"/>
Jan 15 15:39:27 [30795] bl460g1n6       crmd: (te_callbacks:122   )   debug: te_update_diff: 	Processing diff (cib_modify): 0.10.2 -> 0.10.3 (S_TRANSITION_ENGINE)
Jan 15 15:39:27 [30795] bl460g1n6       crmd: (  te_utils.c:419   )    info: abort_transition_graph: 	te_update_diff:188 - Triggered transition abort (complete=0, node=bl460g1n6, tag=transient_attributes, id=3232261592, magic=NA, cib=0.10.3) : Transient attribute: removal
Jan 15 15:39:27 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause   <transient_attributes id="3232261592">
Jan 15 15:39:27 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause     <instance_attributes id="status-3232261592">
Jan 15 15:39:27 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause       <nvpair id="status-3232261592-default_ping_set" name="default_ping_set" value="100" __crm_diff_marker__="removed:top"/>
Jan 15 15:39:27 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause     </instance_attributes>
Jan 15 15:39:27 [30795] bl460g1n6       crmd: (  te_utils.c:450   )   debug: abort_transition_graph: 	Cause   </transient_attributes>
Jan 15 15:39:27 [30795] bl460g1n6       crmd: (     utils.c:271   )   debug: update_abort_priority: 	Abort priority upgraded from 0 to 1000000
Jan 15 15:39:27 [30795] bl460g1n6       crmd: (     utils.c:281   )   debug: update_abort_priority: 	Abort action done superseded by restart
Jan 15 15:39:27 [30795] bl460g1n6       crmd: (     graph.c:336   )   debug: run_graph: 	Transition 5 (Complete=3, Pending=1, Fired=0, Skipped=5, Incomplete=0, Source=/var/lib/pacemaker/pengine/pe-input-5.bz2): In-progress
Jan 15 15:39:27 [30793] bl460g1n6      attrd: (  commands.c:590   )    info: attrd_cib_callback: 	Update 11 for default_ping_set: OK (0)
Jan 15 15:39:27 [30793] bl460g1n6      attrd: (  commands.c:594   )  notice: attrd_cib_callback: 	Update 11 for default_ping_set[bl460g1n6]=(null): OK (0)
Jan 15 15:39:27 [30793] bl460g1n6      attrd: (  commands.c:594   )  notice: attrd_cib_callback: 	Update 11 for default_ping_set[bl460g1n7]=100: OK (0)
VirtualDomain(prmVM2)[31295]:	2014/01/15_15:39:28 INFO: vm2: live migration to bl460g1n7 succeeded.
Jan 15 15:39:28 [30792] bl460g1n6       lrmd: (services_lin:301   )   debug: operation_finished: 	prmVM2_migrate_to_0:31295 - exited with rc=0
Jan 15 15:39:28 [30792] bl460g1n6       lrmd: (services_lin:306   )   debug: operation_finished: 	prmVM2_migrate_to_0:31295:stderr [ -- empty -- ]
Jan 15 15:39:28 [30792] bl460g1n6       lrmd: (services_lin:310   )   debug: operation_finished: 	prmVM2_migrate_to_0:31295:stdout [ -- empty -- ]
Jan 15 15:39:28 [30792] bl460g1n6       lrmd: (      lrmd.c:104   )    info: log_finished: 	finished - rsc:prmVM2 action:migrate_to call_id:16 pid:31295 exit-code:0 exec-time:6208ms queue-time:0ms
Jan 15 15:39:28 [30795] bl460g1n6       crmd: (     utils.c:2126  )   debug: create_operation_update: 	do_update_resource: Updating resource prmVM2 after migrate_to op complete (interval=0)
Jan 15 15:39:28 [30795] bl460g1n6       crmd: (       lrm.c:2101  )  notice: process_lrm_event: 	LRM operation prmVM2_migrate_to_0 (call=16, rc=0, cib-update=66, confirmed=true) ok
Jan 15 15:39:28 [30795] bl460g1n6       crmd: (       lrm.c:122   )   debug: update_history_cache: 	Updating history for 'prmVM2' with migrate_to op
Jan 15 15:39:28 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/crmd/66, version=0.10.4)
Jan 15 15:39:28 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:167   )   debug: Config update: 	Diff: --- 0.10.3
Jan 15 15:39:28 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:169   )   debug: Config update: 	Diff: +++ 0.10.4 dbe78a7f2d5ab7ee7efb62219dbd5d8f
Jan 15 15:39:28 [30795] bl460g1n6       crmd: (te_callbacks:122   )   debug: te_update_diff: 	Processing diff (cib_modify): 0.10.3 -> 0.10.4 (S_TRANSITION_ENGINE)
Jan 15 15:39:28 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-  <cib num_updates="3">
Jan 15 15:39:28 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-    <status>
Jan 15 15:39:28 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-      <node_state id="3232261592">
Jan 15 15:39:28 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-        <lrm id="3232261592">
Jan 15 15:39:28 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-          <lrm_resources>
Jan 15 15:39:28 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-            <lrm_resource id="prmVM2">
Jan 15 15:39:28 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	--             <lrm_rsc_op operation_key="prmVM2_start_0" operation="start" transition-key="5:3:0:be72ea63-75a9-4de4-a591-e716f960743b" transition-magic="0:0;5:3:0:be72ea63-75a9-4de4-a591-e716f960743b" call-id="12" last-run="1389767932" last-rc-change="1389767932" exec-time="1030" id="prmVM2_last_0"/>
Jan 15 15:39:28 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-            </lrm_resource>
Jan 15 15:39:28 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-          </lrm_resources>
Jan 15 15:39:28 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-        </lrm>
Jan 15 15:39:28 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-      </node_state>
Jan 15 15:39:28 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-    </status>
Jan 15 15:39:28 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-  </cib>
Jan 15 15:39:28 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  <cib epoch="10" num_updates="4" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.8" cib-last-written="Wed Jan 15 15:39:20 2014" update-origin="bl460g1n6" update-client="crm_attribute" have-quorum="1" dc-uuid="3232261592">
Jan 15 15:39:28 [30795] bl460g1n6       crmd: ( te_events.c:375   )    info: match_graph_event: 	Action prmVM2_migrate_to_0 (11) confirmed on bl460g1n6 (rc=0)
Jan 15 15:39:28 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    <status>
Jan 15 15:39:28 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      <node_state id="3232261592" uname="bl460g1n6" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Jan 15 15:39:28 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        <lrm id="3232261592">
Jan 15 15:39:28 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+          <lrm_resources>
Jan 15 15:39:28 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+            <lrm_resource id="prmVM2" type="VirtualDomain" class="ocf" provider="heartbeat">
Jan 15 15:39:28 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++             <lrm_rsc_op id="prmVM2_last_0" operation_key="prmVM2_migrate_to_0" operation="migrate_to" crm-debug-origin="do_update_resource" crm_feature_set="3.0.8" transition-key="11:5:0:be72ea63-75a9-4de4-a591-e716f960743b" transition-magic="0:0;11:5:0:be72ea63-75a9-4de4-a591-e716f960743b" call-id="16" rc-code="0" op-status="0" interval="0" last-run="1389767962" last-rc-change="1389767962" exec-time="6208" queu
Jan 15 15:39:28 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+            </lrm_resource>
Jan 15 15:39:28 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+          </lrm_resources>
Jan 15 15:39:28 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        </lrm>
Jan 15 15:39:28 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      </node_state>
Jan 15 15:39:28 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    </status>
Jan 15 15:39:28 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  </cib>
Jan 15 15:39:28 [30795] bl460g1n6       crmd: (     graph.c:336   )  notice: run_graph: 	Transition 5 (Complete=4, Pending=0, Fired=0, Skipped=5, Incomplete=0, Source=/var/lib/pacemaker/pengine/pe-input-5.bz2): Stopped
Jan 15 15:39:28 [30795] bl460g1n6       crmd: (  te_utils.c:355   )   debug: te_graph_trigger: 	Transition 5 is now complete
Jan 15 15:39:28 [30795] bl460g1n6       crmd: (te_actions.c:654   )   debug: notify_crmd: 	Processing transition completion in state S_TRANSITION_ENGINE
Jan 15 15:39:28 [30795] bl460g1n6       crmd: (     utils.c:192   )   debug: crm_timer_start: 	Started New Transition Timer (I_PE_CALC:2000ms), src=95
Jan 15 15:39:28 [30795] bl460g1n6       crmd: (te_actions.c:699   )   debug: notify_crmd: 	Transition 5 status: restart - Transient attribute: removal
Jan 15 15:39:30 [30795] bl460g1n6       crmd: (     utils.c:120   )    info: crm_timer_popped: 	New Transition Timer (I_PE_CALC) just popped (2000ms)
Jan 15 15:39:30 [30795] bl460g1n6       crmd: (       fsa.c:193   )   debug: s_crmd_fsa: 	Processing I_PE_CALC: [ state=S_TRANSITION_ENGINE cause=C_TIMER_POPPED origin=crm_timer_popped ]
Jan 15 15:39:30 [30795] bl460g1n6       crmd: (       fsa.c:502   )    info: do_state_transition: 	State transition S_TRANSITION_ENGINE -> S_POLICY_ENGINE [ input=I_PE_CALC cause=C_TIMER_POPPED origin=crm_timer_popped ]
Jan 15 15:39:30 [30795] bl460g1n6       crmd: (       fsa.c:599   )    info: do_state_transition: 	Progressed to state S_POLICY_ENGINE after C_TIMER_POPPED
Jan 15 15:39:30 [30795] bl460g1n6       crmd: (       fsa.c:610   )   debug: do_state_transition: 	All 2 cluster nodes are eligible to run resources.
Jan 15 15:39:30 [30795] bl460g1n6       crmd: (   pengine.c:231   )   debug: do_pe_invoke: 	Query 67: Requesting the current CIB: S_POLICY_ENGINE
Jan 15 15:39:30 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crmd/67, version=0.10.4)
Jan 15 15:39:30 [30795] bl460g1n6       crmd: (   pengine.c:299   )   debug: do_pe_invoke_callback: 	Invoking the PE: query=67, ref=pe_calc-dc-1389767970-47, seq=8, quorate=1
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (    unpack.c:93    )   debug: unpack_config: 	STONITH timeout: 60000
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (    unpack.c:97    )   debug: unpack_config: 	STONITH of failed nodes is disabled
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (    unpack.c:104   )   debug: unpack_config: 	Stop all active resources: false
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (    unpack.c:108   )   debug: unpack_config: 	Cluster is symmetric - resources can run anywhere by default
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (    unpack.c:113   )   debug: unpack_config: 	Default stickiness: 0
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (    unpack.c:156   )  notice: unpack_config: 	On loss of CCM Quorum: Ignore
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (    unpack.c:196   )   debug: unpack_config: 	Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (    unpack.c:504   )   debug: unpack_domains: 	Unpacking domains
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (    unpack.c:969   )    info: unpack_status: 	Node bl460g1n6 is in standby-mode
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (    unpack.c:1331  )    info: determine_online_status: 	Node bl460g1n6 is standby
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (    unpack.c:1331  )    info: determine_online_status: 	Node bl460g1n7 is online
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (    unpack.c:1553  )   debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n6 to prmPing:0
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmPing_monitor_10000/monitor (call_id=13, status=0, rc=0) on bl460g1n6 (role=Unknown)
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmPing:0/monitor completed on bl460g1n6
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmPing:0 after monitor: role=Started
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmPing_stop_0/stop (call_id=18, status=0, rc=0) on bl460g1n6 (role=Started)
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmPing:0/stop completed on bl460g1n6
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmPing:0 after stop: role=Stopped
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmVM2_monitor_10000/monitor (call_id=14, status=0, rc=0) on bl460g1n6 (role=Unknown)
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmVM2/monitor completed on bl460g1n6
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmVM2 after monitor: role=Started
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmVM2_migrate_to_0/migrate_to (call_id=16, status=0, rc=0) on bl460g1n6 (role=Started)
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmVM2/migrate_to completed on bl460g1n6
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (       xml.c:3300  )   debug: get_xpath_object: 	No match for //node_state[@uname='3232261592']//lrm_resource[@id='prmVM2']/lrm_rsc_op[@operation='stop'] in /cib
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (       xml.c:3300  )   debug: get_xpath_object: 	No match for //node_state[@uname='bl460g1n7']//lrm_resource[@id='prmVM2']/lrm_rsc_op[@operation='migrate_from' and @migrate_source='bl460g1n6'] in /cib
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmVM2 after migrate_to: role=Started
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (    unpack.c:1553  )   debug: find_anonymous_clone: 	Internally renamed prmPing on bl460g1n7 to prmPing:0
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmPing_start_0/start (call_id=11, status=0, rc=0) on bl460g1n7 (role=Unknown)
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmPing:0/start completed on bl460g1n7
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmPing:0 after start: role=Started
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmPing_monitor_10000/monitor (call_id=12, status=0, rc=0) on bl460g1n7 (role=Started)
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmPing:0/monitor completed on bl460g1n7
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmPing:0 after monitor: role=Started
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (    unpack.c:2794  )   trace: unpack_rsc_op: 	Unpacking task prmVM2_monitor_0/monitor (call_id=5, status=0, rc=7) on bl460g1n7 (role=Unknown)
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (    unpack.c:2839  )   trace: unpack_rsc_op: 	Handling status: 0
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (    unpack.c:2862  )   trace: unpack_rsc_op: 	prmVM2/monitor completed on bl460g1n7
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (    unpack.c:2928  )   trace: unpack_rsc_op: 	Resource prmVM2 after monitor: role=Stopped
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (    native.c:446   )    info: native_print: 	prmVM2	(ocf::heartbeat:VirtualDomain):	FAILED bl460g1n6 
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (     clone.c:417   )    info: clone_print: 	 Clone Set: clnPing [prmPing]
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (    native.c:272   )   debug: native_active: 	Resource prmPing:0 active on bl460g1n7
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (    native.c:272   )   debug: native_active: 	Resource prmPing:0 active on bl460g1n7
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (     clone.c:311   )    info: short_print: 	     Started: [ bl460g1n7 ]
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (     clone.c:311   )    info: short_print: 	     Stopped: [ bl460g1n6 ]
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (  allocate.c:593   )   debug: common_apply_stickiness: 	Resource prmVM2: preferring current location (node=bl460g1n6, weight=1000000)
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (  allocate.c:593   )   debug: common_apply_stickiness: 	Resource prmPing:0: preferring current location (node=bl460g1n7, weight=1000000)
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (     utils.c:386   )   debug: native_assign_node: 	Assigning bl460g1n7 to prmPing:0
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (     utils.c:339   )   debug: native_assign_node: 	All nodes for resource prmPing:1 are unavailable, unclean or shutting down (bl460g1n6: 0, -1000000)
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (     utils.c:356   )   debug: native_assign_node: 	Could not allocate a node for prmPing:1
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (    native.c:564   )    info: native_color: 	Resource prmPing:1 cannot run anywhere
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (     clone.c:625   )   debug: clone_color: 	Allocated 1 clnPing instances of a possible 2
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (     utils.c:386   )   debug: native_assign_node: 	Assigning bl460g1n7 to prmVM2
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (    native.c:790   )    info: RecurringOp: 	 Start recurring monitor (10s) for prmVM2 on bl460g1n7
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (    native.c:2150  )  notice: LogActions: 	Recover prmVM2	(Started bl460g1n6 -> bl460g1n7)
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (    native.c:2141  )    info: LogActions: 	Leave   prmPing:0	(Started bl460g1n7)
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (    native.c:2083  )    info: LogActions: 	Leave   prmPing:1	(Stopped)
Jan 15 15:39:30 [30794] bl460g1n6    pengine: (   pengine.c:178   )  notice: process_pe_message: 	Calculated Transition 6: /var/lib/pacemaker/pengine/pe-input-6.bz2
Jan 15 15:39:30 [30795] bl460g1n6       crmd: (       fsa.c:193   )   debug: s_crmd_fsa: 	Processing I_PE_SUCCESS: [ state=S_POLICY_ENGINE cause=C_IPC_MESSAGE origin=handle_response ]
Jan 15 15:39:30 [30795] bl460g1n6       crmd: (       fsa.c:502   )    info: do_state_transition: 	State transition S_POLICY_ENGINE -> S_TRANSITION_ENGINE [ input=I_PE_SUCCESS cause=C_IPC_MESSAGE origin=handle_response ]
Jan 15 15:39:30 [30795] bl460g1n6       crmd: (    unpack.c:230   )   debug: unpack_graph: 	Unpacked transition 6: 4 actions in 4 synapses
Jan 15 15:39:30 [30795] bl460g1n6       crmd: (   tengine.c:208   )    info: do_te_invoke: 	Processing graph 6 (ref=pe_calc-dc-1389767970-47) derived from /var/lib/pacemaker/pengine/pe-input-6.bz2
Jan 15 15:39:30 [30795] bl460g1n6       crmd: (te_actions.c:416   )  notice: te_rsc_command: 	Initiating action 7: stop prmVM2_stop_0 on bl460g1n6 (local)
Jan 15 15:39:30 [30795] bl460g1n6       crmd: (       lrm.c:1780  )   debug: do_lrm_rsc_op: 	Stopped 0 recurring operations in preparation for prmVM2_stop_0
Jan 15 15:39:30 [30795] bl460g1n6       crmd: (       lrm.c:1784  )    info: do_lrm_rsc_op: 	Performing key=7:6:0:be72ea63-75a9-4de4-a591-e716f960743b op=prmVM2_stop_0
Jan 15 15:39:30 [30792] bl460g1n6       lrmd: (      lrmd.c:1313  )   debug: process_lrmd_message: 	Processed lrmd_rsc_exec operation from ea951299-8a4d-4fd6-8900-d6588e07ac38: rc=19, reply=1, notify=0, exit=4201864
Jan 15 15:39:30 [30792] bl460g1n6       lrmd: (      lrmd.c:122   )    info: log_execute: 	executing - rsc:prmVM2 action:stop call_id:19
Jan 15 15:39:30 [30795] bl460g1n6       crmd: (     graph.c:336   )   debug: run_graph: 	Transition 6 (Complete=0, Pending=1, Fired=1, Skipped=0, Incomplete=3, Source=/var/lib/pacemaker/pengine/pe-input-6.bz2): In-progress
VirtualDomain(prmVM2)[31422]:	2014/01/15_15:39:30 DEBUG: Virtual domain vm2 is currently error: failed to get domain 'vm2'
error: domain not found: no domain with matching name 'vm2'.
VirtualDomain(prmVM2)[31422]:	2014/01/15_15:39:30 INFO: Domain vm2 already stopped.
Jan 15 15:39:30 [30792] bl460g1n6       lrmd: (services_lin:301   )   debug: operation_finished: 	prmVM2_stop_0:31422 - exited with rc=0
Jan 15 15:39:30 [30792] bl460g1n6       lrmd: (services_lin:306   )   debug: operation_finished: 	prmVM2_stop_0:31422:stderr [ -- empty -- ]
Jan 15 15:39:30 [30792] bl460g1n6       lrmd: (services_lin:310   )   debug: operation_finished: 	prmVM2_stop_0:31422:stdout [ -- empty -- ]
Jan 15 15:39:30 [30792] bl460g1n6       lrmd: (      lrmd.c:104   )    info: log_finished: 	finished - rsc:prmVM2 action:stop call_id:19 pid:31422 exit-code:0 exec-time:89ms queue-time:0ms
Jan 15 15:39:30 [30795] bl460g1n6       crmd: (     utils.c:2126  )   debug: create_operation_update: 	do_update_resource: Updating resource prmVM2 after stop op complete (interval=0)
Jan 15 15:39:30 [30795] bl460g1n6       crmd: (       lrm.c:2101  )  notice: process_lrm_event: 	LRM operation prmVM2_stop_0 (call=19, rc=0, cib-update=68, confirmed=true) ok
Jan 15 15:39:30 [30795] bl460g1n6       crmd: (       lrm.c:122   )   debug: update_history_cache: 	Updating history for 'prmVM2' with stop op
Jan 15 15:39:30 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=local/crmd/68, version=0.10.5)
Jan 15 15:39:30 [30795] bl460g1n6       crmd: (te_callbacks:122   )   debug: te_update_diff: 	Processing diff (cib_modify): 0.10.4 -> 0.10.5 (S_TRANSITION_ENGINE)
Jan 15 15:39:30 [30795] bl460g1n6       crmd: ( te_events.c:375   )    info: match_graph_event: 	Action prmVM2_stop_0 (7) confirmed on bl460g1n6 (rc=0)
Jan 15 15:39:30 [30795] bl460g1n6       crmd: (te_actions.c:416   )  notice: te_rsc_command: 	Initiating action 8: start prmVM2_start_0 on bl460g1n7
Jan 15 15:39:30 [30795] bl460g1n6       crmd: (te_actions.c:55    )   debug: te_pseudo_action: 	Pseudo action 3 fired and confirmed
Jan 15 15:39:30 [30795] bl460g1n6       crmd: (     graph.c:336   )   debug: run_graph: 	Transition 6 (Complete=1, Pending=1, Fired=2, Skipped=0, Incomplete=1, Source=/var/lib/pacemaker/pengine/pe-input-6.bz2): In-progress
Jan 15 15:39:30 [30795] bl460g1n6       crmd: (     graph.c:336   )   debug: run_graph: 	Transition 6 (Complete=2, Pending=1, Fired=0, Skipped=0, Incomplete=1, Source=/var/lib/pacemaker/pengine/pe-input-6.bz2): In-progress
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:167   )   debug: Config update: 	Diff: --- 0.10.4
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:169   )   debug: Config update: 	Diff: +++ 0.10.5 8cf6ff9e75fa47470cbbdbb32e483c42
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-  <cib num_updates="4">
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-    <status>
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-      <node_state id="3232261592">
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-        <lrm id="3232261592">
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-          <lrm_resources>
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-            <lrm_resource id="prmVM2">
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	--             <lrm_rsc_op operation_key="prmVM2_migrate_to_0" operation="migrate_to" transition-key="11:5:0:be72ea63-75a9-4de4-a591-e716f960743b" transition-magic="0:0;11:5:0:be72ea63-75a9-4de4-a591-e716f960743b" call-id="16" last-run="1389767962" last-rc-change="1389767962" exec-time="6208" id="prmVM2_last_0"/>
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-            </lrm_resource>
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-          </lrm_resources>
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-        </lrm>
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-      </node_state>
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-    </status>
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-  </cib>
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  <cib epoch="10" num_updates="5" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.8" cib-last-written="Wed Jan 15 15:39:20 2014" update-origin="bl460g1n6" update-client="crm_attribute" have-quorum="1" dc-uuid="3232261592">
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    <status>
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      <node_state id="3232261592" uname="bl460g1n6" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        <lrm id="3232261592">
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+          <lrm_resources>
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+            <lrm_resource id="prmVM2" type="VirtualDomain" class="ocf" provider="heartbeat">
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++             <lrm_rsc_op id="prmVM2_last_0" operation_key="prmVM2_stop_0" operation="stop" crm-debug-origin="do_update_resource" crm_feature_set="3.0.8" transition-key="7:6:0:be72ea63-75a9-4de4-a591-e716f960743b" transition-magic="0:0;7:6:0:be72ea63-75a9-4de4-a591-e716f960743b" call-id="19" rc-code="0" op-status="0" interval="0" last-run="1389767970" last-rc-change="1389767970" exec-time="89" queue-time="0" op-di
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+            </lrm_resource>
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+          </lrm_resources>
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        </lrm>
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      </node_state>
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    </status>
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  </cib>
Jan 15 15:39:30 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=bl460g1n7/crmd/17, version=0.10.6)
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:167   )   debug: Config update: 	Diff: --- 0.10.5
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:169   )   debug: Config update: 	Diff: +++ 0.10.6 f9b1410e5dd0cc696ed3f4f64c501198
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-  <cib num_updates="5">
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-    <status>
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-      <node_state id="3232261593">
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-        <lrm id="3232261593">
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-          <lrm_resources>
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-            <lrm_resource id="prmVM2">
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	--             <lrm_rsc_op operation_key="prmVM2_monitor_0" operation="monitor" transition-key="7:2:7:be72ea63-75a9-4de4-a591-e716f960743b" transition-magic="0:7;7:2:7:be72ea63-75a9-4de4-a591-e716f960743b" call-id="5" rc-code="7" last-run="1389767928" last-rc-change="1389767928" exec-time="101" id="prmVM2_last_0"/>
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-            </lrm_resource>
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-          </lrm_resources>
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-        </lrm>
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-      </node_state>
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-    </status>
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-  </cib>
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  <cib epoch="10" num_updates="6" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.8" cib-last-written="Wed Jan 15 15:39:20 2014" update-origin="bl460g1n6" update-client="crm_attribute" have-quorum="1" dc-uuid="3232261592">
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    <status>
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      <node_state id="3232261593" uname="bl460g1n7" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        <lrm id="3232261593">
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+          <lrm_resources>
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+            <lrm_resource id="prmVM2" type="VirtualDomain" class="ocf" provider="heartbeat">
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++             <lrm_rsc_op id="prmVM2_last_0" operation_key="prmVM2_start_0" operation="start" crm-debug-origin="do_update_resource" crm_feature_set="3.0.8" transition-key="8:6:0:be72ea63-75a9-4de4-a591-e716f960743b" transition-magic="0:0;8:6:0:be72ea63-75a9-4de4-a591-e716f960743b" call-id="13" rc-code="0" op-status="0" interval="0" last-run="1389767970" last-rc-change="1389767970" exec-time="73" queue-time="0" op-
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+            </lrm_resource>
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+          </lrm_resources>
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        </lrm>
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      </node_state>
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    </status>
Jan 15 15:39:30 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  </cib>
Jan 15 15:39:30 [30795] bl460g1n6       crmd: (te_callbacks:122   )   debug: te_update_diff: 	Processing diff (cib_modify): 0.10.5 -> 0.10.6 (S_TRANSITION_ENGINE)
Jan 15 15:39:30 [30795] bl460g1n6       crmd: ( te_events.c:375   )    info: match_graph_event: 	Action prmVM2_start_0 (8) confirmed on bl460g1n7 (rc=0)
Jan 15 15:39:30 [30795] bl460g1n6       crmd: (te_actions.c:416   )  notice: te_rsc_command: 	Initiating action 9: monitor prmVM2_monitor_10000 on bl460g1n7
Jan 15 15:39:30 [30795] bl460g1n6       crmd: (     graph.c:336   )   debug: run_graph: 	Transition 6 (Complete=3, Pending=1, Fired=1, Skipped=0, Incomplete=0, Source=/var/lib/pacemaker/pengine/pe-input-6.bz2): In-progress
Jan 15 15:39:31 [30795] bl460g1n6       crmd: (te_callbacks:122   )   debug: te_update_diff: 	Processing diff (cib_modify): 0.10.6 -> 0.10.7 (S_TRANSITION_ENGINE)
Jan 15 15:39:31 [30795] bl460g1n6       crmd: ( te_events.c:375   )    info: match_graph_event: 	Action prmVM2_monitor_10000 (9) confirmed on bl460g1n7 (rc=0)
Jan 15 15:39:31 [30795] bl460g1n6       crmd: (     graph.c:336   )  notice: run_graph: 	Transition 6 (Complete=4, Pending=0, Fired=0, Skipped=0, Incomplete=0, Source=/var/lib/pacemaker/pengine/pe-input-6.bz2): Complete
Jan 15 15:39:31 [30795] bl460g1n6       crmd: (  te_utils.c:355   )   debug: te_graph_trigger: 	Transition 6 is now complete
Jan 15 15:39:31 [30795] bl460g1n6       crmd: (te_actions.c:654   )   debug: notify_crmd: 	Processing transition completion in state S_TRANSITION_ENGINE
Jan 15 15:39:31 [30795] bl460g1n6       crmd: (te_actions.c:699   )   debug: notify_crmd: 	Transition 6 status: done - <null>
Jan 15 15:39:31 [30795] bl460g1n6       crmd: (       fsa.c:193   )   debug: s_crmd_fsa: 	Processing I_TE_SUCCESS: [ state=S_TRANSITION_ENGINE cause=C_FSA_INTERNAL origin=notify_crmd ]
Jan 15 15:39:31 [30795] bl460g1n6       crmd: (      misc.c:47    )    info: do_log: 	FSA: Input I_TE_SUCCESS from notify_crmd() received in state S_TRANSITION_ENGINE
Jan 15 15:39:31 [30795] bl460g1n6       crmd: (       fsa.c:502   )  notice: do_state_transition: 	State transition S_TRANSITION_ENGINE -> S_IDLE [ input=I_TE_SUCCESS cause=C_FSA_INTERNAL origin=notify_crmd ]
Jan 15 15:39:31 [30795] bl460g1n6       crmd: (       fsa.c:645   )   debug: do_state_transition: 	Starting PEngine Recheck Timer
Jan 15 15:39:31 [30795] bl460g1n6       crmd: (     utils.c:192   )   debug: crm_timer_start: 	Started PEngine Recheck Timer (I_PE_CALC:900000ms), src=101
Jan 15 15:39:31 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:167   )   debug: Config update: 	Diff: --- 0.10.6
Jan 15 15:39:31 [30791] bl460g1n6 stonith-ng: ( cib_utils.c:169   )   debug: Config update: 	Diff: +++ 0.10.7 f775e8a29aff3b71aec33fa95a2137de
Jan 15 15:39:31 [30791] bl460g1n6 stonith-ng: (       xml.c:1496  )   debug: Config update: 	-- <cib num_updates="6"/>
Jan 15 15:39:31 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  <cib epoch="10" num_updates="7" admin_epoch="0" validate-with="pacemaker-1.2" crm_feature_set="3.0.8" cib-last-written="Wed Jan 15 15:39:20 2014" update-origin="bl460g1n6" update-client="crm_attribute" have-quorum="1" dc-uuid="3232261592">
Jan 15 15:39:31 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    <status>
Jan 15 15:39:31 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      <node_state id="3232261593" uname="bl460g1n7" in_ccm="true" crmd="online" crm-debug-origin="do_update_resource" join="member" expected="member">
Jan 15 15:39:31 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        <lrm id="3232261593">
Jan 15 15:39:31 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+          <lrm_resources>
Jan 15 15:39:31 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+            <lrm_resource id="prmVM2" type="VirtualDomain" class="ocf" provider="heartbeat">
Jan 15 15:39:31 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	++             <lrm_rsc_op id="prmVM2_monitor_10000" operation_key="prmVM2_monitor_10000" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.8" transition-key="9:6:0:be72ea63-75a9-4de4-a591-e716f960743b" transition-magic="0:0;9:6:0:be72ea63-75a9-4de4-a591-e716f960743b" call-id="14" rc-code="0" op-status="0" interval="10000" last-rc-change="1389767970" exec-time="144" queue-time="0" op-di
Jan 15 15:39:31 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_modify operation for section status: OK (rc=0, origin=bl460g1n7/crmd/18, version=0.10.7)
Jan 15 15:39:31 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+            </lrm_resource>
Jan 15 15:39:31 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+          </lrm_resources>
Jan 15 15:39:31 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+        </lrm>
Jan 15 15:39:31 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+      </node_state>
Jan 15 15:39:31 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+    </status>
Jan 15 15:39:31 [30791] bl460g1n6 stonith-ng: (       xml.c:1507  )   debug: Config update: 	+  </cib>
Jan 15 15:39:39 [30795] bl460g1n6       crmd: (  throttle.c:260   )   debug: throttle_cib_load: 	cib load: 0.001667 (5 ticks in 30s)
Jan 15 15:39:39 [30795] bl460g1n6       crmd: (  throttle.c:302   )   debug: throttle_load_avg: 	Current load is 0.280000 (full: 0.28 0.09 0.02 1/401 31478)
Jan 15 15:39:39 [30795] bl460g1n6       crmd: (  throttle.c:382   )   debug: throttle_io_load: 	Current IO load is 0.000000
Jan 15 15:39:43 [30773] bl460g1n6 corosync debug   [QB    ] ipc_setup.c:handle_new_connection:484 IPC credentials authenticated (30775-31526-33)
Jan 15 15:39:43 [30773] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [31526]
Jan 15 15:39:43 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:39:43 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:39:43 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:39:43 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Jan 15 15:39:43 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7f438576de40
Jan 15 15:39:43 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (30775-31526-33)
Jan 15 15:39:43 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(30775-31526-33) state:2
Jan 15 15:39:43 [30773] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Jan 15 15:39:43 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Jan 15 15:39:43 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7f438576de40
Jan 15 15:39:43 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Jan 15 15:39:43 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-30775-31526-33-header
Jan 15 15:39:43 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-30775-31526-33-header
Jan 15 15:39:43 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-30775-31526-33-header
Jan 15 15:39:49 [30790] bl460g1n6        cib: (       ipc.c:334   )    info: crm_client_new: 	Connecting 0x19820d0 for uid=0 gid=0 pid=32761 id=fc4fbdb1-6e26-416b-9964-489c62187164
Jan 15 15:39:49 [30790] bl460g1n6        cib: ( ipc_setup.c:484   )   debug: handle_new_connection: 	IPC credentials authenticated (30790-32761-14)
Jan 15 15:39:49 [30790] bl460g1n6        cib: (   ipc_shm.c:295   )   debug: qb_ipcs_shm_connect: 	connecting to client [32761]
Jan 15 15:39:49 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:39:49 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:39:49 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:39:49 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crm_mon/2, version=0.10.7)
Jan 15 15:39:49 [30790] bl460g1n6        cib: (      ipcs.c:757   )   debug: qb_ipcs_dispatch_connection_request: 	HUP conn (30790-32761-14)
Jan 15 15:39:49 [30790] bl460g1n6        cib: (      ipcs.c:605   )   debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(30790-32761-14) state:2
Jan 15 15:39:49 [30790] bl460g1n6        cib: (       ipc.c:368   )    info: crm_client_destroy: 	Destroying 0 events
Jan 15 15:39:49 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_ro-response-30790-32761-14-header
Jan 15 15:39:49 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_ro-event-30790-32761-14-header
Jan 15 15:39:49 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_ro-request-30790-32761-14-header
Jan 15 15:39:49 [30790] bl460g1n6        cib: (       ipc.c:334   )    info: crm_client_new: 	Connecting 0x19820d0 for uid=0 gid=0 pid=32763 id=d398498e-0ee1-49c6-8dd5-1c078eed314b
Jan 15 15:39:49 [30790] bl460g1n6        cib: ( ipc_setup.c:484   )   debug: handle_new_connection: 	IPC credentials authenticated (30790-32763-14)
Jan 15 15:39:49 [30790] bl460g1n6        cib: (   ipc_shm.c:295   )   debug: qb_ipcs_shm_connect: 	connecting to client [32763]
Jan 15 15:39:49 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:39:49 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:39:49 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:39:49 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/cibadmin/2, version=0.10.7)
Jan 15 15:39:49 [30790] bl460g1n6        cib: (      ipcs.c:757   )   debug: qb_ipcs_dispatch_connection_request: 	HUP conn (30790-32763-14)
Jan 15 15:39:49 [30790] bl460g1n6        cib: (      ipcs.c:605   )   debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(30790-32763-14) state:2
Jan 15 15:39:49 [30790] bl460g1n6        cib: (       ipc.c:368   )    info: crm_client_destroy: 	Destroying 0 events
Jan 15 15:39:49 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-30790-32763-14-header
Jan 15 15:39:49 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-30790-32763-14-header
Jan 15 15:39:49 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-30790-32763-14-header
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ipc_setup.c:handle_new_connection:484 IPC credentials authenticated (30775-370-33)
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [370]
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7f438576de40
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (30775-370-33)
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(30775-370-33) state:2
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7f438576de40
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-30775-370-33-header
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-30775-370-33-header
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-30775-370-33-header
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ipc_setup.c:handle_new_connection:484 IPC credentials authenticated (30775-372-33)
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [372]
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7f438576de40
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_write_to_file:808  writing total of: 8392724
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (30775-372-33)
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(30775-372-33) state:2
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7f438576de40
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-30775-372-33-header
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-30775-372-33-header
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-30775-372-33-header
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ipc_setup.c:handle_new_connection:484 IPC credentials authenticated (30775-377-33)
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [377]
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7f438576de40
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (30775-377-33)
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(30775-377-33) state:2
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7f438576de40
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-30775-377-33-header
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-30775-377-33-header
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-30775-377-33-header
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ipc_setup.c:handle_new_connection:484 IPC credentials authenticated (30775-378-33)
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [378]
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_init_fn:306 lib_init_fn: conn=0x7f438576de40
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ipc_setup.c:handle_new_connection:484 IPC credentials authenticated (30775-378-34)
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [378]
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QUORUM] vsf_quorum.c:quorum_lib_init_fn:316 lib_init_fn: conn=0x7f4385768760
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QUORUM] vsf_quorum.c:message_handler_req_lib_quorum_gettype:471 got quorum_type request on 0x7f4385768760
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ipc_setup.c:handle_new_connection:484 IPC credentials authenticated (30775-378-35)
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [378]
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ipc_setup.c:handle_new_connection:484 IPC credentials authenticated (30775-378-36)
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ipc_shm.c:qb_ipcs_shm_connect:295 connecting to client [378]
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_open_2:236 shm size:1048589; real_size:1052672; rb->word_size:263168
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_created:272 connection created
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QUORUM] vsf_quorum.c:message_handler_req_lib_quorum_getquorate:395 got quorate request on 0x7f4385768760
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QUORUM] vsf_quorum.c:message_handler_req_lib_quorum_trackstart:412 got trackstart request on 0x7f4385768760
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QUORUM] vsf_quorum.c:message_handler_req_lib_quorum_trackstart:420 sending initial status to 0x7f4385768760
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QUORUM] vsf_quorum.c:send_library_notification:359 sending quorum notification to 0x7f4385768760, length = 56
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QUORUM] vsf_quorum.c:message_handler_req_lib_quorum_trackstop:448 got trackstop request on 0x7f4385768760
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:message_handler_req_lib_votequorum_getinfo:2147 got getinfo request on 0x7f4385770a70 for node 3232261592
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:message_handler_req_lib_votequorum_getinfo:2239 getinfo response error: 1
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:message_handler_req_lib_votequorum_getinfo:2147 got getinfo request on 0x7f4385770a70 for node 3232261592
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:message_handler_req_lib_votequorum_getinfo:2239 getinfo response error: 1
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:message_handler_req_lib_votequorum_getinfo:2147 got getinfo request on 0x7f4385770a70 for node 3232261593
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [VOTEQ ] votequorum.c:message_handler_req_lib_votequorum_getinfo:2239 getinfo response error: 1
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (30775-378-33)
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(30775-378-33) state:2
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [CMAP  ] cmap.c:cmap_lib_exit_fn:325 exit_fn for conn=0x7f438576de40
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-response-30775-378-33-header
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-event-30775-378-33-header
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cmap-request-30775-378-33-header
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (30775-378-34)
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(30775-378-34) state:2
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QUORUM] vsf_quorum.c:quorum_lib_exit_fn:328 lib_exit_fn: conn=0x7f4385768760
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-quorum-response-30775-378-34-header
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-quorum-event-30775-378-34-header
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-quorum-request-30775-378-34-header
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (30775-378-35)
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(30775-378-35) state:2
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cfg-response-30775-378-35-header
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cfg-event-30775-378-35-header
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-cfg-request-30775-378-35-header
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_dispatch_connection_request:757 HUP conn (30775-378-36)
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ipcs.c:qb_ipcs_disconnect:605 qb_ipcs_disconnect(30775-378-36) state:2
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] loop_poll_epoll.c:_del:117 epoll_ctl(del): Bad file descriptor (9)
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_closed:417 cs_ipcs_connection_closed() 
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [MAIN  ] ipc_glue.c:cs_ipcs_connection_destroyed:390 cs_ipcs_connection_destroyed() 
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-votequorum-response-30775-378-36-header
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-votequorum-event-30775-378-36-header
Jan 15 15:39:50 [30773] bl460g1n6 corosync debug   [QB    ] ringbuffer.c:qb_rb_close:299 Free'ing ringbuffer: /dev/shm/qb-votequorum-request-30775-378-36-header
Jan 15 15:39:50 [30790] bl460g1n6        cib: (       ipc.c:334   )    info: crm_client_new: 	Connecting 0x19820d0 for uid=0 gid=0 pid=380 id=0473a140-22de-49b6-b2c2-7e7c16fff018
Jan 15 15:39:50 [30790] bl460g1n6        cib: ( ipc_setup.c:484   )   debug: handle_new_connection: 	IPC credentials authenticated (30790-380-14)
Jan 15 15:39:50 [30790] bl460g1n6        cib: (   ipc_shm.c:295   )   debug: qb_ipcs_shm_connect: 	connecting to client [380]
Jan 15 15:39:50 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:39:50 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:39:50 [30790] bl460g1n6        cib: (ringbuffer.c:236   )   debug: qb_rb_open_2: 	shm size:524301; real_size:528384; rb->word_size:132096
Jan 15 15:39:50 [30790] bl460g1n6        cib: ( callbacks.c:765   )    info: cib_process_request: 	Completed cib_query operation for section 'all': OK (rc=0, origin=local/crm_mon/2, version=0.10.7)
Jan 15 15:39:50 [30790] bl460g1n6        cib: (      ipcs.c:757   )   debug: qb_ipcs_dispatch_connection_request: 	HUP conn (30790-380-14)
Jan 15 15:39:50 [30790] bl460g1n6        cib: (      ipcs.c:605   )   debug: qb_ipcs_disconnect: 	qb_ipcs_disconnect(30790-380-14) state:2
Jan 15 15:39:50 [30790] bl460g1n6        cib: (       ipc.c:368   )    info: crm_client_destroy: 	Destroying 0 events
Jan 15 15:39:50 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_ro-response-30790-380-14-header
Jan 15 15:39:50 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_ro-event-30790-380-14-header
Jan 15 15:39:50 [30790] bl460g1n6        cib: (ringbuffer.c:299   )   debug: qb_rb_close: 	Free'ing ringbuffer: /dev/shm/qb-cib_ro-request-30790-380-14-header
