Feb 10 13:16:09 corosync [MAIN ] Corosync Cluster Engine ('1.4.1'): started and ready to provide service.
Feb 10 13:16:09 corosync [MAIN ] Corosync built-in features: nss dbus rdma snmp
Feb 10 13:16:09 corosync [MAIN ] Successfully read main configuration file '/etc/corosync/corosync.conf'.
Feb 10 13:16:09 corosync [TOTEM ] Initializing transport (UDP/IP Unicast).
Feb 10 13:16:09 corosync [TOTEM ] Initializing transmit/receive security: libtomcrypt SOBER128/SHA1HMAC (mode 0).
Feb 10 13:16:09 corosync [TOTEM ] Initializing transport (UDP/IP Unicast).
Feb 10 13:16:09 corosync [TOTEM ] Initializing transmit/receive security: libtomcrypt SOBER128/SHA1HMAC (mode 0).
Feb 10 13:16:09 corosync [TOTEM ] The network interface [192.168.2.2] is now up.
Set r/w permissions for uid=0, gid=0 on /var/log/cluster/corosync.log
Feb 10 13:16:09 corosync [pcmk ] info: process_ais_conf: Reading configure
Feb 10 13:16:09 corosync [pcmk ] info: config_find_init: Local handle: 4835695805891346436 for logging
Feb 10 13:16:09 corosync [pcmk ] info: config_find_next: Processing additional logging options...
Feb 10 13:16:09 corosync [pcmk ] info: get_config_opt: Found 'off' for option: debug
Feb 10 13:16:09 corosync [pcmk ] info: get_config_opt: Found 'yes' for option: to_logfile
Feb 10 13:16:09 corosync [pcmk ] info: get_config_opt: Found '/var/log/cluster/corosync.log' for option: logfile
Feb 10 13:16:09 corosync [pcmk ] info: get_config_opt: Found 'yes' for option: to_syslog
Feb 10 13:16:09 corosync [pcmk ] info: get_config_opt: Defaulting to 'daemon' for option: syslog_facility
Feb 10 13:16:09 corosync [pcmk ] info: config_find_init: Local handle: 4552499517957603333 for quorum
Feb 10 13:16:09 corosync [pcmk ] info: config_find_next: No additional configuration supplied for: quorum
Feb 10 13:16:09 corosync [pcmk ] info: get_config_opt: No default for option: provider
Feb 10 13:16:09 corosync [pcmk ] info: config_find_init: Local handle: 8972265949260414982 for service
Feb 10 13:16:09 corosync [pcmk ] info: config_find_next: Processing additional service options...
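The options echoed above imply the shape of the /etc/corosync/corosync.conf that produced them: UDPU (unicast) transport, message security enabled, two redundant rings, logging to syslog plus /var/log/cluster/corosync.log, and no quorum provider. A minimal sketch consistent with these messages follows; the bindnetaddr values and the peer's member addresses are assumptions, since the log shows only this node's own interfaces (192.168.2.2 and 192.168.1.2).

    totem {
        version: 2
        # "Initializing transmit/receive security" suggests secauth is on
        secauth: on
        # "Initializing transport (UDP/IP Unicast)" => udpu
        transport: udpu
        # two rings come up in the log, so redundant ring protocol
        rrp_mode: passive
        interface {
            ringnumber: 0
            # assumption: /24 network
            bindnetaddr: 192.168.2.0
            member {
                # assumption: peer address (nodea)
                memberaddr: 192.168.2.1
            }
            member {
                # this node (nodeb), from the log
                memberaddr: 192.168.2.2
            }
        }
        interface {
            ringnumber: 1
            # assumption: /24 network
            bindnetaddr: 192.168.1.0
            member {
                # assumption: peer address (nodea)
                memberaddr: 192.168.1.1
            }
            member {
                # this node (nodeb), from the log
                memberaddr: 192.168.1.2
            }
        }
    }
    logging {
        debug: off
        to_logfile: yes
        logfile: /var/log/cluster/corosync.log
        to_syslog: yes
    }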
Feb 10 13:16:09 corosync [pcmk ] info: get_config_opt: Found '1' for option: ver
Feb 10 13:16:09 corosync [pcmk ] info: process_ais_conf: Enabling MCP mode: Use the Pacemaker init script to complete Pacemaker startup
Feb 10 13:16:09 corosync [pcmk ] info: get_config_opt: Defaulting to 'pcmk' for option: clustername
Feb 10 13:16:09 corosync [pcmk ] info: get_config_opt: Defaulting to 'no' for option: use_logd
Feb 10 13:16:09 corosync [pcmk ] info: get_config_opt: Defaulting to 'no' for option: use_mgmtd
Feb 10 13:16:09 corosync [pcmk ] info: pcmk_startup: CRM: Initialized
Feb 10 13:16:09 corosync [pcmk ] Logging: Initialized pcmk_startup
Feb 10 13:16:09 corosync [pcmk ] info: pcmk_startup: Maximum core file size is: 18446744073709551615
Feb 10 13:16:09 corosync [pcmk ] info: pcmk_startup: Service: 10
Feb 10 13:16:09 corosync [pcmk ] info: pcmk_startup: Local hostname: nodeb
Feb 10 13:16:09 corosync [pcmk ] info: pcmk_update_nodeid: Local node id: 731255468
Feb 10 13:16:09 corosync [pcmk ] info: update_member: Creating entry for node 731255468 born on 0
Feb 10 13:16:09 corosync [pcmk ] info: update_member: 0x1e89070 Node 731255468 now known as nodeb (was: (null))
Feb 10 13:16:09 corosync [pcmk ] info: update_member: Node nodeb now has 1 quorum votes (was 0)
Feb 10 13:16:09 corosync [pcmk ] info: update_member: Node 731255468/nodeb is now: member
Feb 10 13:16:09 corosync [SERV ] Service engine loaded: Pacemaker Cluster Manager 1.1.6
Feb 10 13:16:09 corosync [SERV ] Service engine loaded: corosync extended virtual synchrony service
Feb 10 13:16:09 corosync [SERV ] Service engine loaded: corosync configuration service
Feb 10 13:16:09 corosync [SERV ] Service engine loaded: corosync cluster closed process group service v1.01
Feb 10 13:16:09 corosync [SERV ] Service engine loaded: corosync cluster config database access v1.01
Feb 10 13:16:09 corosync [SERV ] Service engine loaded: corosync profile loading service
Feb 10 13:16:09 corosync [SERV ] Service engine loaded: corosync cluster quorum service v0.1
Feb 10 13:16:09 corosync [MAIN ] Compatibility mode set to whitetank. Using V1 and V2 of the synchronization engine.
Feb 10 13:16:09 corosync [TOTEM ] The network interface [192.168.1.2] is now up.
Feb 10 13:16:09 corosync [TOTEM ] Incrementing problem counter for seqid 1 iface 192.168.1.2 to [1 of 10]
Feb 10 13:16:09 corosync [pcmk ] notice: pcmk_peer_update: Transitional membership event on ring 204: memb=0, new=0, lost=0
Feb 10 13:16:09 corosync [pcmk ] notice: pcmk_peer_update: Stable membership event on ring 204: memb=1, new=1, lost=0
Feb 10 13:16:09 corosync [pcmk ] info: pcmk_peer_update: NEW: nodeb 731255468
Feb 10 13:16:09 corosync [pcmk ] info: pcmk_peer_update: MEMB: nodeb 731255468
Feb 10 13:16:09 corosync [TOTEM ] A processor joined or left the membership and a new membership was formed.
Feb 10 13:16:09 corosync [CPG ] chosen downlist: sender r(0) ip(192.168.2.2) r(1) ip(192.168.1.2) ; members(old:0 left:0)
Feb 10 13:16:09 corosync [MAIN ] Completed service synchronization, ready to provide service.
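The "Found '1' for option: ver" and "Enabling MCP mode" messages come from the Pacemaker service stanza; with ver: 1 the plugin provides only membership and messaging, and pacemakerd must be started separately, which is exactly what happens at 13:16:18 below. A sketch of the stanza, with the assumed two-step startup:

    service {
        # Pacemaker 1.1 plugin in MCP mode ("Found '1' for option: ver")
        name: pacemaker
        ver: 1
    }

    # Assumed startup sequence on this node:
    #   service corosync start
    #   service pacemaker start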
Feb 10 13:16:12 corosync [TOTEM ] ring 1 active with no faults
Set r/w permissions for uid=498, gid=0 on /var/log/cluster/corosync.log
Feb 10 13:16:18 nodeb pacemakerd: [7489]: info: crm_log_init_worker: Changed active directory to /var/lib/heartbeat/cores/root
Feb 10 13:16:18 nodeb pacemakerd: [7489]: info: main: Starting Pacemaker 1.1.6-3.el6 (Build: a02c0f19a00c1eb2527ad38f146ebc0834814558): generated-manpages agent-manpages ascii-docs publican-docs ncurses trace-logging cman corosync-quorum corosync
Feb 10 13:16:18 nodeb pacemakerd: [7489]: info: main: Maximum core file size is: 18446744073709551615
Feb 10 13:16:18 nodeb pacemakerd: [7489]: info: update_node_processes: 0x1b87770 Node 731255468 now known as nodeb (was: (null))
Feb 10 13:16:18 nodeb pacemakerd: [7489]: info: update_node_processes: Node nodeb now has process list: 00000000000000000000000000000002 (was 00000000000000000000000000000000)
Feb 10 13:16:18 nodeb pacemakerd: [7489]: info: G_main_add_SignalHandler: Added signal handler for signal 17
Feb 10 13:16:18 nodeb pacemakerd: [7489]: info: start_child: Forked child 7493 for process stonith-ng
Feb 10 13:16:18 nodeb pacemakerd: [7489]: info: update_node_processes: Node nodeb now has process list: 00000000000000000000000000100002 (was 00000000000000000000000000000002)
Feb 10 13:16:18 nodeb pacemakerd: [7489]: info: start_child: Forked child 7494 for process cib
Feb 10 13:16:18 nodeb pacemakerd: [7489]: info: update_node_processes: Node nodeb now has process list: 00000000000000000000000000100102 (was 00000000000000000000000000100002)
Feb 10 13:16:18 nodeb pacemakerd: [7489]: info: start_child: Forked child 7495 for process lrmd
Feb 10 13:16:18 nodeb pacemakerd: [7489]: info: update_node_processes: Node nodeb now has process list: 00000000000000000000000000100112 (was 00000000000000000000000000100102)
Feb 10 13:16:18 nodeb pacemakerd: [7489]: info: start_child: Forked child 7496 for process attrd
Feb 10 13:16:18 nodeb pacemakerd: [7489]: info: update_node_processes: Node nodeb now has process list: 00000000000000000000000000101112 (was 00000000000000000000000000100112)
Feb 10 13:16:18 nodeb pacemakerd: [7489]: info: start_child: Forked child 7497 for process pengine
Feb 10 13:16:18 nodeb pacemakerd: [7489]: info: update_node_processes: Node nodeb now has process list: 00000000000000000000000000111112 (was 00000000000000000000000000101112)
Feb 10 13:16:18 nodeb pacemakerd: [7489]: info: start_child: Forked child 7498 for process crmd
Feb 10 13:16:18 nodeb pacemakerd: [7489]: info: update_node_processes: Node nodeb now has process list: 00000000000000000000000000111312 (was 00000000000000000000000000111112)
Feb 10 13:16:18 nodeb lrmd: [7495]: info: G_main_add_SignalHandler: Added signal handler for signal 15
Feb 10 13:16:18 nodeb pacemakerd: [7489]: info: main: Starting mainloop
Feb 10 13:16:18 nodeb stonith-ng: [7493]: info: Invoked: /usr/lib64/heartbeat/stonithd
Feb 10 13:16:18 nodeb stonith-ng: [7493]: info: crm_log_init_worker: Changed active directory to /var/lib/heartbeat/cores/root
Feb 10 13:16:18 nodeb stonith-ng: [7493]: info: G_main_add_SignalHandler: Added signal handler for signal 17
Feb 10 13:16:18 nodeb stonith-ng: [7493]: info: get_cluster_type: Cluster type is: 'openais'
Feb 10 13:16:18 nodeb stonith-ng: [7493]: notice: crm_cluster_connect: Connecting to cluster infrastructure: classic openais (with plugin)
Feb 10 13:16:18 nodeb stonith-ng: [7493]: info: init_ais_connection_classic: Creating connection to our Corosync plugin
Feb 10 13:16:18 nodeb cib: [7494]: info: crm_log_init_worker: Changed active directory to /var/lib/heartbeat/cores/hacluster
Feb 10 13:16:18 nodeb cib: [7494]: info: G_main_add_TriggerHandler: Added signal manual handler
Feb 10 13:16:18 nodeb cib: [7494]: info: G_main_add_SignalHandler: Added signal handler for signal 17
Feb 10 13:16:18 nodeb cib: [7494]: info: retrieveCib: Reading cluster configuration from: /var/lib/heartbeat/crm/cib.xml (digest: /var/lib/heartbeat/crm/cib.xml.sig)
Feb 10 13:16:18 nodeb lrmd: [7495]: info: G_main_add_SignalHandler: Added signal handler for signal 17
Feb 10 13:16:18 nodeb lrmd: [7495]: info: enabling coredumps
Feb 10 13:16:18 nodeb lrmd: [7495]: info: G_main_add_SignalHandler: Added signal handler for signal 10
Feb 10 13:16:18 nodeb lrmd: [7495]: info: G_main_add_SignalHandler: Added signal handler for signal 12
Feb 10 13:16:18 nodeb lrmd: [7495]: info: Started.
Feb 10 13:16:18 nodeb cib: [7494]: info: validate_with_relaxng: Creating RNG parser context
Feb 10 13:16:18 nodeb attrd: [7496]: info: Invoked: /usr/lib64/heartbeat/attrd
Feb 10 13:16:18 nodeb attrd: [7496]: notice: crm_cluster_connect: Connecting to cluster infrastructure: classic openais (with plugin)
Feb 10 13:16:18 nodeb stonith-ng: [7493]: info: init_ais_connection_classic: AIS connection established
Feb 10 13:16:18 corosync [pcmk ] info: pcmk_ipc: Recorded connection 0x1e9e8f0 for stonith-ng/0
Feb 10 13:16:18 nodeb crmd: [7498]: info: Invoked: /usr/lib64/heartbeat/crmd
Feb 10 13:16:18 nodeb stonith-ng: [7493]: info: get_ais_nodeid: Server details: id=731255468 uname=nodeb cname=pcmk
Feb 10 13:16:18 nodeb stonith-ng: [7493]: info: init_ais_connection_once: Connection to 'classic openais (with plugin)': established
Feb 10 13:16:18 nodeb stonith-ng: [7493]: info: crm_new_peer: Node nodeb now has id: 731255468
Feb 10 13:16:18 nodeb stonith-ng: [7493]: info: crm_new_peer: Node 731255468 is now known as nodeb
Feb 10 13:16:18 nodeb crmd: [7498]: info: crm_log_init_worker: Changed active directory to /var/lib/heartbeat/cores/hacluster
Feb 10 13:16:18 nodeb crmd: [7498]: info: main: CRM Hg Version: a02c0f19a00c1eb2527ad38f146ebc0834814558
Feb 10 13:16:18 nodeb stonith-ng: [7493]: info: main: Starting stonith-ng mainloop
Feb 10 13:16:18 nodeb crmd: [7498]: info: crmd_init: Starting crmd
Feb 10 13:16:18 nodeb crmd: [7498]: info: G_main_add_SignalHandler: Added signal handler for signal 17
Feb 10 13:16:18 nodeb stonith-ng: [7493]: info: crm_update_peer: Node nodeb: id=731255468 state=unknown addr=(null) votes=0 born=0 seen=0 proc=00000000000000000000000000111312 (new)
Feb 10 13:16:18 nodeb pengine: [7497]: info: Invoked: /usr/lib64/heartbeat/pengine
Feb 10 13:16:18 corosync [pcmk ] info: pcmk_ipc: Recorded connection 0x1ea2c50 for attrd/0
Feb 10 13:16:18 nodeb attrd: [7496]: notice: main: Starting mainloop...
Feb 10 13:16:18 nodeb cib: [7494]: info: startCib: CIB Initialization completed successfully
Feb 10 13:16:18 nodeb cib: [7494]: info: get_cluster_type: Cluster type is: 'openais'
Feb 10 13:16:18 nodeb cib: [7494]: notice: crm_cluster_connect: Connecting to cluster infrastructure: classic openais (with plugin)
Feb 10 13:16:18 nodeb cib: [7494]: info: init_ais_connection_classic: Creating connection to our Corosync plugin
Feb 10 13:16:18 nodeb cib: [7494]: info: init_ais_connection_classic: AIS connection established
Feb 10 13:16:18 corosync [pcmk ] info: pcmk_ipc: Recorded connection 0x1ea6fb0 for cib/0
Feb 10 13:16:18 corosync [pcmk ] info: pcmk_ipc: Sending membership update 204 to cib
Feb 10 13:16:18 nodeb cib: [7494]: info: get_ais_nodeid: Server details: id=731255468 uname=nodeb cname=pcmk
Feb 10 13:16:18 nodeb cib: [7494]: info: init_ais_connection_once: Connection to 'classic openais (with plugin)': established
Feb 10 13:16:18 nodeb cib: [7494]: info: crm_new_peer: Node nodeb now has id: 731255468
Feb 10 13:16:18 nodeb cib: [7494]: info: crm_new_peer: Node 731255468 is now known as nodeb
Feb 10 13:16:18 nodeb cib: [7494]: info: cib_init: Starting cib mainloop
Feb 10 13:16:18 nodeb cib: [7494]: info: ais_dispatch_message: Membership 204: quorum still lost
Feb 10 13:16:18 nodeb cib: [7494]: info: crm_update_peer: Node nodeb: id=731255468 state=member (new) addr=r(0) ip(192.168.2.2) r(1) ip(192.168.1.2) (new) votes=1 (new) born=0 seen=204 proc=00000000000000000000000000000000
Feb 10 13:16:18 nodeb cib: [7494]: info: crm_update_peer: Node nodeb: id=731255468 state=member addr=r(0) ip(192.168.2.2) r(1) ip(192.168.1.2) votes=1 born=0 seen=204 proc=00000000000000000000000000111312 (new)
Feb 10 13:16:19 nodeb crmd: [7498]: info: do_cib_control: CIB connection established
Feb 10 13:16:19 nodeb crmd: [7498]: info: get_cluster_type: Cluster type is: 'openais'
Feb 10 13:16:19 nodeb crmd: [7498]: notice: crm_cluster_connect: Connecting to cluster infrastructure: classic openais (with plugin)
Feb 10 13:16:19 nodeb crmd: [7498]: info: init_ais_connection_classic: Creating connection to our Corosync plugin
Feb 10 13:16:19 nodeb crmd: [7498]: info: init_ais_connection_classic: AIS connection established
Feb 10 13:16:19 corosync [pcmk ] info: pcmk_ipc: Recorded connection 0x1eabb60 for crmd/0
Feb 10 13:16:19 corosync [pcmk ] info: pcmk_ipc: Sending membership update 204 to crmd
Feb 10 13:16:19 nodeb crmd: [7498]: info: get_ais_nodeid: Server details: id=731255468 uname=nodeb cname=pcmk
Feb 10 13:16:19 nodeb crmd: [7498]: info: init_ais_connection_once: Connection to 'classic openais (with plugin)': established
Feb 10 13:16:19 nodeb crmd: [7498]: info: crm_new_peer: Node nodeb now has id: 731255468
Feb 10 13:16:19 nodeb crmd: [7498]: info: crm_new_peer: Node 731255468 is now known as nodeb
Feb 10 13:16:19 nodeb crmd: [7498]: info: ais_status_callback: status: nodeb is now unknown
Feb 10 13:16:19 nodeb crmd: [7498]: info: do_ha_control: Connected to the cluster
Feb 10 13:16:19 nodeb crmd: [7498]: info: do_started: Delaying start, no membership data (0000000000100000)
Feb 10 13:16:19 nodeb crmd: [7498]: info: crmd_init: Starting crmd's mainloop
Feb 10 13:16:19 nodeb crmd: [7498]: info: config_query_callback: Shutdown escalation occurs after: 1200000ms
Feb 10 13:16:19 nodeb crmd: [7498]: info: config_query_callback: Checking for expired actions every 900000ms
Feb 10 13:16:19 nodeb crmd: [7498]: info: config_query_callback: Sending expected-votes=2 to corosync
Feb 10 13:16:19 nodeb crmd: [7498]: info: ais_dispatch_message: Membership 204: quorum still lost
Feb 10 13:16:19 nodeb crmd: [7498]: info: ais_status_callback: status: nodeb is now member (was unknown)
Feb 10 13:16:19 nodeb crmd: [7498]: info: crm_update_peer: Node nodeb: id=731255468 state=member (new) addr=r(0) ip(192.168.2.2) r(1) ip(192.168.1.2) (new) votes=1 (new) born=0 seen=204 proc=00000000000000000000000000000000
Feb 10 13:16:19 nodeb crmd: [7498]: info: ais_dispatch_message: Membership 204: quorum still lost
Feb 10 13:16:19 nodeb crmd: [7498]: notice: crmd_peer_update: Status update: Client nodeb/crmd now has status [online] (DC=)
Feb 10 13:16:19 nodeb crmd: [7498]: info: crm_update_peer: Node nodeb: id=731255468 state=member addr=r(0) ip(192.168.2.2) r(1) ip(192.168.1.2) votes=1 born=0 seen=204 proc=00000000000000000000000000111312 (new)
Feb 10 13:16:19 nodeb crmd: [7498]: info: do_started: The local CRM is operational
Feb 10 13:16:19 nodeb crmd: [7498]: info: do_state_transition: State transition S_STARTING -> S_PENDING [ input=I_PENDING cause=C_FSA_INTERNAL origin=do_started ]
Feb 10 13:16:20 nodeb crmd: [7498]: info: te_connect_stonith: Attempting connection to fencing daemon...
Feb 10 13:16:21 nodeb crmd: [7498]: info: te_connect_stonith: Connected
Feb 10 13:16:40 nodeb crmd: [7498]: info: crm_timer_popped: Election Trigger (I_DC_TIMEOUT) just popped (20000ms)
Feb 10 13:16:40 nodeb crmd: [7498]: WARN: do_log: FSA: Input I_DC_TIMEOUT from crm_timer_popped() received in state S_PENDING
Feb 10 13:16:40 nodeb crmd: [7498]: info: do_state_transition: State transition S_PENDING -> S_ELECTION [ input=I_DC_TIMEOUT cause=C_TIMER_POPPED origin=crm_timer_popped ]
Feb 10 13:16:40 nodeb crmd: [7498]: info: do_state_transition: State transition S_ELECTION -> S_INTEGRATION [ input=I_ELECTION_DC cause=C_FSA_INTERNAL origin=do_election_check ]
Feb 10 13:16:40 nodeb crmd: [7498]: info: do_te_control: Registering TE UUID: b5f8b43d-1d0b-48d4-a661-5475c59cf127
Feb 10 13:16:40 nodeb crmd: [7498]: info: set_graph_functions: Setting custom graph functions
Feb 10 13:16:40 nodeb crmd: [7498]: info: unpack_graph: Unpacked transition -1: 0 actions in 0 synapses
Feb 10 13:16:40 nodeb crmd: [7498]: info: do_dc_takeover: Taking over DC status for this partition
Feb 10 13:16:40 nodeb cib: [7494]: info: cib_process_readwrite: We are now in R/W mode
Feb 10 13:16:40 nodeb cib: [7494]: info: cib_process_request: Operation complete: op cib_master for section 'all' (origin=local/crmd/5, version=0.57.1): ok (rc=0)
Feb 10 13:16:40 nodeb cib: [7494]: info: cib_process_request: Operation complete: op cib_modify for section cib (origin=local/crmd/6, version=0.57.2): ok (rc=0)
Feb 10 13:16:40 nodeb cib: [7494]: info: cib_process_request: Operation complete: op cib_modify for section crm_config (origin=local/crmd/8, version=0.57.3): ok (rc=0)
Feb 10 13:16:40 nodeb crmd: [7498]: info: join_make_offer: Making join offers based on membership 204
Feb 10 13:16:40 nodeb crmd: [7498]: info: do_dc_join_offer_all: join-1: Waiting on 1 outstanding join acks
Feb 10 13:16:40 nodeb cib: [7494]: info: cib_process_request: Operation complete: op cib_modify for section crm_config (origin=local/crmd/10, version=0.57.4): ok (rc=0)
Feb 10 13:16:40 nodeb crmd: [7498]: info: config_query_callback: Shutdown escalation occurs after: 1200000ms
Feb 10 13:16:40 nodeb crmd: [7498]: info: config_query_callback: Checking for expired actions every 900000ms
Feb 10 13:16:40 nodeb crmd: [7498]: info: config_query_callback: Sending expected-votes=2 to corosync
Feb 10 13:16:40 nodeb crmd: [7498]: info: ais_dispatch_message: Membership 204: quorum still lost
Feb 10 13:16:40 nodeb crmd: [7498]: info: crmd_ais_dispatch: Setting expected votes to 2
Feb 10 13:16:40 nodeb crmd: [7498]: info: ais_dispatch_message: Membership 204: quorum still lost
Feb 10 13:16:40 nodeb cib: [7494]: info: cib_process_request: Operation complete: op cib_modify for section crm_config (origin=local/crmd/13, version=0.57.5): ok (rc=0)
Feb 10 13:16:40 nodeb crmd: [7498]: info: crmd_ais_dispatch: Setting expected votes to 2
Feb 10 13:16:40 nodeb crmd: [7498]: info: update_dc: Set DC to nodeb (3.0.5)
Feb 10 13:16:40 nodeb cib: [7494]: info: cib_process_request: Operation complete: op cib_modify for section crm_config (origin=local/crmd/15, version=0.57.6): ok (rc=0)
Feb 10 13:16:40 nodeb crmd: [7498]: info: do_state_transition: State transition S_INTEGRATION -> S_FINALIZE_JOIN [ input=I_INTEGRATED cause=C_FSA_INTERNAL origin=check_join_state ]
Feb 10 13:16:40 nodeb crmd: [7498]: info: do_state_transition: All 1 cluster nodes responded to the join offer.
Feb 10 13:16:40 nodeb crmd: [7498]: info: do_dc_join_finalize: join-1: Syncing the CIB from nodeb to the rest of the cluster
Feb 10 13:16:40 nodeb cib: [7494]: info: cib_process_request: Operation complete: op cib_sync for section 'all' (origin=local/crmd/17, version=0.57.6): ok (rc=0)
Feb 10 13:16:40 nodeb cib: [7494]: info: cib_process_request: Operation complete: op cib_modify for section nodes (origin=local/crmd/18, version=0.57.7): ok (rc=0)
Feb 10 13:16:40 nodeb crmd: [7498]: info: update_attrd: Connecting to attrd...
Feb 10 13:16:40 nodeb crmd: [7498]: info: do_dc_join_ack: join-1: Updating node state to member for nodeb
Feb 10 13:16:40 nodeb cib: [7494]: info: cib_process_request: Operation complete: op cib_delete for section //node_state[@uname='nodeb']/transient_attributes (origin=local/crmd/19, version=0.57.8): ok (rc=0)
Feb 10 13:16:40 nodeb crmd: [7498]: info: erase_xpath_callback: Deletion of "//node_state[@uname='nodeb']/transient_attributes": ok (rc=0)
Feb 10 13:16:40 nodeb cib: [7494]: info: cib_process_request: Operation complete: op cib_delete for section //node_state[@uname='nodeb']/lrm (origin=local/crmd/20, version=0.57.9): ok (rc=0)
Feb 10 13:16:40 nodeb crmd: [7498]: info: erase_xpath_callback: Deletion of "//node_state[@uname='nodeb']/lrm": ok (rc=0)
Feb 10 13:16:40 nodeb crmd: [7498]: info: do_state_transition: State transition S_FINALIZE_JOIN -> S_POLICY_ENGINE [ input=I_FINALIZED cause=C_FSA_INTERNAL origin=check_join_state ]
Feb 10 13:16:40 nodeb crmd: [7498]: info: do_state_transition: All 1 cluster nodes are eligible to run resources.
Feb 10 13:16:40 nodeb crmd: [7498]: info: do_dc_join_final: Ensuring DC, quorum and node attributes are up-to-date
Feb 10 13:16:40 nodeb crmd: [7498]: info: crm_update_quorum: Updating quorum status to false (call=24)
Feb 10 13:16:40 nodeb attrd: [7496]: notice: attrd_local_callback: Sending full refresh (origin=crmd)
Feb 10 13:16:40 nodeb crmd: [7498]: info: abort_transition_graph: do_te_invoke:162 - Triggered transition abort (complete=1) : Peer Cancelled
Feb 10 13:16:40 nodeb crmd: [7498]: info: do_pe_invoke: Query 25: Requesting the current CIB: S_POLICY_ENGINE
Feb 10 13:16:40 nodeb cib: [7494]: info: cib_process_request: Operation complete: op cib_modify for section nodes (origin=local/crmd/22, version=0.57.11): ok (rc=0)
Feb 10 13:16:40 nodeb cib: [7494]: info: cib_process_request: Operation complete: op cib_modify for section cib (origin=local/crmd/24, version=0.57.13): ok (rc=0)
Feb 10 13:16:40 nodeb crmd: [7498]: info: do_pe_invoke_callback: Invoking the PE: query=25, ref=pe_calc-dc-1328897800-7, seq=204, quorate=0
Feb 10 13:16:40 nodeb pengine: [7497]: notice: unpack_config: On loss of CCM Quorum: Ignore
Feb 10 13:16:40 nodeb pengine: [7497]: notice: RecurringOp: Start recurring monitor (60s) for drbd0:0 on nodeb
Feb 10 13:16:40 nodeb pengine: [7497]: notice: RecurringOp: Start recurring monitor (60s) for drbd0:0 on nodeb
Feb 10 13:16:40 nodeb pengine: [7497]: notice: RecurringOp: Start recurring monitor (60s) for drbd1:0 on nodeb
Feb 10 13:16:40 nodeb pengine: [7497]: notice: RecurringOp: Start recurring monitor (60s) for drbd1:0 on nodeb
Feb 10 13:16:40 nodeb pengine: [7497]: notice: RecurringOp: Start recurring monitor (60s) for ipmi-fencing:0 on nodeb
Feb 10 13:16:40 nodeb pengine: [7497]: WARN: stage6: Scheduling Node nodea for STONITH
Feb 10 13:16:40 nodeb pengine: [7497]: notice: LogActions: Start drbd0:0 (nodeb)
Feb 10 13:16:40 nodeb pengine: [7497]: notice: LogActions: Leave drbd0:1 (Stopped)
Feb 10 13:16:40 nodeb pengine: [7497]: notice: LogActions: Start drbd1:0 (nodeb)
Feb 10 13:16:40 nodeb pengine: [7497]: notice: LogActions: Leave drbd1:1 (Stopped)
Feb 10 13:16:40 nodeb pengine: [7497]: notice: LogActions: Leave datafs (Stopped)
Feb 10 13:16:40 nodeb pengine: [7497]: notice: LogActions: Leave patchfs (Stopped)
Feb 10 13:16:40 nodeb pengine: [7497]: notice: LogActions: Leave ClusterIP (Stopped)
Feb 10 13:16:40 nodeb pengine: [7497]: notice: LogActions: Leave httpd (Stopped)
Feb 10 13:16:40 nodeb pengine: [7497]: notice: LogActions: Start ipmi-fencing:0 (nodeb)
Feb 10 13:16:40 nodeb pengine: [7497]: notice: LogActions: Leave ipmi-fencing:1 (Stopped)
Feb 10 13:16:40 nodeb crmd: [7498]: info: do_state_transition: State transition S_POLICY_ENGINE -> S_TRANSITION_ENGINE [ input=I_PE_SUCCESS cause=C_IPC_MESSAGE origin=handle_response ]
Feb 10 13:16:40 nodeb crmd: [7498]: info: unpack_graph: Unpacked transition 0: 35 actions in 35 synapses
Feb 10 13:16:40 nodeb crmd: [7498]: info: do_te_invoke: Processing graph 0 (ref=pe_calc-dc-1328897800-7) derived from /var/lib/pengine/pe-warn-17.bz2
Feb 10 13:16:40 nodeb crmd: [7498]: info: te_rsc_command: Initiating action 4: monitor drbd0:0_monitor_0 on nodeb (local)
Feb 10 13:16:40 nodeb pengine: [7497]: WARN: process_pe_message: Transition 0: WARNINGs found during PE processing. PEngine Input stored in: /var/lib/pengine/pe-warn-17.bz2
Feb 10 13:16:40 nodeb pengine: [7497]: notice: process_pe_message: Configuration WARNINGs found during PE processing. Please run "crm_verify -L" to identify issues.
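quorate=0 together with "On loss of CCM Quorum: Ignore" and the decision to fence nodea is consistent with cluster options along these lines (a crm shell sketch; stonith-enabled=true is also the default, and expected-quorum-votes mirrors the "Sending expected-votes=2" messages):

    # crm configure (sketch)
    property stonith-enabled="true" \
             no-quorum-policy="ignore" \
             expected-quorum-votes="2"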
Feb 10 13:16:40 nodeb crmd: [7498]: info: do_lrm_rsc_op: Performing key=4:0:7:b5f8b43d-1d0b-48d4-a661-5475c59cf127 op=drbd0:0_monitor_0 )
Feb 10 13:16:40 nodeb lrmd: [7495]: info: rsc:drbd0:0:2: probe
Feb 10 13:16:40 nodeb crmd: [7498]: info: te_pseudo_action: Pseudo action 15 fired and confirmed
Feb 10 13:16:40 nodeb crmd: [7498]: info: te_rsc_command: Initiating action 5: monitor drbd1:0_monitor_0 on nodeb (local)
Feb 10 13:16:40 nodeb crmd: [7498]: info: do_lrm_rsc_op: Performing key=5:0:7:b5f8b43d-1d0b-48d4-a661-5475c59cf127 op=drbd1:0_monitor_0 )
Feb 10 13:16:40 nodeb lrmd: [7495]: info: rsc:drbd1:0:3: probe
Feb 10 13:16:40 nodeb crmd: [7498]: info: te_pseudo_action: Pseudo action 41 fired and confirmed
Feb 10 13:16:40 nodeb crmd: [7498]: info: te_rsc_command: Initiating action 6: monitor datafs_monitor_0 on nodeb (local)
Feb 10 13:16:40 nodeb crmd: [7498]: info: do_lrm_rsc_op: Performing key=6:0:7:b5f8b43d-1d0b-48d4-a661-5475c59cf127 op=datafs_monitor_0 )
Feb 10 13:16:40 nodeb lrmd: [7495]: info: rsc:datafs:4: probe
Feb 10 13:16:40 nodeb crmd: [7498]: info: te_rsc_command: Initiating action 7: monitor patchfs_monitor_0 on nodeb (local)
Feb 10 13:16:40 nodeb crmd: [7498]: info: do_lrm_rsc_op: Performing key=7:0:7:b5f8b43d-1d0b-48d4-a661-5475c59cf127 op=patchfs_monitor_0 )
Feb 10 13:16:40 nodeb lrmd: [7495]: info: rsc:patchfs:5: probe
Feb 10 13:16:40 nodeb crmd: [7498]: info: te_rsc_command: Initiating action 8: monitor ClusterIP_monitor_0 on nodeb (local)
Feb 10 13:16:40 nodeb crmd: [7498]: info: do_lrm_rsc_op: Performing key=8:0:7:b5f8b43d-1d0b-48d4-a661-5475c59cf127 op=ClusterIP_monitor_0 )
Feb 10 13:16:40 nodeb crmd: [7498]: info: te_rsc_command: Initiating action 9: monitor httpd_monitor_0 on nodeb (local)
Feb 10 13:16:40 nodeb crmd: [7498]: info: do_lrm_rsc_op: Performing key=9:0:7:b5f8b43d-1d0b-48d4-a661-5475c59cf127 op=httpd_monitor_0 )
Feb 10 13:16:40 nodeb crmd: [7498]: info: te_rsc_command: Initiating action 10: monitor ipmi-fencing:0_monitor_0 on nodeb (local)
Feb 10 13:16:40 nodeb lrmd: [7495]: notice: lrmd_rsc_new(): No lrm_rprovider field in message
Feb 10 13:16:40 nodeb crmd: [7498]: info: do_lrm_rsc_op: Performing key=10:0:7:b5f8b43d-1d0b-48d4-a661-5475c59cf127 op=ipmi-fencing:0_monitor_0 )
Feb 10 13:16:40 nodeb crmd: [7498]: info: te_pseudo_action: Pseudo action 69 fired and confirmed
Feb 10 13:16:40 nodeb crmd: [7498]: info: te_pseudo_action: Pseudo action 16 fired and confirmed
Feb 10 13:16:40 nodeb crmd: [7498]: info: te_pseudo_action: Pseudo action 13 fired and confirmed
Feb 10 13:16:40 nodeb crmd: [7498]: info: te_pseudo_action: Pseudo action 42 fired and confirmed
Feb 10 13:16:40 nodeb crmd: [7498]: info: te_pseudo_action: Pseudo action 39 fired and confirmed
Feb 10 13:16:40 nodeb attrd: [7496]: notice: attrd_trigger_update: Sending flush op to all hosts for: master-drbd1:0 (5)
Feb 10 13:16:40 nodeb attrd: [7496]: notice: attrd_perform_update: Sent update 4: master-drbd1:0=5
Feb 10 13:16:40 nodeb attrd: [7496]: notice: attrd_trigger_update: Sending flush op to all hosts for: master-drbd0:0 (5)
Feb 10 13:16:40 nodeb crmd: [7498]: info: process_lrm_event: LRM operation drbd1:0_monitor_0 (call=3, rc=7, cib-update=26, confirmed=true) not running
Feb 10 13:16:40 nodeb crmd: [7498]: info: process_lrm_event: LRM operation drbd0:0_monitor_0 (call=2, rc=7, cib-update=27, confirmed=true) not running
Feb 10 13:16:40 nodeb crmd: [7498]: info: abort_transition_graph: te_update_diff:176 - Triggered transition abort (complete=0, tag=nvpair, id=status-nodeb-master-drbd1.0, name=master-drbd1:0, value=5, magic=NA, cib=0.57.14) : Transient attribute: update
Feb 10 13:16:40 nodeb crmd: [7498]: info: update_abort_priority: Abort priority upgraded from 0 to 1000000
Feb 10 13:16:40 nodeb crmd: [7498]: info: update_abort_priority: Abort action done superceeded by restart
Feb 10 13:16:40 nodeb crmd: [7498]: info: match_graph_event: Action drbd1:0_monitor_0 (5) confirmed on nodeb (rc=0)
Feb 10 13:16:40 nodeb crmd: [7498]: info: match_graph_event: Action drbd0:0_monitor_0 (4) confirmed on nodeb (rc=0)
Feb 10 13:16:40 nodeb attrd: [7496]: notice: attrd_perform_update: Sent update 7: master-drbd0:0=5
Feb 10 13:16:40 nodeb crmd: [7498]: info: process_lrm_event: LRM operation patchfs_monitor_0 (call=5, rc=7, cib-update=28, confirmed=true) not running
Feb 10 13:16:40 nodeb crmd: [7498]: info: process_lrm_event: LRM operation datafs_monitor_0 (call=4, rc=7, cib-update=29, confirmed=true) not running
Feb 10 13:16:40 nodeb crmd: [7498]: info: abort_transition_graph: te_update_diff:176 - Triggered transition abort (complete=0, tag=nvpair, id=status-nodeb-master-drbd0.0, name=master-drbd0:0, value=5, magic=NA, cib=0.57.17) : Transient attribute: update
Feb 10 13:16:40 nodeb crmd: [7498]: info: match_graph_event: Action patchfs_monitor_0 (7) confirmed on nodeb (rc=0)
Feb 10 13:16:40 nodeb crmd: [7498]: info: match_graph_event: Action datafs_monitor_0 (6) confirmed on nodeb (rc=0)
Feb 10 13:16:41 nodeb lrmd: [7495]: info: rsc:ClusterIP:6: probe
Feb 10 13:16:41 nodeb lrmd: [7495]: info: rsc:httpd:7: probe
Feb 10 13:16:41 nodeb lrmd: [7495]: info: rsc:ipmi-fencing:0:8: probe
Feb 10 13:16:41 nodeb stonith-ng: [7493]: notice: stonith_device_action: Device ipmi-fencing:0 not found
Feb 10 13:16:41 nodeb stonith-ng: [7493]: info: stonith_command: Processed st_execute from lrmd: rc=-12
Feb 10 13:16:41 nodeb crmd: [7498]: info: process_lrm_event: LRM operation ipmi-fencing:0_monitor_0 (call=8, rc=7, cib-update=30, confirmed=true) not running
Feb 10 13:16:41 nodeb crmd: [7498]: info: match_graph_event: Action ipmi-fencing:0_monitor_0 (10) confirmed on nodeb (rc=0)
Feb 10 13:16:41 nodeb crmd: [7498]: info: process_lrm_event: LRM operation ClusterIP_monitor_0 (call=6, rc=7, cib-update=31, confirmed=true) not running
Feb 10 13:16:41 nodeb crmd: [7498]: info: match_graph_event: Action ClusterIP_monitor_0 (8) confirmed on nodeb (rc=0)
Feb 10 13:16:41 nodeb crmd: [7498]: info: process_lrm_event: LRM operation httpd_monitor_0 (call=7, rc=7, cib-update=32, confirmed=true) not running
Feb 10 13:16:41 nodeb crmd: [7498]: info: match_graph_event: Action httpd_monitor_0 (9) confirmed on nodeb (rc=0)
Feb 10 13:16:41 nodeb crmd: [7498]: info: te_rsc_command: Initiating action 3: probe_complete probe_complete on nodeb (local) - no waiting
Feb 10 13:16:41 nodeb crmd: [7498]: info: run_graph: ====================================================
Feb 10 13:16:41 nodeb attrd: [7496]: notice: attrd_trigger_update: Sending flush op to all hosts for: probe_complete (true)
Feb 10 13:16:41 nodeb crmd: [7498]: notice: run_graph: Transition 0 (Complete=15, Pending=0, Fired=0, Skipped=11, Incomplete=9, Source=/var/lib/pengine/pe-warn-17.bz2): Stopped
Feb 10 13:16:41 nodeb crmd: [7498]: info: te_graph_trigger: Transition 0 is now complete
Feb 10 13:16:41 nodeb crmd: [7498]: info: do_state_transition: State transition S_TRANSITION_ENGINE -> S_POLICY_ENGINE [ input=I_PE_CALC cause=C_FSA_INTERNAL origin=notify_crmd ]
Feb 10 13:16:41 nodeb crmd: [7498]: info: do_state_transition: All 1 cluster nodes are eligible to run resources.
Feb 10 13:16:41 nodeb crmd: [7498]: info: do_pe_invoke: Query 33: Requesting the current CIB: S_POLICY_ENGINE
Feb 10 13:16:41 nodeb attrd: [7496]: notice: attrd_perform_update: Sent update 10: probe_complete=true
Feb 10 13:16:41 nodeb crmd: [7498]: info: do_pe_invoke_callback: Invoking the PE: query=33, ref=pe_calc-dc-1328897801-16, seq=204, quorate=0
Feb 10 13:16:41 nodeb pengine: [7497]: notice: unpack_config: On loss of CCM Quorum: Ignore
Feb 10 13:16:41 nodeb pengine: [7497]: notice: RecurringOp: Start recurring monitor (60s) for drbd0:0 on nodeb
Feb 10 13:16:41 nodeb pengine: [7497]: notice: RecurringOp: Start recurring monitor (60s) for drbd0:0 on nodeb
Feb 10 13:16:41 nodeb pengine: [7497]: notice: RecurringOp: Start recurring monitor (60s) for drbd1:0 on nodeb
Feb 10 13:16:41 nodeb pengine: [7497]: notice: RecurringOp: Start recurring monitor (60s) for drbd1:0 on nodeb
Feb 10 13:16:41 nodeb pengine: [7497]: notice: RecurringOp: Start recurring monitor (60s) for ipmi-fencing:0 on nodeb
Feb 10 13:16:41 nodeb pengine: [7497]: WARN: stage6: Scheduling Node nodea for STONITH
Feb 10 13:16:41 nodeb pengine: [7497]: notice: LogActions: Start drbd0:0 (nodeb)
Feb 10 13:16:41 nodeb pengine: [7497]: notice: LogActions: Leave drbd0:1 (Stopped)
Feb 10 13:16:41 nodeb pengine: [7497]: notice: LogActions: Start drbd1:0 (nodeb)
Feb 10 13:16:41 nodeb pengine: [7497]: notice: LogActions: Leave drbd1:1 (Stopped)
Feb 10 13:16:41 nodeb pengine: [7497]: notice: LogActions: Leave datafs (Stopped)
Feb 10 13:16:41 nodeb pengine: [7497]: notice: LogActions: Leave patchfs (Stopped)
Feb 10 13:16:41 nodeb pengine: [7497]: notice: LogActions: Leave ClusterIP (Stopped)
Feb 10 13:16:41 nodeb pengine: [7497]: notice: LogActions: Leave httpd (Stopped)
Feb 10 13:16:41 nodeb pengine: [7497]: notice: LogActions: Start ipmi-fencing:0 (nodeb)
Feb 10 13:16:41 nodeb pengine: [7497]: notice: LogActions: Leave ipmi-fencing:1 (Stopped)
Feb 10 13:16:41 nodeb crmd: [7498]: info: do_state_transition: State transition S_POLICY_ENGINE -> S_TRANSITION_ENGINE [ input=I_PE_SUCCESS cause=C_IPC_MESSAGE origin=handle_response ]
Feb 10 13:16:41 nodeb crmd: [7498]: info: unpack_graph: Unpacked transition 1: 27 actions in 27 synapses
Feb 10 13:16:41 nodeb crmd: [7498]: info: do_te_invoke: Processing graph 1 (ref=pe_calc-dc-1328897801-16) derived from /var/lib/pengine/pe-warn-18.bz2
Feb 10 13:16:41 nodeb crmd: [7498]: info: te_pseudo_action: Pseudo action 8 fired and confirmed
Feb 10 13:16:41 nodeb crmd: [7498]: info: te_pseudo_action: Pseudo action 34 fired and confirmed
Feb 10 13:16:41 nodeb crmd: [7498]: info: te_pseudo_action: Pseudo action 62 fired and confirmed
Feb 10 13:16:41 nodeb crmd: [7498]: info: te_rsc_command: Initiating action 3: probe_complete probe_complete on nodeb (local) - no waiting
Feb 10 13:16:41 nodeb crmd: [7498]: info: te_pseudo_action: Pseudo action 9 fired and confirmed
Feb 10 13:16:41 nodeb crmd: [7498]: info: te_pseudo_action: Pseudo action 6 fired and confirmed
Feb 10 13:16:41 nodeb crmd: [7498]: info: te_pseudo_action: Pseudo action 35 fired and confirmed
Feb 10 13:16:41 nodeb crmd: [7498]: info: te_pseudo_action: Pseudo action 32 fired and confirmed
Feb 10 13:16:41 nodeb crmd: [7498]: info: te_rsc_command: Initiating action 60: start ipmi-fencing:0_start_0 on nodeb (local)
Feb 10 13:16:41 nodeb crmd: [7498]: info: do_lrm_rsc_op: Performing key=60:1:0:b5f8b43d-1d0b-48d4-a661-5475c59cf127 op=ipmi-fencing:0_start_0 )
Feb 10 13:16:41 nodeb lrmd: [7495]: info: rsc:ipmi-fencing:0:9: start
Feb 10 13:16:41 nodeb pengine: [7497]: WARN: process_pe_message: Transition 1: WARNINGs found during PE processing. PEngine Input stored in: /var/lib/pengine/pe-warn-18.bz2
Feb 10 13:16:41 nodeb pengine: [7497]: notice: process_pe_message: Configuration WARNINGs found during PE processing. Please run "crm_verify -L" to identify issues.
Feb 10 13:16:41 nodeb stonith-ng: [7493]: info: stonith_device_register: Added 'ipmi-fencing:0' to the device list (1 active devices)
Feb 10 13:16:41 nodeb stonith-ng: [7493]: info: stonith_command: Processed st_device_register from lrmd: rc=0
Feb 10 13:16:41 nodeb lrmd: [7495]: info: stonith_api_device_metadata: looking up fence_ipmilan/redhat metadata
Feb 10 13:16:41 nodeb crmd: [7498]: info: process_lrm_event: LRM operation ipmi-fencing:0_start_0 (call=9, rc=0, cib-update=34, confirmed=true) ok
Feb 10 13:16:41 nodeb crmd: [7498]: info: match_graph_event: Action ipmi-fencing:0_start_0 (60) confirmed on nodeb (rc=0)
Feb 10 13:16:41 nodeb crmd: [7498]: info: te_rsc_command: Initiating action 61: monitor ipmi-fencing:0_monitor_60000 on nodeb (local)
Feb 10 13:16:41 nodeb crmd: [7498]: info: do_lrm_rsc_op: Performing key=61:1:0:b5f8b43d-1d0b-48d4-a661-5475c59cf127 op=ipmi-fencing:0_monitor_60000 )
Feb 10 13:16:41 nodeb lrmd: [7495]: info: rsc:ipmi-fencing:0:10: monitor
Feb 10 13:16:41 nodeb crmd: [7498]: info: te_pseudo_action: Pseudo action 63 fired and confirmed
Feb 10 13:16:41 nodeb crmd: [7498]: info: te_pseudo_action: Pseudo action 66 fired and confirmed
Feb 10 13:16:41 nodeb crmd: [7498]: info: te_fence_node: Executing reboot fencing operation (68) on nodea (timeout=60000)
Feb 10 13:16:41 nodeb stonith-ng: [7493]: info: initiate_remote_stonith_op: Initiating remote operation reboot for nodea: 6fdca2ba-e390-4663-a853-48cb801403bd
Feb 10 13:16:41 nodeb stonith-ng: [7493]: info: stonith_command: Processed st_execute from lrmd: rc=-1
Feb 10 13:16:41 nodeb stonith-ng: [7493]: info: can_fence_host_with_device: ipmi-fencing:0 can fence nodea: static-list
Feb 10 13:16:41 nodeb stonith-ng: [7493]: info: call_remote_stonith: Requesting that nodeb perform op reboot nodea
Feb 10 13:16:41 nodeb stonith-ng: [7493]: info: stonith_fence: Exec
Feb 10 13:16:41 nodeb stonith-ng: [7493]: info: can_fence_host_with_device: ipmi-fencing:0 can fence nodea: static-list
Feb 10 13:16:41 nodeb stonith-ng: [7493]: info: stonith_fence: Found 1 matching devices for 'nodea'
Feb 10 13:16:41 nodeb stonith-ng: [7493]: info: stonith_command: Processed st_fence from nodeb: rc=-1
Feb 10 13:16:41 nodeb stonith-ng: [7493]: info: log_operation: ipmi-fencing:0: Getting status of IPMI:192.168.3.2...Done
Feb 10 13:16:41 nodeb stonith-ng: [7493]: info: make_args: reboot-ing node 'nodea' as 'port=nodea'
Feb 10 13:16:41 nodeb crmd: [7498]: info: process_lrm_event: LRM operation ipmi-fencing:0_monitor_60000 (call=10, rc=0, cib-update=35, confirmed=false) ok
Feb 10 13:16:41 nodeb crmd: [7498]: info: match_graph_event: Action ipmi-fencing:0_monitor_60000 (61) confirmed on nodeb (rc=0)
Feb 10 13:16:50 nodeb stonith-ng: [7493]: info: log_operation: Operation 'reboot' [7716] (call 0 from (null)) for host 'nodea' with device 'ipmi-fencing:0' returned: 0
Feb 10 13:16:50 nodeb stonith-ng: [7493]: info: log_operation: ipmi-fencing:0: Rebooting machine @ IPMI:192.168.3.2...Done
Feb 10 13:16:50 nodeb stonith-ng: [7493]: info: stonith_device_execute: Nothing to do for ipmi-fencing:0
Feb 10 13:16:50 nodeb stonith-ng: [7493]: info: process_remote_stonith_exec: ExecResult
Feb 10 13:16:50 nodeb stonith-ng: [7493]: info: remote_op_done: Notifing clients of 6fdca2ba-e390-4663-a853-48cb801403bd (reboot of nodea from f9d41859-945e-4f51-bcb0-284e8235d6b7 by nodeb): 2, rc=0
Feb 10 13:16:50 nodeb stonith-ng: [7493]: info: stonith_notify_client: Sending st_fence-notification to client 7498/ad40f126-1c30-4313-9c50-1efabc495657
Feb 10 13:16:50 nodeb crmd: [7498]: info: tengine_stonith_callback: StonithOp
Feb 10 13:16:50 nodeb crmd: [7498]: info: tengine_stonith_callback: Stonith operation 2/68:1:0:b5f8b43d-1d0b-48d4-a661-5475c59cf127: OK (0)
Feb 10 13:16:50 nodeb crmd: [7498]: info: tengine_stonith_callback: Stonith of nodea passed
Feb 10 13:16:50 nodeb crmd: [7498]: info: send_stonith_update: Sending fencing update 36 for nodea
Feb 10 13:16:50 nodeb crmd: [7498]: info: crm_new_peer: Node 0 is now known as nodea
Feb 10 13:16:50 nodeb crmd: [7498]: info: ais_status_callback: status: nodea is now unknown
Feb 10 13:16:50 nodeb crmd: [7498]: info: ais_status_callback: status: nodea is now lost (was unknown)
Feb 10 13:16:50 nodeb crmd: [7498]: info: crm_update_peer: Node nodea: id=0 state=lost (new) addr=(null) votes=-1 born=0 seen=0 proc=00000000000000000000000000000001
Feb 10 13:16:50 nodeb crmd: [7498]: info: tengine_stonith_notify: Peer nodea was terminated (reboot) by nodeb for nodeb (ref=6fdca2ba-e390-4663-a853-48cb801403bd): OK
Feb 10 13:16:50 nodeb crmd: [7498]: info: te_pseudo_action: Pseudo action 67 fired and confirmed
Feb 10 13:16:50 nodeb crmd: [7498]: info: te_pseudo_action: Pseudo action 1 fired and confirmed
Feb 10 13:16:50 nodeb crmd: [7498]: info: te_rsc_command: Initiating action 4: start drbd0:0_start_0 on nodeb (local)
Feb 10 13:16:50 nodeb crmd: [7498]: info: do_lrm_rsc_op: Performing key=4:1:0:b5f8b43d-1d0b-48d4-a661-5475c59cf127 op=drbd0:0_start_0 )
Feb 10 13:16:50 nodeb lrmd: [7495]: info: rsc:drbd0:0:11: start
Feb 10 13:16:50 nodeb crmd: [7498]: info: te_rsc_command: Initiating action 30: start drbd1:0_start_0 on nodeb (local)
Feb 10 13:16:50 nodeb crmd: [7498]: info: do_lrm_rsc_op: Performing key=30:1:0:b5f8b43d-1d0b-48d4-a661-5475c59cf127 op=drbd1:0_start_0 )
Feb 10 13:16:50 nodeb lrmd: [7495]: info: rsc:drbd1:0:12: start
Feb 10 13:16:50 nodeb cib: [7494]: info: cib_process_request: Operation complete: op cib_delete for section //node_state[@uname='nodea']/lrm (origin=local/crmd/37, version=0.57.27): ok (rc=0)
Feb 10 13:16:50 nodeb crmd: [7498]: info: cib_fencing_updated: Fencing update 36 for nodea: complete
Feb 10 13:16:50 nodeb crmd: [7498]: info: erase_xpath_callback: Deletion of "//node_state[@uname='nodea']/lrm": ok (rc=0)
Feb 10 13:16:50 nodeb cib: [7494]: info: cib_process_request: Operation complete: op cib_delete for section //node_state[@uname='nodea']/transient_attributes (origin=local/crmd/38, version=0.57.28): ok (rc=0)
Feb 10 13:16:50 nodeb crmd: [7498]: info: erase_xpath_callback: Deletion of "//node_state[@uname='nodea']/transient_attributes": ok (rc=0)
Feb 10 13:16:50 nodeb lrmd: [7495]: info: RA output: (drbd0:0:start:stdout)
Feb 10 13:16:50 nodeb lrmd: [7495]: info: RA output: (drbd1:0:start:stdout)
Feb 10 13:16:50 nodeb lrmd: [7495]: info: RA output: (drbd0:0:start:stdout)
Feb 10 13:16:50 nodeb lrmd: [7495]: info: RA output: (drbd1:0:start:stdout)
Feb 10 13:16:50 nodeb lrmd: [7495]: info: RA output: (drbd0:0:start:stdout)
Feb 10 13:16:50 nodeb lrmd: [7495]: info: RA output: (drbd1:0:start:stdout)
Feb 10 13:16:50 nodeb lrmd: [7495]: info: RA output: (drbd0:0:start:stdout)
Feb 10 13:16:50 nodeb lrmd: [7495]: info: RA output: (drbd1:0:start:stdout)
Feb 10 13:16:50 nodeb lrmd: [7495]: info: RA output: (drbd0:0:start:stdout)
Feb 10 13:16:50 nodeb lrmd: [7495]: info: RA output: (drbd1:0:start:stdout)
Feb 10 13:16:50 nodeb lrmd: [7495]: info: RA output: (drbd0:0:start:stdout)
Feb 10 13:16:50 nodeb lrmd: [7495]: info: RA output: (drbd1:0:start:stdout)
Feb 10 13:16:50 nodeb crmd: [7498]: info: process_lrm_event: LRM operation drbd0:0_start_0 (call=11, rc=0, cib-update=39, confirmed=true) ok
Feb 10 13:16:50 nodeb crmd: [7498]: info: process_lrm_event: LRM operation drbd1:0_start_0 (call=12, rc=0, cib-update=40, confirmed=true) ok
Feb 10 13:16:50 nodeb crmd: [7498]: info: match_graph_event: Action drbd0:0_start_0 (4) confirmed on nodeb (rc=0)
Feb 10 13:16:50 nodeb crmd: [7498]: info: te_pseudo_action: Pseudo action 7 fired and confirmed
Feb 10 13:16:50 nodeb crmd: [7498]: info: te_pseudo_action: Pseudo action 10 fired and confirmed
Feb 10 13:16:50 nodeb crmd: [7498]: info: te_rsc_command: Initiating action 84: notify drbd0:0_post_notify_start_0 on nodeb (local)
Feb 10 13:16:50 nodeb crmd: [7498]: info: do_lrm_rsc_op: Performing key=84:1:0:b5f8b43d-1d0b-48d4-a661-5475c59cf127 op=drbd0:0_notify_0 )
Feb 10 13:16:50 nodeb lrmd: [7495]: info: rsc:drbd0:0:13: notify
Feb 10 13:16:50 nodeb crmd: [7498]: info: match_graph_event: Action drbd1:0_start_0 (30) confirmed on nodeb (rc=0)
Feb 10 13:16:50 nodeb crmd: [7498]: info: te_pseudo_action: Pseudo action 33 fired and confirmed
Feb 10 13:16:50 nodeb crmd: [7498]: info: te_pseudo_action: Pseudo action 36 fired and confirmed
Feb 10 13:16:50 nodeb crmd: [7498]: info: te_rsc_command: Initiating action 85: notify drbd1:0_post_notify_start_0 on nodeb (local)
Feb 10 13:16:50 nodeb crmd: [7498]: info: do_lrm_rsc_op: Performing key=85:1:0:b5f8b43d-1d0b-48d4-a661-5475c59cf127 op=drbd1:0_notify_0 )
Feb 10 13:16:50 nodeb lrmd: [7495]: info: rsc:drbd1:0:14: notify
Feb 10 13:16:51 nodeb lrmd: [7495]: info: RA output: (drbd0:0:notify:stdout)
Feb 10 13:16:51 nodeb crmd: [7498]: info: send_direct_ack: ACK'ing resource op drbd0:0_notify_0 from 84:1:0:b5f8b43d-1d0b-48d4-a661-5475c59cf127: lrm_invoke-lrmd-1328897811-24
Feb 10 13:16:51 nodeb crmd: [7498]: info: process_te_message: Processing (N)ACK lrm_invoke-lrmd-1328897811-24 from nodeb
Feb 10 13:16:51 nodeb crmd: [7498]: info: match_graph_event: Action drbd0:0_notify_0 (84) confirmed on nodeb (rc=0)
Feb 10 13:16:51 nodeb crmd: [7498]: info: process_lrm_event: LRM operation drbd0:0_notify_0 (call=13, rc=0, cib-update=0, confirmed=true) ok
Feb 10 13:16:51 nodeb crmd: [7498]: info: te_pseudo_action: Pseudo action 11 fired and confirmed
Feb 10 13:16:51 nodeb crmd: [7498]: info: te_rsc_command: Initiating action 5: monitor drbd0:0_monitor_60000 on nodeb (local)
Feb 10 13:16:51 nodeb crmd: [7498]: info: do_lrm_rsc_op: Performing key=5:1:0:b5f8b43d-1d0b-48d4-a661-5475c59cf127 op=drbd0:0_monitor_60000 )
Feb 10 13:16:51 nodeb lrmd: [7495]: info: rsc:drbd0:0:15: monitor
Feb 10 13:16:51 nodeb lrmd: [7495]: info: RA output: (drbd1:0:notify:stdout)
Feb 10 13:16:51 nodeb crmd: [7498]: info: send_direct_ack: ACK'ing resource op drbd1:0_notify_0 from 85:1:0:b5f8b43d-1d0b-48d4-a661-5475c59cf127: lrm_invoke-lrmd-1328897811-26
Feb 10 13:16:51 nodeb crmd: [7498]: info: process_te_message: Processing (N)ACK lrm_invoke-lrmd-1328897811-26 from nodeb
Feb 10 13:16:51 nodeb crmd: [7498]: info: match_graph_event: Action drbd1:0_notify_0 (85) confirmed on nodeb (rc=0)
Feb 10 13:16:51 nodeb crmd: [7498]: info: process_lrm_event: LRM operation drbd1:0_notify_0 (call=14, rc=0, cib-update=0, confirmed=true) ok
Feb 10 13:16:51 nodeb crmd: [7498]: info: te_pseudo_action: Pseudo action 37 fired and confirmed
Feb 10 13:16:51 nodeb crmd: [7498]: info: te_rsc_command: Initiating action 31: monitor drbd1:0_monitor_60000 on nodeb (local)
Feb 10 13:16:51 nodeb crmd: [7498]: info: do_lrm_rsc_op: Performing key=31:1:0:b5f8b43d-1d0b-48d4-a661-5475c59cf127 op=drbd1:0_monitor_60000 )
Feb 10 13:16:51 nodeb lrmd: [7495]: info: rsc:drbd1:0:16: monitor
Feb 10 13:16:51 nodeb crmd: [7498]: info: process_lrm_event: LRM operation drbd0:0_monitor_60000 (call=15, rc=0, cib-update=41, confirmed=false) ok
Feb 10 13:16:51 nodeb crmd: [7498]: info: match_graph_event: Action drbd0:0_monitor_60000 (5) confirmed on nodeb (rc=0)
Feb 10 13:16:51 nodeb crmd: [7498]: info: process_lrm_event: LRM operation drbd1:0_monitor_60000 (call=16, rc=0, cib-update=42, confirmed=false) ok
Feb 10 13:16:51 nodeb crmd: [7498]: info: match_graph_event: Action drbd1:0_monitor_60000 (31) confirmed on nodeb (rc=0)
Feb 10 13:16:51 nodeb crmd: [7498]: info: run_graph: ====================================================
Feb 10 13:16:51 nodeb crmd: [7498]: notice: run_graph: Transition 1 (Complete=27, Pending=0, Fired=0, Skipped=0, Incomplete=0, Source=/var/lib/pengine/pe-warn-18.bz2): Complete
Feb 10 13:16:51 nodeb crmd: [7498]: info: te_graph_trigger: Transition 1 is now complete
Feb 10 13:16:51 nodeb crmd: [7498]: info: notify_crmd: Transition 1 status: done -
Feb 10 13:16:51 nodeb crmd: [7498]: info: do_state_transition: State transition S_TRANSITION_ENGINE -> S_IDLE [ input=I_TE_SUCCESS cause=C_FSA_INTERNAL origin=notify_crmd ]
Feb 10 13:16:51 nodeb crmd: [7498]: info: do_state_transition: Starting PEngine Recheck Timer
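The fencing sequence above (the fence_ipmilan metadata lookup, "can fence nodea: static-list", make_args passing 'port=nodea', the IPMI interface at 192.168.3.2, the 60-second recurring monitor, and the clone instances ipmi-fencing:0/:1) is consistent with a stonith resource configured roughly as follows. This is a sketch only: the credentials and the clone name are hypothetical placeholders, and the log does not show whether one shared device or one device per node was used.

    # crm configure (sketch; login/passwd and the clone name are placeholders)
    primitive ipmi-fencing stonith:fence_ipmilan \
            params ipaddr="192.168.3.2" login="admin" passwd="secret" \
                   pcmk_host_list="nodea nodeb" \
            op monitor interval="60s"
    clone fencing ipmi-fencing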
Feb 10 13:17:41 nodeb stonith-ng: [7493]: info: stonith_command: Processed st_execute from lrmd: rc=-1
Feb 10 13:17:41 nodeb stonith-ng: [7493]: info: log_operation: ipmi-fencing:0: Getting status of IPMI:192.168.3.2...Done
Feb 10 13:17:41 nodeb stonith-ng: [7493]: info: stonith_device_execute: Nothing to do for ipmi-fencing:0
Feb 10 13:18:41 nodeb stonith-ng: [7493]: info: stonith_command: Processed st_execute from lrmd: rc=-1
Feb 10 13:18:42 nodeb stonith-ng: [7493]: info: log_operation: ipmi-fencing:0: Getting status of IPMI:192.168.3.2...Done
Feb 10 13:18:42 nodeb stonith-ng: [7493]: info: stonith_device_execute: Nothing to do for ipmi-fencing:0
Feb 10 13:19:42 nodeb stonith-ng: [7493]: info: stonith_command: Processed st_execute from lrmd: rc=-1
Feb 10 13:19:42 nodeb stonith-ng: [7493]: info: log_operation: ipmi-fencing:0: Getting status of IPMI:192.168.3.2...Done
Feb 10 13:19:42 nodeb stonith-ng: [7493]: info: stonith_device_execute: Nothing to do for ipmi-fencing:0
Feb 10 13:20:42 nodeb stonith-ng: [7493]: info: stonith_command: Processed st_execute from lrmd: rc=-1
Feb 10 13:20:42 nodeb stonith-ng: [7493]: info: log_operation: ipmi-fencing:0: Getting status of IPMI:192.168.3.2...Done
Feb 10 13:20:42 nodeb stonith-ng: [7493]: info: stonith_device_execute: Nothing to do for ipmi-fencing:0
Feb 10 13:21:42 nodeb stonith-ng: [7493]: info: stonith_command: Processed st_execute from lrmd: rc=-1
Feb 10 13:21:42 nodeb stonith-ng: [7493]: info: log_operation: ipmi-fencing:0: Getting status of IPMI:192.168.3.2...Done
Feb 10 13:21:42 nodeb stonith-ng: [7493]: info: stonith_device_execute: Nothing to do for ipmi-fencing:0
Feb 10 13:22:42 nodeb stonith-ng: [7493]: info: stonith_command: Processed st_execute from lrmd: rc=-1
Feb 10 13:22:42 nodeb stonith-ng: [7493]: info: log_operation: ipmi-fencing:0: Getting status of IPMI:192.168.3.2...Done
Feb 10 13:22:42 nodeb stonith-ng: [7493]: info: stonith_device_execute: Nothing to do for ipmi-fencing:0
Feb 10 13:23:42 nodeb stonith-ng: [7493]: info: stonith_command: Processed st_execute from lrmd: rc=-1
Feb 10 13:23:42 nodeb stonith-ng: [7493]: info: log_operation: ipmi-fencing:0: Getting status of IPMI:192.168.3.2...Done
Feb 10 13:23:42 nodeb stonith-ng: [7493]: info: stonith_device_execute: Nothing to do for ipmi-fencing:0
Feb 10 13:24:42 nodeb stonith-ng: [7493]: info: stonith_command: Processed st_execute from lrmd: rc=-1
Feb 10 13:24:43 nodeb stonith-ng: [7493]: info: log_operation: ipmi-fencing:0: Getting status of IPMI:192.168.3.2...Done
Feb 10 13:24:43 nodeb stonith-ng: [7493]: info: stonith_device_execute: Nothing to do for ipmi-fencing:0
Feb 10 13:25:43 nodeb stonith-ng: [7493]: info: stonith_command: Processed st_execute from lrmd: rc=-1
Feb 10 13:25:43 nodeb stonith-ng: [7493]: info: log_operation: ipmi-fencing:0: Getting status of IPMI:192.168.3.2...Done
Feb 10 13:25:43 nodeb stonith-ng: [7493]: info: stonith_device_execute: Nothing to do for ipmi-fencing:0
Feb 10 13:26:18 nodeb cib: [7494]: info: cib_stats: Processed 77 operations (519.00us average, 0% utilization) in the last 10min
Feb 10 13:26:43 nodeb stonith-ng: [7493]: info: stonith_command: Processed st_execute from lrmd: rc=-1
Feb 10 13:26:43 nodeb stonith-ng: [7493]: info: log_operation: ipmi-fencing:0: Getting status of IPMI:192.168.3.2...Done
Feb 10 13:26:43 nodeb stonith-ng: [7493]: info: stonith_device_execute: Nothing to do for ipmi-fencing:0
Feb 10 13:27:43 nodeb stonith-ng: [7493]: info: stonith_command: Processed st_execute from lrmd: rc=-1
Feb 10 13:27:43 nodeb stonith-ng: [7493]: info: log_operation: ipmi-fencing:0: Getting status of IPMI:192.168.3.2...Done
Feb 10 13:27:43 nodeb stonith-ng: [7493]: info: stonith_device_execute: Nothing to do for ipmi-fencing:0
Feb 10 13:28:43 nodeb stonith-ng: [7493]: info: stonith_command: Processed st_execute from lrmd: rc=-1
Feb 10 13:28:43 nodeb stonith-ng: [7493]: info: log_operation: ipmi-fencing:0: Getting status of IPMI:192.168.3.2...Done
Feb 10 13:28:43 nodeb stonith-ng: [7493]: info: stonith_device_execute: Nothing to do for ipmi-fencing:0
Feb 10 13:29:43 nodeb stonith-ng: [7493]: info: stonith_command: Processed st_execute from lrmd: rc=-1
Feb 10 13:29:43 nodeb stonith-ng: [7493]: info: log_operation: ipmi-fencing:0: Getting status of IPMI:192.168.3.2...Done
Feb 10 13:29:43 nodeb stonith-ng: [7493]: info: stonith_device_execute: Nothing to do for ipmi-fencing:0
Feb 10 13:30:43 nodeb stonith-ng: [7493]: info: stonith_command: Processed st_execute from lrmd: rc=-1
Feb 10 13:30:43 nodeb stonith-ng: [7493]: info: log_operation: ipmi-fencing:0: Getting status of IPMI:192.168.3.2...Done
Feb 10 13:30:43 nodeb stonith-ng: [7493]: info: stonith_device_execute: Nothing to do for ipmi-fencing:0
Feb 10 13:31:43 nodeb stonith-ng: [7493]: info: stonith_command: Processed st_execute from lrmd: rc=-1
Feb 10 13:31:43 nodeb stonith-ng: [7493]: info: log_operation: ipmi-fencing:0: Getting status of IPMI:192.168.3.2...Done
Feb 10 13:31:43 nodeb stonith-ng: [7493]: info: stonith_device_execute: Nothing to do for ipmi-fencing:0
Feb 10 13:31:51 nodeb crmd: [7498]: info: crm_timer_popped: PEngine Recheck Timer (I_PE_CALC) just popped (900000ms)
Feb 10 13:31:51 nodeb crmd: [7498]: info: do_state_transition: State transition S_IDLE -> S_POLICY_ENGINE [ input=I_PE_CALC cause=C_TIMER_POPPED origin=crm_timer_popped ]
Feb 10 13:31:51 nodeb crmd: [7498]: info: do_state_transition: Progressed to state S_POLICY_ENGINE after C_TIMER_POPPED
Feb 10 13:31:51 nodeb crmd: [7498]: info: do_state_transition: All 1 cluster nodes are eligible to run resources.
Feb 10 13:31:51 nodeb crmd: [7498]: info: do_pe_invoke: Query 43: Requesting the current CIB: S_POLICY_ENGINE
Feb 10 13:31:51 nodeb crmd: [7498]: info: do_pe_invoke_callback: Invoking the PE: query=43, ref=pe_calc-dc-1328898711-28, seq=204, quorate=0
Feb 10 13:31:51 nodeb pengine: [7497]: notice: unpack_config: On loss of CCM Quorum: Ignore
Feb 10 13:31:51 nodeb pengine: [7497]: notice: RecurringOp: Start recurring monitor (30s) for ClusterIP on nodeb
Feb 10 13:31:51 nodeb pengine: [7497]: notice: RecurringOp: Start recurring monitor (60s) for httpd on nodeb
Feb 10 13:31:51 nodeb pengine: [7497]: notice: LogActions: Promote drbd0:0 (Slave -> Master nodeb)
Feb 10 13:31:51 nodeb pengine: [7497]: notice: LogActions: Leave drbd0:1 (Stopped)
Feb 10 13:31:51 nodeb pengine: [7497]: notice: LogActions: Promote drbd1:0 (Slave -> Master nodeb)
Feb 10 13:31:51 nodeb pengine: [7497]: notice: LogActions: Leave drbd1:1 (Stopped)
Feb 10 13:31:51 nodeb pengine: [7497]: notice: LogActions: Start datafs (nodeb)
Feb 10 13:31:51 nodeb pengine: [7497]: notice: LogActions: Start patchfs (nodeb)
Feb 10 13:31:51 nodeb pengine: [7497]: notice: LogActions: Start ClusterIP (nodeb)
Feb 10 13:31:51 nodeb pengine: [7497]: notice: LogActions: Start httpd (nodeb)
Feb 10 13:31:51 nodeb pengine: [7497]: notice: LogActions: Leave ipmi-fencing:0 (Started nodeb)
Feb 10 13:31:51 nodeb pengine: [7497]: notice: LogActions: Leave ipmi-fencing:1 (Stopped)
Feb 10 13:31:51 nodeb crmd: [7498]: info: do_state_transition: State transition S_POLICY_ENGINE -> S_TRANSITION_ENGINE [ input=I_PE_SUCCESS cause=C_IPC_MESSAGE origin=handle_response ]
Feb 10 13:31:51 nodeb crmd: [7498]: info: unpack_graph: Unpacked transition 2: 28 actions in 28 synapses
Feb 10 13:31:51 nodeb crmd: [7498]: info: do_te_invoke: Processing graph 2 (ref=pe_calc-dc-1328898711-28) derived from /var/lib/pengine/pe-input-100.bz2
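The transition-2 plan above promotes both DRBD clone instances and starts the service stack on nodeb. Together with the CRM_meta_* values visible in the cancel_op parameter dumps that follow (master-max=1, master-node-max=1, clone-max=2, clone-node-max=1, notify=true, drbd_resource=drbd0/drbd1), this implies master/slave resources defined roughly as sketched here. The ms names and the OCF provider are assumptions; the log shows only "ocf::drbd".

    # crm configure (sketch)
    primitive drbd0 ocf:linbit:drbd \
            params drbd_resource="drbd0" \
            op monitor interval="60s"
    ms ms-drbd0 drbd0 \
            meta master-max="1" master-node-max="1" \
                 clone-max="2" clone-node-max="1" notify="true"
    # drbd1 would follow the same pattern with drbd_resource="drbd1";
    # ClusterIP and httpd carry the 30s and 60s monitors scheduled above.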
CRM_meta_master_max=[1] CRM_meta_notify_slave_resource=[ ] CRM_meta_notify_active_resource=[ ] CRM_meta_notify_demote_uname=[ ] drbd_resource=[drbd1] CRM_meta_master_node_max=[1] CRM_meta_notify_stop_resource=[ ] CRM_meta_notify_master_resource=[ ] CRM_meta_clone_node_max=[1] CRM_meta_notify=[true] CRM_meta_notify_demote_resource=[ ] CRM_meta_notify_start_resource=[drbd1:0 ] CRM_meta_clone_max=[2] crm_feature_se cancelled Feb 10 13:31:51 nodeb pengine: [7497]: notice: process_pe_message: Transition 2: PEngine Input stored in: /var/lib/pengine/pe-input-100.bz2 Feb 10 13:31:51 nodeb crmd: [7498]: info: send_direct_ack: ACK'ing resource op drbd1:0_monitor_60000 from 1:2:0:b5f8b43d-1d0b-48d4-a661-5475c59cf127: lrm_invoke-lrmd-1328898711-32 Feb 10 13:31:51 nodeb crmd: [7498]: info: process_te_message: Processing (N)ACK lrm_invoke-lrmd-1328898711-32 from nodeb Feb 10 13:31:51 nodeb crmd: [7498]: info: match_graph_event: Action drbd1:0_monitor_60000 (1) confirmed on nodeb (rc=0) Feb 10 13:31:51 nodeb crmd: [7498]: info: te_pseudo_action: Pseudo action 53 fired and confirmed Feb 10 13:31:51 nodeb crmd: [7498]: info: process_lrm_event: LRM operation drbd0:0_monitor_60000 (call=15, status=1, cib-update=0, confirmed=true) Cancelled Feb 10 13:31:51 nodeb crmd: [7498]: info: process_lrm_event: LRM operation drbd1:0_monitor_60000 (call=16, status=1, cib-update=0, confirmed=true) Cancelled Feb 10 13:31:51 nodeb crmd: [7498]: info: te_rsc_command: Initiating action 92: notify drbd0:0_pre_notify_promote_0 on nodeb (local) Feb 10 13:31:51 nodeb crmd: [7498]: info: do_lrm_rsc_op: Performing key=92:2:0:b5f8b43d-1d0b-48d4-a661-5475c59cf127 op=drbd0:0_notify_0 ) Feb 10 13:31:51 nodeb lrmd: [7495]: info: rsc:drbd0:0:17: notify Feb 10 13:31:51 nodeb crmd: [7498]: info: te_rsc_command: Initiating action 100: notify drbd1:0_pre_notify_promote_0 on nodeb (local) Feb 10 13:31:51 nodeb crmd: [7498]: info: do_lrm_rsc_op: Performing key=100:2:0:b5f8b43d-1d0b-48d4-a661-5475c59cf127 op=drbd1:0_notify_0 ) Feb 10 13:31:51 nodeb lrmd: [7495]: info: rsc:drbd1:0:18: notify Feb 10 13:31:51 nodeb crmd: [7498]: info: send_direct_ack: ACK'ing resource op drbd0:0_notify_0 from 92:2:0:b5f8b43d-1d0b-48d4-a661-5475c59cf127: lrm_invoke-lrmd-1328898711-35 Feb 10 13:31:51 nodeb crmd: [7498]: info: process_te_message: Processing (N)ACK lrm_invoke-lrmd-1328898711-35 from nodeb Feb 10 13:31:51 nodeb crmd: [7498]: info: match_graph_event: Action drbd0:0_notify_0 (92) confirmed on nodeb (rc=0) Feb 10 13:31:51 nodeb crmd: [7498]: info: process_lrm_event: LRM operation drbd0:0_notify_0 (call=17, rc=0, cib-update=0, confirmed=true) ok Feb 10 13:31:51 nodeb crmd: [7498]: info: te_pseudo_action: Pseudo action 26 fired and confirmed Feb 10 13:31:51 nodeb crmd: [7498]: info: te_pseudo_action: Pseudo action 23 fired and confirmed Feb 10 13:31:51 nodeb crmd: [7498]: info: send_direct_ack: ACK'ing resource op drbd1:0_notify_0 from 100:2:0:b5f8b43d-1d0b-48d4-a661-5475c59cf127: lrm_invoke-lrmd-1328898711-36 Feb 10 13:31:51 nodeb crmd: [7498]: info: process_te_message: Processing (N)ACK lrm_invoke-lrmd-1328898711-36 from nodeb Feb 10 13:31:51 nodeb crmd: [7498]: info: match_graph_event: Action drbd1:0_notify_0 (100) confirmed on nodeb (rc=0) Feb 10 13:31:51 nodeb crmd: [7498]: info: process_lrm_event: LRM operation drbd1:0_notify_0 (call=18, rc=0, cib-update=0, confirmed=true) ok Feb 10 13:31:51 nodeb crmd: [7498]: info: te_rsc_command: Initiating action 10: promote drbd0:0_promote_0 on nodeb (local) Feb 10 13:31:51 nodeb crmd: [7498]: info: 
Feb 10 13:31:51 nodeb crmd: [7498]: info: do_lrm_rsc_op: Performing key=10:2:0:b5f8b43d-1d0b-48d4-a661-5475c59cf127 op=drbd0:0_promote_0 )
Feb 10 13:31:51 nodeb lrmd: [7495]: info: rsc:drbd0:0:19: promote
Feb 10 13:31:51 nodeb crmd: [7498]: info: te_pseudo_action: Pseudo action 54 fired and confirmed
Feb 10 13:31:51 nodeb crmd: [7498]: info: te_pseudo_action: Pseudo action 51 fired and confirmed
Feb 10 13:31:51 nodeb crmd: [7498]: info: te_rsc_command: Initiating action 38: promote drbd1:0_promote_0 on nodeb (local)
Feb 10 13:31:51 nodeb crmd: [7498]: info: do_lrm_rsc_op: Performing key=38:2:0:b5f8b43d-1d0b-48d4-a661-5475c59cf127 op=drbd1:0_promote_0 )
Feb 10 13:31:51 nodeb lrmd: [7495]: info: rsc:drbd1:0:20: promote
Feb 10 13:31:51 nodeb lrmd: [7495]: info: RA output: (drbd0:0:promote:stdout)
Feb 10 13:31:51 nodeb lrmd: [7495]: info: RA output: (drbd1:0:promote:stdout)
Feb 10 13:31:51 nodeb crmd: [7498]: info: process_lrm_event: LRM operation drbd0:0_promote_0 (call=19, rc=0, cib-update=46, confirmed=true) ok
Feb 10 13:31:51 nodeb crmd: [7498]: info: match_graph_event: Action drbd0:0_promote_0 (10) confirmed on nodeb (rc=0)
Feb 10 13:31:51 nodeb crmd: [7498]: info: process_lrm_event: LRM operation drbd1:0_promote_0 (call=20, rc=0, cib-update=47, confirmed=true) ok
Feb 10 13:31:51 nodeb crmd: [7498]: info: te_pseudo_action: Pseudo action 24 fired and confirmed
Feb 10 13:31:51 nodeb crmd: [7498]: info: te_pseudo_action: Pseudo action 27 fired and confirmed
Feb 10 13:31:51 nodeb crmd: [7498]: info: te_rsc_command: Initiating action 93: notify drbd0:0_post_notify_promote_0 on nodeb (local)
Feb 10 13:31:51 nodeb crmd: [7498]: info: do_lrm_rsc_op: Performing key=93:2:0:b5f8b43d-1d0b-48d4-a661-5475c59cf127 op=drbd0:0_notify_0 )
Feb 10 13:31:51 nodeb lrmd: [7495]: info: rsc:drbd0:0:21: notify
Feb 10 13:31:51 nodeb crmd: [7498]: info: match_graph_event: Action drbd1:0_promote_0 (38) confirmed on nodeb (rc=0)
Feb 10 13:31:51 nodeb crmd: [7498]: info: te_pseudo_action: Pseudo action 52 fired and confirmed
Feb 10 13:31:51 nodeb crmd: [7498]: info: te_pseudo_action: Pseudo action 55 fired and confirmed
Feb 10 13:31:51 nodeb crmd: [7498]: info: te_rsc_command: Initiating action 101: notify drbd1:0_post_notify_promote_0 on nodeb (local)
Feb 10 13:31:51 nodeb crmd: [7498]: info: do_lrm_rsc_op: Performing key=101:2:0:b5f8b43d-1d0b-48d4-a661-5475c59cf127 op=drbd1:0_notify_0 )
Feb 10 13:31:51 nodeb lrmd: [7495]: info: rsc:drbd1:0:22: notify
Feb 10 13:31:51 nodeb lrmd: [7495]: info: RA output: (drbd0:0:notify:stdout)
Feb 10 13:31:51 nodeb crmd: [7498]: info: send_direct_ack: ACK'ing resource op drbd0:0_notify_0 from 93:2:0:b5f8b43d-1d0b-48d4-a661-5475c59cf127: lrm_invoke-lrmd-1328898711-41
Feb 10 13:31:51 nodeb crmd: [7498]: info: process_te_message: Processing (N)ACK lrm_invoke-lrmd-1328898711-41 from nodeb
Feb 10 13:31:51 nodeb crmd: [7498]: info: match_graph_event: Action drbd0:0_notify_0 (93) confirmed on nodeb (rc=0)
Feb 10 13:31:51 nodeb crmd: [7498]: info: process_lrm_event: LRM operation drbd0:0_notify_0 (call=21, rc=0, cib-update=0, confirmed=true) ok
Feb 10 13:31:51 nodeb crmd: [7498]: info: te_pseudo_action: Pseudo action 28 fired and confirmed
Feb 10 13:31:51 nodeb lrmd: [7495]: info: RA output: (drbd1:0:notify:stdout)
Feb 10 13:31:51 nodeb crmd: [7498]: info: send_direct_ack: ACK'ing resource op drbd1:0_notify_0 from 101:2:0:b5f8b43d-1d0b-48d4-a661-5475c59cf127: lrm_invoke-lrmd-1328898711-42
Feb 10 13:31:51 nodeb crmd: [7498]: info: process_te_message: Processing (N)ACK lrm_invoke-lrmd-1328898711-42 from nodeb
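With both promotes confirmed and the post-notifications delivered, the transition moves on to the resources stacked on top of the DRBD masters. The sequence observed below (datafs, then patchfs, then ClusterIP, then httpd, all on the promoted node) implies colocation and ordering constraints roughly like this sketch; the constraint and ms names are hypothetical, and only the ordering itself is evident from the log:

    # hypothetical constraints matching the observed start order
    colocation c-datafs-on-master inf: datafs ms-drbd0:Master
    order o-promote-then-datafs inf: ms-drbd0:promote datafs:start
    order o-datafs-then-patchfs inf: datafs patchfs
    order o-patchfs-then-ip inf: patchfs ClusterIP
    order o-ip-then-httpd inf: ClusterIP httpd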
Feb 10 13:31:51 nodeb crmd: [7498]: info: match_graph_event: Action drbd1:0_notify_0 (101) confirmed on nodeb (rc=0)
Feb 10 13:31:51 nodeb crmd: [7498]: info: process_lrm_event: LRM operation drbd1:0_notify_0 (call=22, rc=0, cib-update=0, confirmed=true) ok
Feb 10 13:31:51 nodeb crmd: [7498]: info: te_pseudo_action: Pseudo action 56 fired and confirmed
Feb 10 13:31:51 nodeb crmd: [7498]: info: te_pseudo_action: Pseudo action 69 fired and confirmed
Feb 10 13:31:51 nodeb crmd: [7498]: info: te_rsc_command: Initiating action 63: start datafs_start_0 on nodeb (local)
Feb 10 13:31:51 nodeb crmd: [7498]: info: do_lrm_rsc_op: Performing key=63:2:0:b5f8b43d-1d0b-48d4-a661-5475c59cf127 op=datafs_start_0 )
Feb 10 13:31:51 nodeb lrmd: [7495]: info: rsc:datafs:23: start
Feb 10 13:31:51 nodeb crmd: [7498]: info: process_lrm_event: LRM operation datafs_start_0 (call=23, rc=0, cib-update=48, confirmed=true) ok
Feb 10 13:31:51 nodeb crmd: [7498]: info: match_graph_event: Action datafs_start_0 (63) confirmed on nodeb (rc=0)
Feb 10 13:31:51 nodeb crmd: [7498]: info: te_rsc_command: Initiating action 64: start patchfs_start_0 on nodeb (local)
Feb 10 13:31:51 nodeb crmd: [7498]: info: do_lrm_rsc_op: Performing key=64:2:0:b5f8b43d-1d0b-48d4-a661-5475c59cf127 op=patchfs_start_0 )
Feb 10 13:31:51 nodeb lrmd: [7495]: info: rsc:patchfs:24: start
Feb 10 13:31:51 nodeb crmd: [7498]: info: process_lrm_event: LRM operation patchfs_start_0 (call=24, rc=0, cib-update=49, confirmed=true) ok
Feb 10 13:31:51 nodeb crmd: [7498]: info: match_graph_event: Action patchfs_start_0 (64) confirmed on nodeb (rc=0)
Feb 10 13:31:51 nodeb crmd: [7498]: info: te_rsc_command: Initiating action 65: start ClusterIP_start_0 on nodeb (local)
Feb 10 13:31:51 nodeb crmd: [7498]: info: do_lrm_rsc_op: Performing key=65:2:0:b5f8b43d-1d0b-48d4-a661-5475c59cf127 op=ClusterIP_start_0 )
Feb 10 13:31:51 nodeb lrmd: [7495]: info: rsc:ClusterIP:25: start
Feb 10 13:31:51 nodeb crmd: [7498]: info: process_lrm_event: LRM operation ClusterIP_start_0 (call=25, rc=0, cib-update=50, confirmed=true) ok
Feb 10 13:31:51 nodeb crmd: [7498]: info: match_graph_event: Action ClusterIP_start_0 (65) confirmed on nodeb (rc=0)
Feb 10 13:31:51 nodeb crmd: [7498]: info: te_rsc_command: Initiating action 66: monitor ClusterIP_monitor_30000 on nodeb (local)
Feb 10 13:31:51 nodeb crmd: [7498]: info: do_lrm_rsc_op: Performing key=66:2:0:b5f8b43d-1d0b-48d4-a661-5475c59cf127 op=ClusterIP_monitor_30000 )
Feb 10 13:31:51 nodeb lrmd: [7495]: info: rsc:ClusterIP:26: monitor
Feb 10 13:31:51 nodeb crmd: [7498]: info: te_rsc_command: Initiating action 67: start httpd_start_0 on nodeb (local)
Feb 10 13:31:51 nodeb crmd: [7498]: info: do_lrm_rsc_op: Performing key=67:2:0:b5f8b43d-1d0b-48d4-a661-5475c59cf127 op=httpd_start_0 )
Feb 10 13:31:51 nodeb lrmd: [7495]: info: rsc:httpd:27: start
Feb 10 13:31:51 nodeb crmd: [7498]: info: process_lrm_event: LRM operation ClusterIP_monitor_30000 (call=26, rc=0, cib-update=51, confirmed=false) ok
Feb 10 13:31:51 nodeb crmd: [7498]: info: match_graph_event: Action ClusterIP_monitor_30000 (66) confirmed on nodeb (rc=0)
Feb 10 13:31:51 nodeb crmd: [7498]: info: process_lrm_event: LRM operation httpd_start_0 (call=27, rc=0, cib-update=52, confirmed=true) ok
Feb 10 13:31:51 nodeb crmd: [7498]: info: match_graph_event: Action httpd_start_0 (67) confirmed on nodeb (rc=0)
Feb 10 13:31:51 nodeb crmd: [7498]: info: te_pseudo_action: Pseudo action 70 fired and confirmed
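The service stack is now up: datafs, patchfs, ClusterIP and httpd all started with rc=0, and the 30s ClusterIP monitor announced by RecurringOp earlier is active (recurring operations report confirmed=false because they keep running rather than completing; the 60s httpd monitor follows below). At this point placement could be verified with the stock Pacemaker tools, e.g.:

    crm_mon -1                                  # one-shot cluster status snapshot
    crm_resource --resource ClusterIP --locate  # which node hosts the VIP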
Feb 10 13:31:51 nodeb crmd: [7498]: info: te_rsc_command: Initiating action 68: monitor httpd_monitor_60000 on nodeb (local)
Feb 10 13:31:51 nodeb crmd: [7498]: info: do_lrm_rsc_op: Performing key=68:2:0:b5f8b43d-1d0b-48d4-a661-5475c59cf127 op=httpd_monitor_60000 )
Feb 10 13:31:51 nodeb lrmd: [7495]: info: rsc:httpd:28: monitor
Feb 10 13:31:51 nodeb crmd: [7498]: info: process_lrm_event: LRM operation httpd_monitor_60000 (call=28, rc=0, cib-update=53, confirmed=false) ok
Feb 10 13:31:51 nodeb crmd: [7498]: info: match_graph_event: Action httpd_monitor_60000 (68) confirmed on nodeb (rc=0)
Feb 10 13:31:51 nodeb crmd: [7498]: info: run_graph: ====================================================
Feb 10 13:31:51 nodeb crmd: [7498]: notice: run_graph: Transition 2 (Complete=28, Pending=0, Fired=0, Skipped=0, Incomplete=0, Source=/var/lib/pengine/pe-input-100.bz2): Complete
Feb 10 13:31:51 nodeb crmd: [7498]: info: te_graph_trigger: Transition 2 is now complete
Feb 10 13:31:51 nodeb crmd: [7498]: info: notify_crmd: Transition 2 status: done -
Feb 10 13:31:51 nodeb crmd: [7498]: info: do_state_transition: State transition S_TRANSITION_ENGINE -> S_IDLE [ input=I_TE_SUCCESS cause=C_FSA_INTERNAL origin=notify_crmd ]
Feb 10 13:31:51 nodeb crmd: [7498]: info: do_state_transition: Starting PEngine Recheck Timer
Feb 10 13:31:55 nodeb lrmd: [7495]: info: RA output: (ClusterIP:start:stderr) ARPING 192.168.1.3 from 192.168.1.3 eth0 Sent 5 probes (5 broadcast(s)) Received 0 response(s)
Feb 10 13:32:44 nodeb stonith-ng: [7493]: info: stonith_command: Processed st_execute from lrmd: rc=-1
Feb 10 13:32:44 nodeb stonith-ng: [7493]: info: log_operation: ipmi-fencing:0: Getting status of IPMI:192.168.3.2...Done
Feb 10 13:32:44 nodeb stonith-ng: [7493]: info: stonith_device_execute: Nothing to do for ipmi-fencing:0
Feb 10 13:33:11 nodeb crmd: [7498]: WARN: action_timer_callback: Timer popped (timeout=20000, abort_level=0, complete=true)
Feb 10 13:33:11 nodeb crmd: [7498]: WARN: action_timer_callback: Ignoring timeout while not in transition
Feb 10 13:33:11 nodeb crmd: [7498]: WARN: action_timer_callback: Timer popped (timeout=20000, abort_level=0, complete=true)
Feb 10 13:33:11 nodeb crmd: [7498]: WARN: action_timer_callback: Ignoring timeout while not in transition
Feb 10 13:33:44 nodeb stonith-ng: [7493]: info: stonith_command: Processed st_execute from lrmd: rc=-1
Feb 10 13:33:44 nodeb stonith-ng: [7493]: info: log_operation: ipmi-fencing:0: Getting status of IPMI:192.168.3.2...Done
Feb 10 13:33:44 nodeb stonith-ng: [7493]: info: stonith_device_execute: Nothing to do for ipmi-fencing:0
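The tail of the log is routine housekeeping. The ARPING stderr is the IPaddr2 agent's duplicate-address probe when bringing up 192.168.1.3; zero responses is the desired result, since no other host should be answering for the new address. The stonith-ng entries show the recurring status check of the IPMI fencing device at 192.168.3.2 succeeding roughly once a minute, and the "Ignoring timeout while not in transition" warnings appear to be stale per-action timers firing after transition 2 has already completed (note complete=true), which crmd simply discards. A fencing configuration consistent with these entries might look like the sketch below; apart from the device address and the 60s monitor interval, every parameter and name here is an assumption:

    # hypothetical fencing clone; only ipaddr and the monitor interval come from the log
    primitive ipmi-fencing stonith:external/ipmi \
        params hostname=nodea ipaddr=192.168.3.2 userid=admin passwd=secret interface=lan \
        op monitor interval=60s
    clone fencing-clone ipmi-fencing meta clone-max=2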