[ClusterLabs] unable to start fence_scsi
Ken Gaillot
kgaillot at redhat.com
Wed May 18 22:14:16 UTC 2016
On 05/18/2016 04:46 PM, Marco A. Carcano wrote:
> Hi Ken, and thanks again
>
> In the meantime I deleted the stonith device and restarted the cluster -
> This time I created the scsi fence device with the following command
>
> pcs stonith create scsi fence_scsi pcmk_reboot_action="off" devices="/dev/mapper/36001405973e201b3fdb4a999175b942f" meta provides="unfencing" op monitor interval=60s
>
>>
>> * fence_scsi -o metadata
>>
>> Make sure "on" is in the list of supported actions. The stock one does,
>> but just to be sure you don't have a modified version …
>
> it seems OK
>
> <actions>
> <action name="on" on_target="1" automatic="1"/>
> <action name="off" />
> <action name="status" />
> <action name="list" />
> <action name="list-status" />
> <action name="monitor" />
> <action name="metadata" />
> </actions>
>
>>
>> * stonith_admin -L
>>
>> Make sure "scsi" is in the output (list of configured fence devices).
>
> even here it seems fine
>
> stonith_admin -L
> scsi
> 1 devices found
>
>>
>> * stonith_admin -l apache-up003.ring0
>
> here I got:
>
> stonith_admin -l apache-up003.ring0
> No devices found
>
> however, after waiting at least 10 minutes, the situation changes (weird … but why?):
>
> stonith_admin -l apache-up003.ring0
> scsi
> 1 devices found
That is weird. I was under the impression that fence_scsi required
pcmk_host_list/pcmk_host_map, and didn't support dynamic determination
of fenceable hosts (which is what happens without those options or at
least pcmk_host_check=none).
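If dynamic host determination is indeed the problem, one way around it is to give stonith-ng the host list explicitly. A sketch, reusing the node names and device path from this thread (adjust them to match `crm_node -l` output on your cluster):

```shell
# Recreate the fence device with an explicit pcmk_host_list so
# stonith-ng never has to ask the agent which hosts it can fence.
pcs stonith delete scsi
pcs stonith create scsi fence_scsi \
    pcmk_host_list="apache-up001.ring0 apache-up002.ring0 apache-up003.ring0" \
    pcmk_reboot_action="off" \
    devices="/dev/mapper/36001405973e201b3fdb4a999175b942f" \
    meta provides="unfencing" \
    op monitor interval=60s
```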
> stonith_admin -l apache-up002.ring0
> scsi
> 1 devices found
>
> stonith_admin -l apache-up001.ring0
> scsi
> 1 devices found
>
> stonith_admin -l apache-up001.ring1
> scsi
> 1 devices found
>
> stonith_admin -l apache-up002.ring1
> scsi
> 1 devices found
>
> stonith_admin -l apache-up003.ring1
> scsi
> 1 devices found
>
>
>>
>> to see what devices the cluster thinks can fence that node
>>
>> * Does the cluster status show the fence device running on some node?
>
> here is the output of pcs status
>
> Cluster name: apache-0
> Last updated: Wed May 18 23:15:52 2016 Last change: Wed May 18 22:48:36 2016 by root via cibadmin on apache-up001.ring0
> Stack: corosync
> Current DC: apache-up001.ring0 (version 1.1.14-10.el7.centos-70404b0) - partition with quorum
> 3 nodes and 1 resource configured
>
> Online: [ apache-up001.ring0 apache-up002.ring0 apache-up003.ring0 ]
>
> Full list of resources:
>
> scsi (stonith:fence_scsi): Stopped
>
> PCSD Status:
> apache-up001.ring0: Online
> apache-up002.ring0: Online
> apache-up003.ring0: Online
>
> Daemon Status:
> corosync: active/enabled
> pacemaker: active/enabled
> pcsd: active/disabled
>
>
>> Does it list any failed actions?
>
> No. However, if I go on and do the following:
>
> pcs cluster cib dlm_cfg
> pcs -f dlm_cfg resource create dlm ocf:pacemaker:controld op monitor interval=120s on-fail=fence clone interleave=true ordered=true
> pcs -f dlm_cfg resource create clvmd ocf:heartbeat:clvm op monitor interval=120s on-fail=fence clone interleave=true ordered=true
> pcs -f dlm_cfg constraint order start dlm-clone then clvmd-clone
> pcs -f dlm_cfg constraint colocation add clvmd-clone with dlm-clone
> pcs cluster cib-push dlm_cfg
>
> this is what happens:
>
> Cluster name: apache-0
> Last updated: Wed May 18 23:17:40 2016 Last change: Wed May 18 23:17:34 2016 by root via cibadmin on apache-up001.ring0
> Stack: corosync
> Current DC: apache-up001.ring0 (version 1.1.14-10.el7.centos-70404b0) - partition with quorum
> 3 nodes and 7 resources configured
>
> Online: [ apache-up001.ring0 apache-up002.ring0 apache-up003.ring0 ]
>
> Full list of resources:
>
> scsi (stonith:fence_scsi): Stopped
> Clone Set: dlm-clone [dlm]
> Stopped: [ apache-up001.ring0 apache-up002.ring0 apache-up003.ring0 ]
> Clone Set: clvmd-clone [clvmd]
> Stopped: [ apache-up001.ring0 apache-up002.ring0 apache-up003.ring0 ]
>
> Failed Actions:
> * scsi_start_0 on apache-up001.ring0 'unknown error' (1): call=6, status=Error, exitreason='none',
> last-rc-change='Wed May 18 23:17:34 2016', queued=0ms, exec=1129ms
> * scsi_start_0 on apache-up002.ring0 'unknown error' (1): call=18, status=Error, exitreason='none',
> last-rc-change='Wed May 18 23:17:36 2016', queued=0ms, exec=1120ms
> * dlm_start_0 on apache-up002.ring0 'not configured' (6): call=16, status=complete, exitreason='none',
> last-rc-change='Wed May 18 23:17:34 2016', queued=0ms, exec=102ms
These errors likely showed up at this point simply because the start
actions took that long to fail; they are probably unrelated to adding
the other resources.
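Once the root cause of the start failure is addressed, the recorded failures can be cleared so Pacemaker retries the starts right away. A sketch (`crm_resource --cleanup -r <resource>` is the lower-level equivalent):

```shell
# Clear the failure history for each affected resource so the
# cluster re-attempts the start actions immediately.
pcs resource cleanup scsi
pcs resource cleanup dlm-clone
pcs resource cleanup clvmd-clone
```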
>
> PCSD Status:
> apache-up001.ring0: Online
> apache-up002.ring0: Online
> apache-up003.ring0: Online
>
> Daemon Status:
> corosync: active/enabled
> pacemaker: active/enabled
> pcsd: active/disabled
>
>
> in the corosync log I found this:
>
> May 18 23:17:35 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[6673] stderr: [ Failed: Cannot open file "/var/run/cluster/fence_scsi.key" ]
> May 18 23:17:35 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[6673] stderr: [ ]
> May 18 23:17:35 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[6673] stderr: [ Please use '-h' for usage ]
> May 18 23:17:35 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[6673] stderr: [ ]
>
> that seems similar to the following:
>
> https://access.redhat.com/solutions/1615453
>
> Tomorrow morning I’ll try to build an RPM using tag 4.0.22
>
> Do you have any other suggestions?
That does look relevant. Try the new version, and maybe also set
pcmk_host_check=none or pcmk_host_list/pcmk_host_map as mentioned before.
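A sketch of those suggestions on the existing device, reusing the node names from this thread:

```shell
# Tell stonith-ng explicitly which hosts this device can fence,
# instead of relying on the agent's dynamic host list:
pcs stonith update scsi \
    pcmk_host_list="apache-up001.ring0 apache-up002.ring0 apache-up003.ring0"

# Alternatively, skip the host check entirely:
pcs stonith update scsi pcmk_host_check=none
```

If the agent still fails with the missing key file, creating the directory by hand (`mkdir -p /var/run/cluster`) might serve as an interim workaround until a fixed fence-agents build is in place, but that is an assumption based on the error message, not a confirmed fix.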
> Here is my corosync.log (I raised the log level to debug for stonith-ng only, but maybe there’s something useful in the other lines, so I pasted it in full)
>
> [2410] apache-up001.itc4u.local corosyncnotice [MAIN ] Corosync Cluster Engine ('2.3.4'): started and ready to provide service.
> [2410] apache-up001.itc4u.local corosyncinfo [MAIN ] Corosync built-in features: dbus systemd xmlconf snmp pie relro bindnow
> [2410] apache-up001.itc4u.local corosyncnotice [TOTEM ] Initializing transport (UDP/IP Unicast).
> [2410] apache-up001.itc4u.local corosyncnotice [TOTEM ] Initializing transmit/receive security (NSS) crypto: none hash: none
> [2410] apache-up001.itc4u.local corosyncnotice [TOTEM ] Initializing transport (UDP/IP Unicast).
> [2410] apache-up001.itc4u.local corosyncnotice [TOTEM ] Initializing transmit/receive security (NSS) crypto: none hash: none
> [2410] apache-up001.itc4u.local corosyncnotice [TOTEM ] The network interface [192.168.15.9] is now up.
> [2410] apache-up001.itc4u.local corosyncnotice [SERV ] Service engine loaded: corosync configuration map access [0]
> [2410] apache-up001.itc4u.local corosyncinfo [QB ] server name: cmap
> [2410] apache-up001.itc4u.local corosyncnotice [SERV ] Service engine loaded: corosync configuration service [1]
> [2410] apache-up001.itc4u.local corosyncinfo [QB ] server name: cfg
> [2410] apache-up001.itc4u.local corosyncnotice [SERV ] Service engine loaded: corosync cluster closed process group service v1.01 [2]
> [2410] apache-up001.itc4u.local corosyncinfo [QB ] server name: cpg
> [2410] apache-up001.itc4u.local corosyncnotice [SERV ] Service engine loaded: corosync profile loading service [4]
> [2410] apache-up001.itc4u.local corosyncnotice [QUORUM] Using quorum provider corosync_votequorum
> [2410] apache-up001.itc4u.local corosyncnotice [SERV ] Service engine loaded: corosync vote quorum service v1.0 [5]
> [2410] apache-up001.itc4u.local corosyncinfo [QB ] server name: votequorum
> [2410] apache-up001.itc4u.local corosyncnotice [SERV ] Service engine loaded: corosync cluster quorum service v0.1 [3]
> [2410] apache-up001.itc4u.local corosyncinfo [QB ] server name: quorum
> [2410] apache-up001.itc4u.local corosyncnotice [TOTEM ] adding new UDPU member {192.168.15.9}
> [2410] apache-up001.itc4u.local corosyncnotice [TOTEM ] adding new UDPU member {192.168.15.8}
> [2410] apache-up001.itc4u.local corosyncnotice [TOTEM ] adding new UDPU member {192.168.15.7}
> [2410] apache-up001.itc4u.local corosyncnotice [TOTEM ] The network interface [192.168.16.9] is now up.
> [2410] apache-up001.itc4u.local corosyncnotice [TOTEM ] adding new UDPU member {192.168.16.9}
> [2410] apache-up001.itc4u.local corosyncnotice [TOTEM ] adding new UDPU member {192.168.16.8}
> [2410] apache-up001.itc4u.local corosyncnotice [TOTEM ] adding new UDPU member {192.168.16.7}
> [2410] apache-up001.itc4u.local corosyncnotice [TOTEM ] A new membership (192.168.15.9:204) was formed. Members joined: 1
> [2410] apache-up001.itc4u.local corosyncnotice [QUORUM] Members[1]: 1
> [2410] apache-up001.itc4u.local corosyncnotice [MAIN ] Completed service synchronization, ready to provide service.
> May 18 23:01:16 [2524] apache-up001.itc4u.local pacemakerd: notice: mcp_read_config: Configured corosync to accept connections from group 189: OK (1)
> May 18 23:01:16 [2524] apache-up001.itc4u.local pacemakerd: notice: main: Starting Pacemaker 1.1.14-10.el7.centos (Build: 70404b0): generated-manpages agent-manpages ncurses libqb-logging libqb-ipc upstart systemd nagios corosync-native atomic-attrd acls
> May 18 23:01:16 [2524] apache-up001.itc4u.local pacemakerd: info: main: Maximum core file size is: 18446744073709551615
> May 18 23:01:16 [2524] apache-up001.itc4u.local pacemakerd: info: qb_ipcs_us_publish: server name: pacemakerd
> May 18 23:01:17 [2524] apache-up001.itc4u.local pacemakerd: info: crm_get_peer: Created entry 00711929-c547-4549-afe9-726766325ddb/0x27bb820 for node apache-up001.ring0/1 (1 total)
> May 18 23:01:17 [2524] apache-up001.itc4u.local pacemakerd: info: crm_get_peer: Node 1 is now known as apache-up001.ring0
> May 18 23:01:17 [2524] apache-up001.itc4u.local pacemakerd: info: crm_get_peer: Node 1 has uuid 1
> May 18 23:01:17 [2524] apache-up001.itc4u.local pacemakerd: info: crm_update_peer_proc: cluster_connect_cpg: Node apache-up001.ring0[1] - corosync-cpg is now online
> May 18 23:01:17 [2524] apache-up001.itc4u.local pacemakerd: notice: cluster_connect_quorum: Quorum lost
> May 18 23:01:17 [2524] apache-up001.itc4u.local pacemakerd: info: crm_get_peer: Created entry 693f74ca-a9ee-4277-a280-137f807489eb/0x27bce50 for node apache-up002.ring0/2 (2 total)
> May 18 23:01:17 [2524] apache-up001.itc4u.local pacemakerd: info: crm_get_peer: Node 2 is now known as apache-up002.ring0
> May 18 23:01:17 [2524] apache-up001.itc4u.local pacemakerd: info: crm_get_peer: Node 2 has uuid 2
> May 18 23:01:17 [2524] apache-up001.itc4u.local pacemakerd: info: crm_get_peer: Created entry 26124ec7-b967-4101-837a-10fc78b518d9/0x27bd230 for node apache-up003.ring0/3 (3 total)
> May 18 23:01:17 [2524] apache-up001.itc4u.local pacemakerd: info: crm_get_peer: Node 3 is now known as apache-up003.ring0
> May 18 23:01:17 [2524] apache-up001.itc4u.local pacemakerd: info: crm_get_peer: Node 3 has uuid 3
> May 18 23:01:17 [2524] apache-up001.itc4u.local pacemakerd: info: start_child: Using uid=189 and group=189 for process cib
> May 18 23:01:17 [2524] apache-up001.itc4u.local pacemakerd: info: start_child: Forked child 2525 for process cib
> May 18 23:01:17 [2524] apache-up001.itc4u.local pacemakerd: info: start_child: Forked child 2526 for process stonith-ng
> May 18 23:01:17 [2524] apache-up001.itc4u.local pacemakerd: info: start_child: Forked child 2527 for process lrmd
> May 18 23:01:17 [2524] apache-up001.itc4u.local pacemakerd: info: start_child: Using uid=189 and group=189 for process attrd
> May 18 23:01:17 [2524] apache-up001.itc4u.local pacemakerd: info: start_child: Forked child 2528 for process attrd
> May 18 23:01:17 [2524] apache-up001.itc4u.local pacemakerd: info: start_child: Using uid=189 and group=189 for process pengine
> May 18 23:01:17 [2524] apache-up001.itc4u.local pacemakerd: info: start_child: Forked child 2529 for process pengine
> May 18 23:01:17 [2524] apache-up001.itc4u.local pacemakerd: info: start_child: Using uid=189 and group=189 for process crmd
> May 18 23:01:17 [2524] apache-up001.itc4u.local pacemakerd: info: start_child: Forked child 2530 for process crmd
> May 18 23:01:17 [2524] apache-up001.itc4u.local pacemakerd: info: main: Starting mainloop
> May 18 23:01:17 [2524] apache-up001.itc4u.local pacemakerd: info: pcmk_quorum_notification: Membership 204: quorum still lost (1)
> May 18 23:01:17 [2524] apache-up001.itc4u.local pacemakerd: notice: crm_update_peer_state_iter: pcmk_quorum_notification: Node apache-up001.ring0[1] - state is now member (was (null))
> May 18 23:01:17 [2524] apache-up001.itc4u.local pacemakerd: info: crm_reap_unseen_nodes: State of node apache-up003.ring0[3] is still unknown
> May 18 23:01:17 [2524] apache-up001.itc4u.local pacemakerd: info: crm_reap_unseen_nodes: State of node apache-up002.ring0[2] is still unknown
> May 18 23:01:17 [2524] apache-up001.itc4u.local pacemakerd: info: pcmk_cpg_membership: Node 1 joined group pacemakerd (counter=0.0)
> May 18 23:01:17 [2524] apache-up001.itc4u.local pacemakerd: info: pcmk_cpg_membership: Node 1 still member of group pacemakerd (peer=apache-up001.ring0, counter=0.0)
> May 18 23:01:17 [2524] apache-up001.itc4u.local pacemakerd: info: mcp_cpg_deliver: Ignoring process list sent by peer for local node
> May 18 23:01:17 [2524] apache-up001.itc4u.local pacemakerd: info: mcp_cpg_deliver: Ignoring process list sent by peer for local node
> May 18 23:01:17 [2524] apache-up001.itc4u.local pacemakerd: info: mcp_cpg_deliver: Ignoring process list sent by peer for local node
> May 18 23:01:17 [2524] apache-up001.itc4u.local pacemakerd: info: mcp_cpg_deliver: Ignoring process list sent by peer for local node
> May 18 23:01:17 [2524] apache-up001.itc4u.local pacemakerd: info: mcp_cpg_deliver: Ignoring process list sent by peer for local node
> May 18 23:01:17 [2524] apache-up001.itc4u.local pacemakerd: info: mcp_cpg_deliver: Ignoring process list sent by peer for local node
> May 18 23:01:17 [2524] apache-up001.itc4u.local pacemakerd: info: mcp_cpg_deliver: Ignoring process list sent by peer for local node
> May 18 23:01:17 [2525] apache-up001.itc4u.local cib: info: crm_log_init: Changed active directory to /var/lib/pacemaker/cores/hacluster
> May 18 23:01:17 [2525] apache-up001.itc4u.local cib: info: get_cluster_type: Verifying cluster type: 'corosync'
> May 18 23:01:17 [2525] apache-up001.itc4u.local cib: info: get_cluster_type: Assuming an active 'corosync' cluster
> May 18 23:01:17 [2525] apache-up001.itc4u.local cib: info: retrieveCib: Reading cluster configuration file /var/lib/pacemaker/cib/cib.xml (digest: /var/lib/pacemaker/cib/cib.xml.sig)
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: info: crm_log_init: Changed active directory to /var/lib/pacemaker/cores/root
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: info: get_cluster_type: Verifying cluster type: 'corosync'
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: info: get_cluster_type: Assuming an active 'corosync' cluster
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: notice: crm_cluster_connect: Connecting to cluster infrastructure: corosync
> May 18 23:01:17 [2528] apache-up001.itc4u.local attrd: info: crm_log_init: Changed active directory to /var/lib/pacemaker/cores/hacluster
> May 18 23:01:17 [2528] apache-up001.itc4u.local attrd: info: main: Starting up
> May 18 23:01:17 [2528] apache-up001.itc4u.local attrd: info: get_cluster_type: Verifying cluster type: 'corosync'
> May 18 23:01:17 [2528] apache-up001.itc4u.local attrd: info: get_cluster_type: Assuming an active 'corosync' cluster
> May 18 23:01:17 [2528] apache-up001.itc4u.local attrd: notice: crm_cluster_connect: Connecting to cluster infrastructure: corosync
> May 18 23:01:17 [2525] apache-up001.itc4u.local cib: info: validate_with_relaxng: Creating RNG parser context
> May 18 23:01:17 [2529] apache-up001.itc4u.local pengine: info: crm_log_init: Changed active directory to /var/lib/pacemaker/cores/hacluster
> May 18 23:01:17 [2529] apache-up001.itc4u.local pengine: info: qb_ipcs_us_publish: server name: pengine
> May 18 23:01:17 [2529] apache-up001.itc4u.local pengine: info: main: Starting pengine
> May 18 23:01:17 [2527] apache-up001.itc4u.local lrmd: info: crm_log_init: Changed active directory to /var/lib/pacemaker/cores/root
> May 18 23:01:17 [2527] apache-up001.itc4u.local lrmd: info: qb_ipcs_us_publish: server name: lrmd
> May 18 23:01:17 [2527] apache-up001.itc4u.local lrmd: info: main: Starting
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:1048589; real_size:1052672; rb->word_size:263168
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:1048589; real_size:1052672; rb->word_size:263168
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:1048589; real_size:1052672; rb->word_size:263168
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: debug: get_local_nodeid: Local nodeid is 1
> May 18 23:01:17 [2525] apache-up001.itc4u.local cib: info: startCib: CIB Initialization completed successfully
> May 18 23:01:17 [2525] apache-up001.itc4u.local cib: notice: crm_cluster_connect: Connecting to cluster infrastructure: corosync
> May 18 23:01:17 [2528] apache-up001.itc4u.local attrd: info: crm_get_peer: Created entry 7f64959b-3768-400d-ae54-177d1cf0ea81/0x156cab0 for node apache-up001.ring0/1 (1 total)
> May 18 23:01:17 [2528] apache-up001.itc4u.local attrd: info: crm_get_peer: Node 1 is now known as apache-up001.ring0
> May 18 23:01:17 [2530] apache-up001.itc4u.local crmd: info: crm_log_init: Changed active directory to /var/lib/pacemaker/cores/hacluster
> May 18 23:01:17 [2530] apache-up001.itc4u.local crmd: notice: main: CRM Git Version: 1.1.14-10.el7.centos (70404b0)
> May 18 23:01:17 [2530] apache-up001.itc4u.local crmd: info: do_log: FSA: Input I_STARTUP from crmd_init() received in state S_STARTING
> May 18 23:01:17 [2530] apache-up001.itc4u.local crmd: info: get_cluster_type: Verifying cluster type: 'corosync'
> May 18 23:01:17 [2530] apache-up001.itc4u.local crmd: info: get_cluster_type: Assuming an active 'corosync' cluster
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:1048589; real_size:1052672; rb->word_size:263168
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:1048589; real_size:1052672; rb->word_size:263168
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:1048589; real_size:1052672; rb->word_size:263168
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_ipcc_disconnect: qb_ipcc_disconnect()
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_close: Closing ringbuffer: /dev/shm/qb-cmap-request-2429-2526-29-header
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_close: Closing ringbuffer: /dev/shm/qb-cmap-response-2429-2526-29-header
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_close: Closing ringbuffer: /dev/shm/qb-cmap-event-2429-2526-29-header
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: info: crm_get_peer: Created entry da4cc771-240a-41ab-9509-d1a99b6b1727/0x1e8f1f0 for node apache-up001.ring0/1 (1 total)
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: info: crm_get_peer: Node 1 is now known as apache-up001.ring0
> May 18 23:01:17 [2525] apache-up001.itc4u.local cib: info: crm_get_peer: Created entry 4640bdec-852d-405e-bbfe-1d6b1b5097c4/0x1277dc0 for node apache-up001.ring0/1 (1 total)
> May 18 23:01:17 [2525] apache-up001.itc4u.local cib: info: crm_get_peer: Node 1 is now known as apache-up001.ring0
> May 18 23:01:17 [2528] apache-up001.itc4u.local attrd: info: crm_get_peer: Node 1 has uuid 1
> May 18 23:01:17 [2528] apache-up001.itc4u.local attrd: info: crm_update_peer_proc: cluster_connect_cpg: Node apache-up001.ring0[1] - corosync-cpg is now online
> May 18 23:01:17 [2528] apache-up001.itc4u.local attrd: notice: crm_update_peer_state_iter: crm_update_peer_proc: Node apache-up001.ring0[1] - state is now member (was (null))
> May 18 23:01:17 [2528] apache-up001.itc4u.local attrd: info: init_cs_connection_once: Connection to 'corosync': established
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:1048589; real_size:1052672; rb->word_size:263168
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:1048589; real_size:1052672; rb->word_size:263168
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:1048589; real_size:1052672; rb->word_size:263168
> May 18 23:01:17 [2525] apache-up001.itc4u.local cib: info: crm_get_peer: Node 1 has uuid 1
> May 18 23:01:17 [2525] apache-up001.itc4u.local cib: info: crm_update_peer_proc: cluster_connect_cpg: Node apache-up001.ring0[1] - corosync-cpg is now online
> May 18 23:01:17 [2525] apache-up001.itc4u.local cib: notice: crm_update_peer_state_iter: crm_update_peer_proc: Node apache-up001.ring0[1] - state is now member (was (null))
> May 18 23:01:17 [2525] apache-up001.itc4u.local cib: info: init_cs_connection_once: Connection to 'corosync': established
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_ipcc_disconnect: qb_ipcc_disconnect()
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_close: Closing ringbuffer: /dev/shm/qb-cmap-request-2429-2526-29-header
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_close: Closing ringbuffer: /dev/shm/qb-cmap-response-2429-2526-29-header
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_close: Closing ringbuffer: /dev/shm/qb-cmap-event-2429-2526-29-header
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: info: crm_get_peer: Node 1 has uuid 1
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: info: crm_update_peer_proc: cluster_connect_cpg: Node apache-up001.ring0[1] - corosync-cpg is now online
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: notice: crm_update_peer_state_iter: crm_update_peer_proc: Node apache-up001.ring0[1] - state is now member (was (null))
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: info: init_cs_connection_once: Connection to 'corosync': established
> May 18 23:01:17 [2528] apache-up001.itc4u.local attrd: info: main: Cluster connection active
> May 18 23:01:17 [2528] apache-up001.itc4u.local attrd: info: qb_ipcs_us_publish: server name: attrd
> May 18 23:01:17 [2528] apache-up001.itc4u.local attrd: info: main: Accepting attribute updates
> May 18 23:01:17 [2525] apache-up001.itc4u.local cib: info: qb_ipcs_us_publish: server name: cib_ro
> May 18 23:01:17 [2525] apache-up001.itc4u.local cib: info: qb_ipcs_us_publish: server name: cib_rw
> May 18 23:01:17 [2525] apache-up001.itc4u.local cib: info: qb_ipcs_us_publish: server name: cib_shm
> May 18 23:01:17 [2525] apache-up001.itc4u.local cib: info: cib_init: Starting cib mainloop
> May 18 23:01:17 [2525] apache-up001.itc4u.local cib: info: pcmk_cpg_membership: Node 1 joined group cib (counter=0.0)
> May 18 23:01:17 [2525] apache-up001.itc4u.local cib: info: pcmk_cpg_membership: Node 1 still member of group cib (peer=apache-up001.ring0, counter=0.0)
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:1048589; real_size:1052672; rb->word_size:263168
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:1048589; real_size:1052672; rb->word_size:263168
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:1048589; real_size:1052672; rb->word_size:263168
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_ipcc_disconnect: qb_ipcc_disconnect()
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_close: Closing ringbuffer: /dev/shm/qb-cmap-request-2429-2526-29-header
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_close: Closing ringbuffer: /dev/shm/qb-cmap-response-2429-2526-29-header
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_close: Closing ringbuffer: /dev/shm/qb-cmap-event-2429-2526-29-header
> May 18 23:01:17 [2525] apache-up001.itc4u.local cib: info: cib_file_backup: Archived previous version as /var/lib/pacemaker/cib/cib-3.raw
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:524301; real_size:528384; rb->word_size:132096
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:524301; real_size:528384; rb->word_size:132096
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:524301; real_size:528384; rb->word_size:132096
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: debug: cib_native_signon_raw: Connection to CIB successful
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: notice: setup_cib: Watching for stonith topology changes
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: info: qb_ipcs_us_publish: server name: stonith-ng
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: info: main: Starting stonith-ng mainloop
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: info: pcmk_cpg_membership: Node 1 joined group stonith-ng (counter=0.0)
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: info: pcmk_cpg_membership: Node 1 still member of group stonith-ng (peer=apache-up001.ring0, counter=0.0)
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: info: init_cib_cache_cb: Updating device list from the cib: init
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: info: cib_devices_update: Updating devices to version 0.67.0
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: debug: unpack_config: STONITH timeout: 60000
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: debug: unpack_config: STONITH of failed nodes is disabled
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: debug: unpack_config: Stop all active resources: false
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: debug: unpack_config: Cluster is symmetric - resources can run anywhere by default
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: debug: unpack_config: Default stickiness: 0
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: debug: unpack_config: On loss of CCM Quorum: Freeze resources
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: debug: unpack_config: Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: debug: cib_device_update: Device scsi is allowed on apache-up001.ring0: score=0
> May 18 23:01:17 [2525] apache-up001.itc4u.local cib: info: cib_file_write_with_digest: Wrote version 0.67.0 of the CIB to disk (digest: 5b97de0bbd872b90fec907f99372b729)
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_action_create: Initiating action metadata for agent fence_scsi (target=(null))
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: debug: internal_stonith_action_execute: forking
> May 18 23:01:17 [2526] apache-up001.itc4u.local stonith-ng: debug: internal_stonith_action_execute: sending args
> May 18 23:01:17 [2525] apache-up001.itc4u.local cib: info: cib_file_write_with_digest: Reading cluster configuration file /var/lib/pacemaker/cib/cib.vkUMOr (digest: /var/lib/pacemaker/cib/cib.rx0IDA)
> May 18 23:01:18 [2530] apache-up001.itc4u.local crmd: info: do_cib_control: CIB connection established
> May 18 23:01:18 [2530] apache-up001.itc4u.local crmd: notice: crm_cluster_connect: Connecting to cluster infrastructure: corosync
> May 18 23:01:18 [2530] apache-up001.itc4u.local crmd: info: crm_get_peer: Created entry 00ac8a1c-f978-4534-bdc9-6b7fcad8205b/0x1d419f0 for node apache-up001.ring0/1 (1 total)
> May 18 23:01:18 [2530] apache-up001.itc4u.local crmd: info: crm_get_peer: Node 1 is now known as apache-up001.ring0
> May 18 23:01:18 [2530] apache-up001.itc4u.local crmd: info: peer_update_callback: apache-up001.ring0 is now in unknown state
> May 18 23:01:18 [2530] apache-up001.itc4u.local crmd: info: crm_get_peer: Node 1 has uuid 1
> May 18 23:01:18 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_proc: cluster_connect_cpg: Node apache-up001.ring0[1] - corosync-cpg is now online
> May 18 23:01:18 [2530] apache-up001.itc4u.local crmd: info: peer_update_callback: Client apache-up001.ring0/peer now has status [online] (DC=<null>, changed=4000000)
> May 18 23:01:18 [2530] apache-up001.itc4u.local crmd: info: init_cs_connection_once: Connection to 'corosync': established
> May 18 23:01:18 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_connect: Connected to the CIB after 2 attempts
> May 18 23:01:18 [2528] apache-up001.itc4u.local attrd: info: main: CIB connection active
> May 18 23:01:18 [2528] apache-up001.itc4u.local attrd: info: pcmk_cpg_membership: Node 1 joined group attrd (counter=0.0)
> May 18 23:01:18 [2528] apache-up001.itc4u.local attrd: info: pcmk_cpg_membership: Node 1 still member of group attrd (peer=apache-up001.ring0, counter=0.0)
> May 18 23:01:18 [2530] apache-up001.itc4u.local crmd: notice: cluster_connect_quorum: Quorum lost
> May 18 23:01:18 [2526] apache-up001.itc4u.local stonith-ng: debug: internal_stonith_action_execute: result = 0
> May 18 23:01:18 [2526] apache-up001.itc4u.local stonith-ng: info: build_device_from_xml: The fencing device 'scsi' requires unfencing
> May 18 23:01:18 [2526] apache-up001.itc4u.local stonith-ng: info: build_device_from_xml: The fencing device 'scsi' requires actions (on) to be executed on the target node
> May 18 23:01:18 [2526] apache-up001.itc4u.local stonith-ng: notice: stonith_device_register: Added 'scsi' to the device list (1 active devices)
> May 18 23:01:18 [2530] apache-up001.itc4u.local crmd: info: crm_get_peer: Created entry b5ef1089-c2e2-4c72-8e12-2225fc4f4446/0x1f44110 for node apache-up002.ring0/2 (2 total)
> May 18 23:01:18 [2530] apache-up001.itc4u.local crmd: info: crm_get_peer: Node 2 is now known as apache-up002.ring0
> May 18 23:01:18 [2530] apache-up001.itc4u.local crmd: info: peer_update_callback: apache-up002.ring0 is now in unknown state
> May 18 23:01:18 [2530] apache-up001.itc4u.local crmd: info: crm_get_peer: Node 2 has uuid 2
> May 18 23:01:18 [2530] apache-up001.itc4u.local crmd: info: crm_get_peer: Created entry 6a7a2079-69de-445c-8999-788fa8a83575/0x1f442d0 for node apache-up003.ring0/3 (3 total)
> May 18 23:01:18 [2530] apache-up001.itc4u.local crmd: info: crm_get_peer: Node 3 is now known as apache-up003.ring0
> May 18 23:01:18 [2530] apache-up001.itc4u.local crmd: info: peer_update_callback: apache-up003.ring0 is now in unknown state
> May 18 23:01:18 [2530] apache-up001.itc4u.local crmd: info: crm_get_peer: Node 3 has uuid 3
> May 18 23:01:18 [2530] apache-up001.itc4u.local crmd: info: do_ha_control: Connected to the cluster
> May 18 23:01:18 [2530] apache-up001.itc4u.local crmd: info: lrmd_ipc_connect: Connecting to lrmd
> May 18 23:01:18 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section nodes to master (origin=local/crmd/6)
> May 18 23:01:18 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section nodes: OK (rc=0, origin=apache-up001.ring0/crmd/6, version=0.67.0)
> May 18 23:01:18 [2530] apache-up001.itc4u.local crmd: info: do_lrm_control: LRM connection established
> May 18 23:01:18 [2530] apache-up001.itc4u.local crmd: info: do_started: Delaying start, no membership data (0000000000100000)
> May 18 23:01:18 [2530] apache-up001.itc4u.local crmd: info: do_started: Delaying start, no membership data (0000000000100000)
> May 18 23:01:18 [2530] apache-up001.itc4u.local crmd: info: pcmk_quorum_notification: Membership 204: quorum still lost (1)
> May 18 23:01:18 [2530] apache-up001.itc4u.local crmd: notice: crm_update_peer_state_iter: pcmk_quorum_notification: Node apache-up001.ring0[1] - state is now member (was (null))
> May 18 23:01:18 [2530] apache-up001.itc4u.local crmd: info: peer_update_callback: apache-up001.ring0 is now member (was in unknown state)
> May 18 23:01:18 [2530] apache-up001.itc4u.local crmd: info: crm_reap_unseen_nodes: State of node apache-up002.ring0[2] is still unknown
> May 18 23:01:18 [2530] apache-up001.itc4u.local crmd: info: crm_reap_unseen_nodes: State of node apache-up003.ring0[3] is still unknown
> May 18 23:01:18 [2530] apache-up001.itc4u.local crmd: info: qb_ipcs_us_publish: server name: crmd
> May 18 23:01:18 [2530] apache-up001.itc4u.local crmd: notice: do_started: The local CRM is operational
> May 18 23:01:18 [2530] apache-up001.itc4u.local crmd: info: do_log: FSA: Input I_PENDING from do_started() received in state S_STARTING
> May 18 23:01:18 [2530] apache-up001.itc4u.local crmd: notice: do_state_transition: State transition S_STARTING -> S_PENDING [ input=I_PENDING cause=C_FSA_INTERNAL origin=do_started ]
> May 18 23:01:19 [2530] apache-up001.itc4u.local crmd: info: pcmk_cpg_membership: Node 1 joined group crmd (counter=0.0)
> May 18 23:01:19 [2530] apache-up001.itc4u.local crmd: info: pcmk_cpg_membership: Node 1 still member of group crmd (peer=apache-up001.ring0, counter=0.0)
> May 18 23:01:20 [2526] apache-up001.itc4u.local stonith-ng: debug: crm_client_new: Connecting 0x1fa0aa0 for uid=189 gid=189 pid=2530 id=78870da2-902c-4786-8f46-788a0476f5ae
> May 18 23:01:20 [2526] apache-up001.itc4u.local stonith-ng: debug: handle_new_connection: IPC credentials authenticated (2526-2530-10)
> May 18 23:01:20 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_ipcs_shm_connect: connecting to client [2530]
> May 18 23:01:20 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:131085; real_size:135168; rb->word_size:33792
> May 18 23:01:20 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:131085; real_size:135168; rb->word_size:33792
> May 18 23:01:20 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:131085; real_size:135168; rb->word_size:33792
> May 18 23:01:20 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processing register 12 from crmd.2530 ( 0)
> May 18 23:01:20 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processed register from crmd.2530: OK (0)
> May 18 23:01:20 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processing st_notify 13 from crmd.2530 ( 0)
> May 18 23:01:20 [2526] apache-up001.itc4u.local stonith-ng: debug: handle_request: Setting st_notify_disconnect callbacks for crmd.2530 (78870da2-902c-4786-8f46-788a0476f5ae): ON
> May 18 23:01:20 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processed st_notify from crmd.2530: OK (0)
> May 18 23:01:20 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processing st_notify 14 from crmd.2530 ( 0)
> May 18 23:01:20 [2526] apache-up001.itc4u.local stonith-ng: debug: handle_request: Setting st_notify_fence callbacks for crmd.2530 (78870da2-902c-4786-8f46-788a0476f5ae): ON
> May 18 23:01:20 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processed st_notify from crmd.2530: OK (0)
> May 18 23:01:39 [2530] apache-up001.itc4u.local crmd: info: crm_timer_popped: Election Trigger (I_DC_TIMEOUT) just popped (20000ms)
> May 18 23:01:39 [2530] apache-up001.itc4u.local crmd: warning: do_log: FSA: Input I_DC_TIMEOUT from crm_timer_popped() received in state S_PENDING
> May 18 23:01:39 [2530] apache-up001.itc4u.local crmd: info: do_state_transition: State transition S_PENDING -> S_ELECTION [ input=I_DC_TIMEOUT cause=C_TIMER_POPPED origin=crm_timer_popped ]
> May 18 23:01:39 [2530] apache-up001.itc4u.local crmd: info: election_complete: Election election-0 complete
> May 18 23:01:39 [2530] apache-up001.itc4u.local crmd: info: election_timeout_popped: Election failed: Declaring ourselves the winner
> May 18 23:01:39 [2530] apache-up001.itc4u.local crmd: info: do_log: FSA: Input I_ELECTION_DC from election_timeout_popped() received in state S_ELECTION
> May 18 23:01:39 [2530] apache-up001.itc4u.local crmd: notice: do_state_transition: State transition S_ELECTION -> S_INTEGRATION [ input=I_ELECTION_DC cause=C_TIMER_POPPED origin=election_timeout_popped ]
> May 18 23:01:39 [2530] apache-up001.itc4u.local crmd: info: do_te_control: Registering TE UUID: cf3499c5-bb9c-4d70-b209-80cc23a4ff1d
> May 18 23:01:39 [2530] apache-up001.itc4u.local crmd: info: set_graph_functions: Setting custom graph functions
> May 18 23:01:39 [2530] apache-up001.itc4u.local crmd: info: do_dc_takeover: Taking over DC status for this partition
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_process_readwrite: We are now in R/W mode
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_master operation for section 'all': OK (rc=0, origin=local/crmd/9, version=0.67.0)
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section cib to master (origin=local/crmd/10)
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section cib: OK (rc=0, origin=apache-up001.ring0/crmd/10, version=0.67.0)
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section crm_config to master (origin=local/crmd/12)
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section crm_config: OK (rc=0, origin=apache-up001.ring0/crmd/12, version=0.67.0)
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section crm_config to master (origin=local/crmd/14)
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section crm_config: OK (rc=0, origin=apache-up001.ring0/crmd/14, version=0.67.0)
> May 18 23:01:39 [2530] apache-up001.itc4u.local crmd: info: join_make_offer: Not making an offer to apache-up002.ring0: not active ((null))
> May 18 23:01:39 [2530] apache-up001.itc4u.local crmd: info: join_make_offer: Making join offers based on membership 204
> May 18 23:01:39 [2530] apache-up001.itc4u.local crmd: info: join_make_offer: join-1: Sending offer to apache-up001.ring0
> May 18 23:01:39 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_join: join_make_offer: Node apache-up001.ring0[1] - join-1 phase 0 -> 1
> May 18 23:01:39 [2530] apache-up001.itc4u.local crmd: info: join_make_offer: Not making an offer to apache-up003.ring0: not active ((null))
> May 18 23:01:39 [2530] apache-up001.itc4u.local crmd: info: do_dc_join_offer_all: join-1: Waiting on 1 outstanding join acks
> May 18 23:01:39 [2530] apache-up001.itc4u.local crmd: warning: do_log: FSA: Input I_ELECTION_DC from do_election_check() received in state S_INTEGRATION
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section crm_config to master (origin=local/crmd/16)
> May 18 23:01:39 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_join: initialize_join: Node apache-up001.ring0[1] - join-2 phase 1 -> 0
> May 18 23:01:39 [2530] apache-up001.itc4u.local crmd: info: join_make_offer: Not making an offer to apache-up002.ring0: not active ((null))
> May 18 23:01:39 [2530] apache-up001.itc4u.local crmd: info: join_make_offer: join-2: Sending offer to apache-up001.ring0
> May 18 23:01:39 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_join: join_make_offer: Node apache-up001.ring0[1] - join-2 phase 0 -> 1
> May 18 23:01:39 [2530] apache-up001.itc4u.local crmd: info: join_make_offer: Not making an offer to apache-up003.ring0: not active ((null))
> May 18 23:01:39 [2530] apache-up001.itc4u.local crmd: info: do_dc_join_offer_all: join-2: Waiting on 1 outstanding join acks
> May 18 23:01:39 [2530] apache-up001.itc4u.local crmd: info: update_dc: Set DC to apache-up001.ring0 (3.0.10)
> May 18 23:01:39 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_expected: update_dc: Node apache-up001.ring0[1] - expected state is now member (was (null))
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section crm_config: OK (rc=0, origin=apache-up001.ring0/crmd/16, version=0.67.0)
> May 18 23:01:39 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_join: do_dc_join_filter_offer: Node apache-up001.ring0[1] - join-2 phase 1 -> 2
> May 18 23:01:39 [2530] apache-up001.itc4u.local crmd: info: do_state_transition: State transition S_INTEGRATION -> S_FINALIZE_JOIN [ input=I_INTEGRATED cause=C_FSA_INTERNAL origin=check_join_state ]
> May 18 23:01:39 [2530] apache-up001.itc4u.local crmd: info: crmd_join_phase_log: join-2: apache-up002.ring0=none
> May 18 23:01:39 [2530] apache-up001.itc4u.local crmd: info: crmd_join_phase_log: join-2: apache-up001.ring0=integrated
> May 18 23:01:39 [2530] apache-up001.itc4u.local crmd: info: crmd_join_phase_log: join-2: apache-up003.ring0=none
> May 18 23:01:39 [2530] apache-up001.itc4u.local crmd: info: do_dc_join_finalize: join-2: Syncing our CIB to the rest of the cluster
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_process_replace: Digest matched on replace from apache-up001.ring0: 8e6725d36383bb22cadfa0d19591e850
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_process_replace: Replaced 0.67.0 with 0.67.0 from apache-up001.ring0
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_replace operation for section 'all': OK (rc=0, origin=apache-up001.ring0/crmd/20, version=0.67.0)
> May 18 23:01:39 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_join: finalize_join_for: Node apache-up001.ring0[1] - join-2 phase 2 -> 3
> May 18 23:01:39 [2530] apache-up001.itc4u.local crmd: info: erase_status_tag: Deleting xpath: //node_state[@uname='apache-up001.ring0']/transient_attributes
> May 18 23:01:39 [2530] apache-up001.itc4u.local crmd: info: update_attrd_helper: Connecting to attribute manager ... 5 retries remaining
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section nodes to master (origin=local/crmd/21)
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_delete operation for section //node_state[@uname='apache-up001.ring0']/transient_attributes to master (origin=local/crmd/22)
> May 18 23:01:39 [2528] apache-up001.itc4u.local attrd: info: attrd_client_update: Starting an election to determine the writer
> May 18 23:01:39 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_join: do_dc_join_ack: Node apache-up001.ring0[1] - join-2 phase 3 -> 4
> May 18 23:01:39 [2530] apache-up001.itc4u.local crmd: info: do_dc_join_ack: join-2: Updating node state to member for apache-up001.ring0
> May 18 23:01:39 [2530] apache-up001.itc4u.local crmd: info: erase_status_tag: Deleting xpath: //node_state[@uname='apache-up001.ring0']/lrm
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section nodes: OK (rc=0, origin=apache-up001.ring0/crmd/21, version=0.67.0)
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_delete operation for section //node_state[@uname='apache-up001.ring0']/transient_attributes: OK (rc=0, origin=apache-up001.ring0/crmd/22, version=0.67.0)
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_delete operation for section //node_state[@uname='apache-up001.ring0']/lrm to master (origin=local/crmd/23)
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section status to master (origin=local/crmd/24)
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_file_backup: Archived previous version as /var/lib/pacemaker/cib/cib-4.raw
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_file_write_with_digest: Wrote version 0.67.0 of the CIB to disk (digest: 1ebb1bf146e3fd8bd6b93261ee36c923)
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_delete operation for section //node_state[@uname='apache-up001.ring0']/lrm: OK (rc=0, origin=apache-up001.ring0/crmd/23, version=0.67.0)
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.67.0 2
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.67.1 (null)
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=1
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/status: <node_state id="1" uname="apache-up001.ring0" in_ccm="true" crmd="online" crm-debug-origin="do_lrm_query_internal" join="member" expected="member"/>
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <lrm id="1">
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <lrm_resources/>
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </lrm>
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </node_state>
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up001.ring0/crmd/24, version=0.67.1)
> May 18 23:01:39 [2530] apache-up001.itc4u.local crmd: info: do_state_transition: State transition S_FINALIZE_JOIN -> S_POLICY_ENGINE [ input=I_FINALIZED cause=C_FSA_INTERNAL origin=check_join_state ]
> May 18 23:01:39 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.67.1 to 0.67.0
> May 18 23:01:39 [2530] apache-up001.itc4u.local crmd: info: do_update_node_cib: Node update for apache-up002.ring0 cancelled: no state, not seen yet
> May 18 23:01:39 [2530] apache-up001.itc4u.local crmd: info: do_update_node_cib: Node update for apache-up003.ring0 cancelled: no state, not seen yet
> May 18 23:01:39 [2530] apache-up001.itc4u.local crmd: info: abort_transition_graph: Transition aborted: Peer Cancelled (source=do_te_invoke:161, 1)
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section nodes to master (origin=local/crmd/28)
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section status to master (origin=local/crmd/29)
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section cib to master (origin=local/crmd/30)
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section nodes: OK (rc=0, origin=apache-up001.ring0/crmd/28, version=0.67.1)
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.67.1 2
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.67.2 (null)
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=2
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib/status/node_state[@id='1']: @crm-debug-origin=do_state_transition
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up001.ring0/crmd/29, version=0.67.2)
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.67.2 2
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.67.3 (null)
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=3, @have-quorum=0, @dc-uuid=1
> May 18 23:01:39 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.67.2 to 0.67.1
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_file_write_with_digest: Reading cluster configuration file /var/lib/pacemaker/cib/cib.ssgBSm (digest: /var/lib/pacemaker/cib/cib.Xe5kpn)
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section cib: OK (rc=0, origin=apache-up001.ring0/crmd/30, version=0.67.3)
> May 18 23:01:39 [2528] apache-up001.itc4u.local attrd: info: attrd_client_refresh: Updating all attributes
> May 18 23:01:39 [2528] apache-up001.itc4u.local attrd: info: election_complete: Election election-attrd complete
> May 18 23:01:39 [2528] apache-up001.itc4u.local attrd: info: attrd_peer_update: Setting shutdown[apache-up001.ring0]: (null) -> 0 from apache-up001.ring0
> May 18 23:01:39 [2528] apache-up001.itc4u.local attrd: info: write_attribute: Sent update 2 with 1 changes for shutdown, id=<n/a>, set=(null)
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section status to master (origin=local/attrd/2)
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.67.3 2
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.67.4 (null)
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=4
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/status/node_state[@id='1']: <transient_attributes id="1"/>
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <instance_attributes id="status-1">
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <nvpair id="status-1-shutdown" name="shutdown" value="0"/>
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </instance_attributes>
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </transient_attributes>
> May 18 23:01:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up001.ring0/attrd/2, version=0.67.4)
> May 18 23:01:39 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 2 for shutdown: OK (0)
> May 18 23:01:39 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 2 for shutdown[apache-up001.ring0]=0: OK (0)
> May 18 23:01:39 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.67.3 to 0.67.2
> May 18 23:01:39 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.67.4 to 0.67.3
> May 18 23:01:40 [2530] apache-up001.itc4u.local crmd: info: abort_transition_graph: Transition aborted by transient_attributes.1 'create': Transient attribute change (cib=0.67.4, source=abort_unless_down:329, path=/cib/status/node_state[@id='1'], 1)
> May 18 23:01:40 [2529] apache-up001.itc4u.local pengine: notice: cluster_status: We do not have quorum - fencing and resource management disabled
> May 18 23:01:40 [2529] apache-up001.itc4u.local pengine: info: determine_online_status: Node apache-up001.ring0 is online
> May 18 23:01:40 [2529] apache-up001.itc4u.local pengine: info: native_print: scsi (stonith:fence_scsi): Stopped
> May 18 23:01:40 [2529] apache-up001.itc4u.local pengine: notice: trigger_unfencing: Unfencing apache-up001.ring0: node discovery
> May 18 23:01:40 [2529] apache-up001.itc4u.local pengine: notice: LogActions: Start scsi (apache-up001.ring0 - blocked)
> May 18 23:01:40 [2529] apache-up001.itc4u.local pengine: notice: process_pe_message: Calculated Transition 0: /var/lib/pacemaker/pengine/pe-input-122.bz2
> May 18 23:01:40 [2530] apache-up001.itc4u.local crmd: info: do_state_transition: State transition S_POLICY_ENGINE -> S_TRANSITION_ENGINE [ input=I_PE_SUCCESS cause=C_IPC_MESSAGE origin=handle_response ]
> May 18 23:01:40 [2530] apache-up001.itc4u.local crmd: info: do_te_invoke: Processing graph 0 (ref=pe_calc-dc-1463605300-9) derived from /var/lib/pacemaker/pengine/pe-input-122.bz2
> May 18 23:01:40 [2530] apache-up001.itc4u.local crmd: notice: te_fence_node: Executing on fencing operation (3) on apache-up001.ring0 (timeout=60000)
> May 18 23:01:40 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processing st_fence 44 from crmd.2530 ( 2)
> May 18 23:01:40 [2526] apache-up001.itc4u.local stonith-ng: notice: handle_request: Client crmd.2530.78870da2 wants to fence (on) 'apache-up001.ring0' with device '(any)'
> May 18 23:01:40 [2526] apache-up001.itc4u.local stonith-ng: notice: initiate_remote_stonith_op: Initiating remote operation on for apache-up001.ring0: 49b08346-e9ee-497b-a556-fa71b45bf0ff (0)
> May 18 23:01:40 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:1048589; real_size:1052672; rb->word_size:263168
> May 18 23:01:40 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:1048589; real_size:1052672; rb->word_size:263168
> May 18 23:01:40 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:1048589; real_size:1052672; rb->word_size:263168
> May 18 23:01:40 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_ipcc_disconnect: qb_ipcc_disconnect()
> May 18 23:01:40 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_close: Closing ringbuffer: /dev/shm/qb-cmap-request-2429-2526-30-header
> May 18 23:01:40 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_close: Closing ringbuffer: /dev/shm/qb-cmap-response-2429-2526-30-header
> May 18 23:01:40 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_close: Closing ringbuffer: /dev/shm/qb-cmap-event-2429-2526-30-header
> May 18 23:01:40 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processed st_fence from crmd.2530: Operation now in progress (-115)
> May 18 23:01:40 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processing st_query 0 from apache-up001.ring0 ( 2)
> May 18 23:01:40 [2526] apache-up001.itc4u.local stonith-ng: debug: create_remote_stonith_op: 49b08346-e9ee-497b-a556-fa71b45bf0ff already exists
> May 18 23:01:40 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query <stonith_command __name__="stonith_command" t="stonith-ng" st_async_id="49b08346-e9ee-497b-a556-fa71b45bf0ff" st_op="st_query" st_callid="2" st_callopt="2" st_remote_op="49b08346-e9ee-497b-a556-fa71b45bf0ff" st_target="apache-up001.ring0" st_device_action="on" st_origin="apache-up001.ring0" st_clientid="78870da2-902c-4786-8f46-788a0476f5ae" st_clientname="crmd.2530" st_timeout="60" src="apache-up001.ring0"/>
> May 18 23:01:40 [2526] apache-up001.itc4u.local stonith-ng: debug: get_capable_devices: Searching through 1 devices to see what is capable of action (on) for target apache-up001.ring0
> May 18 23:01:40 [2526] apache-up001.itc4u.local stonith-ng: debug: schedule_stonith_command: Scheduling list on scsi for stonith-ng (timeout=60s)
> May 18 23:01:40 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processed st_query from apache-up001.ring0: OK (0)
> May 18 23:01:40 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_action_create: Initiating action list for agent fence_scsi (target=(null))
> May 18 23:01:40 [2526] apache-up001.itc4u.local stonith-ng: debug: internal_stonith_action_execute: forking
> May 18 23:01:40 [2526] apache-up001.itc4u.local stonith-ng: debug: internal_stonith_action_execute: sending args
> May 18 23:01:40 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_device_execute: Operation list on scsi now running with pid=5786, timeout=60s
> May 18 23:01:40 [2526] apache-up001.itc4u.local stonith-ng: debug: child_waitpid: wait(5786) = 0: Resource temporarily unavailable (11)
> May 18 23:01:40 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_action_async_done: Child process 5786 performing action 'list' exited with rc 1
> May 18 23:01:40 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[5786] stderr: [ Failed: nodename or key is required ]
> May 18 23:01:40 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[5786] stderr: [ ]
> May 18 23:01:40 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[5786] stderr: [ Please use '-h' for usage ]
> May 18 23:01:40 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[5786] stderr: [ ]
> May 18 23:01:40 [2526] apache-up001.itc4u.local stonith-ng: info: internal_stonith_action_execute: Attempt 2 to execute fence_scsi (list). remaining timeout is 60
> May 18 23:01:40 [2526] apache-up001.itc4u.local stonith-ng: debug: internal_stonith_action_execute: forking
> May 18 23:01:40 [2526] apache-up001.itc4u.local stonith-ng: debug: internal_stonith_action_execute: sending args
> May 18 23:01:40 [2526] apache-up001.itc4u.local stonith-ng: debug: child_waitpid: wait(5791) = 0: Success (0)
> May 18 23:01:41 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_action_async_done: Child process 5791 performing action 'list' exited with rc 1
> May 18 23:01:41 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[5791] stderr: [ Failed: nodename or key is required ]
> May 18 23:01:41 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[5791] stderr: [ ]
> May 18 23:01:41 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[5791] stderr: [ Please use '-h' for usage ]
> May 18 23:01:41 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[5791] stderr: [ ]
> May 18 23:01:41 [2526] apache-up001.itc4u.local stonith-ng: info: update_remaining_timeout: Attempted to execute agent fence_scsi (list) the maximum number of times (2) allowed
> May 18 23:01:41 [2526] apache-up001.itc4u.local stonith-ng: notice: dynamic_list_search_cb: Disabling port list queries for scsi (-201): (null)
> May 18 23:01:41 [2526] apache-up001.itc4u.local stonith-ng: debug: search_devices_record_result: Finished Search. 0 devices can perform action (on) on node apache-up001.ring0
> May 18 23:01:41 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query_capable_device_cb: Found 0 matching devices for 'apache-up001.ring0'
> May 18 23:01:41 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processing st_query reply 0 from apache-up001.ring0 ( 2)
> May 18 23:01:41 [2526] apache-up001.itc4u.local stonith-ng: info: process_remote_stonith_query: Query result 1 of 1 from apache-up001.ring0 for apache-up001.ring0/on (0 devices) 49b08346-e9ee-497b-a556-fa71b45bf0ff
> May 18 23:01:41 [2526] apache-up001.itc4u.local stonith-ng: info: process_remote_stonith_query: All query replies have arrived, continuing (1 expected/1 received)
> May 18 23:01:41 [2526] apache-up001.itc4u.local stonith-ng: notice: stonith_choose_peer: Couldn't find anyone to fence (on) apache-up001.ring0 with any device
> May 18 23:01:41 [2526] apache-up001.itc4u.local stonith-ng: info: call_remote_stonith: Total remote op timeout set to 60 for fencing of node apache-up001.ring0 for crmd.2530.49b08346
> May 18 23:01:41 [2526] apache-up001.itc4u.local stonith-ng: info: call_remote_stonith: None of the 1 peers have devices capable of fencing (on) apache-up001.ring0 for crmd.2530 (0)
> May 18 23:01:41 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processed st_query reply from apache-up001.ring0: OK (0)
> May 18 23:01:41 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processing st_notify reply 0 from apache-up001.ring0 ( 0)
> May 18 23:01:41 [2526] apache-up001.itc4u.local stonith-ng: debug: process_remote_stonith_exec: Marking call to on for apache-up001.ring0 on behalf of crmd.2530 at 49b08346-e9ee-497b-a556-fa71b45bf0ff.apache-u: No such device (-19)
> May 18 23:01:41 [2526] apache-up001.itc4u.local stonith-ng: error: remote_op_done: Operation on of apache-up001.ring0 by <no-one> for crmd.2530 at apache-up001.ring0.49b08346: No such device
> May 18 23:01:41 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processed st_notify reply from apache-up001.ring0: OK (0)
> May 18 23:01:41 [2530] apache-up001.itc4u.local crmd: notice: tengine_stonith_callback: Stonith operation 2/3:0:0:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d: No such device (-19)
> May 18 23:01:41 [2530] apache-up001.itc4u.local crmd: notice: tengine_stonith_callback: Stonith operation 2 for apache-up001.ring0 failed (No such device): aborting transition.
> May 18 23:01:41 [2530] apache-up001.itc4u.local crmd: notice: abort_transition_graph: Transition aborted: Stonith failed (source=tengine_stonith_callback:748, 0)
> May 18 23:01:41 [2530] apache-up001.itc4u.local crmd: error: tengine_stonith_notify: Unfencing of apache-up001.ring0 by <anyone> failed: No such device (-19)
> May 18 23:01:41 [2530] apache-up001.itc4u.local crmd: notice: te_rsc_command: Initiating action 2: monitor scsi_monitor_0 on apache-up001.ring0 (local)
> May 18 23:01:41 [2527] apache-up001.itc4u.local lrmd: info: process_lrmd_get_rsc_info: Resource 'scsi' not found (0 active resources)
> May 18 23:01:41 [2527] apache-up001.itc4u.local lrmd: info: process_lrmd_rsc_register: Added 'scsi' to the rsc list (1 active resources)
> May 18 23:01:41 [2530] apache-up001.itc4u.local crmd: info: do_lrm_rsc_op: Performing key=2:0:7:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d op=scsi_monitor_0
> May 18 23:01:41 [2526] apache-up001.itc4u.local stonith-ng: debug: crm_client_new: Connecting 0x1f9a180 for uid=0 gid=0 pid=2527 id=2ae02530-4f1c-4c84-8ad5-a40e1acdc866
> May 18 23:01:41 [2526] apache-up001.itc4u.local stonith-ng: debug: handle_new_connection: IPC credentials authenticated (2526-2527-11)
> May 18 23:01:41 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_ipcs_shm_connect: connecting to client [2527]
> May 18 23:01:41 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:131085; real_size:135168; rb->word_size:33792
> May 18 23:01:41 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:131085; real_size:135168; rb->word_size:33792
> May 18 23:01:41 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:131085; real_size:135168; rb->word_size:33792
> May 18 23:01:41 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processing register 1 from lrmd.2527 ( 0)
> May 18 23:01:41 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processed register from lrmd.2527: OK (0)
> May 18 23:01:41 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processing st_notify 2 from lrmd.2527 ( 0)
> May 18 23:01:41 [2526] apache-up001.itc4u.local stonith-ng: debug: handle_request: Setting st_notify_disconnect callbacks for lrmd.2527 (2ae02530-4f1c-4c84-8ad5-a40e1acdc866): ON
> May 18 23:01:41 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processed st_notify from lrmd.2527: OK (0)
> May 18 23:01:42 [2530] apache-up001.itc4u.local crmd: notice: process_lrm_event: Operation scsi_monitor_0: not running (node=apache-up001.ring0, call=5, rc=7, cib-update=33, confirmed=true)
> May 18 23:01:42 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section status to master (origin=local/crmd/33)
> May 18 23:01:42 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.67.4 2
> May 18 23:01:42 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.67.5 (null)
> May 18 23:01:42 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=5
> May 18 23:01:42 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib/status/node_state[@id='1']: @crm-debug-origin=do_update_resource
> May 18 23:01:42 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/status/node_state[@id='1']/lrm[@id='1']/lrm_resources: <lrm_resource id="scsi" type="fence_scsi" class="stonith"/>
> May 18 23:01:42 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <lrm_rsc_op id="scsi_last_0" operation_key="scsi_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.10" transition-key="2:0:7:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d" transition-magic="0:7;2:0:7:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d" on_node="apache-up001.ring0" call-id="5" rc-code="7" op-status="0" interval="0" last-run="
> May 18 23:01:42 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </lrm_resource>
> May 18 23:01:42 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up001.ring0/crmd/33, version=0.67.5)
> May 18 23:01:42 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.67.5 to 0.67.4
> May 18 23:01:42 [2530] apache-up001.itc4u.local crmd: info: match_graph_event: Action scsi_monitor_0 (2) confirmed on apache-up001.ring0 (rc=7)
> May 18 23:01:42 [2530] apache-up001.itc4u.local crmd: notice: run_graph: Transition 0 (Complete=2, Pending=0, Fired=0, Skipped=0, Incomplete=0, Source=/var/lib/pacemaker/pengine/pe-input-122.bz2): Complete
> May 18 23:01:42 [2530] apache-up001.itc4u.local crmd: notice: too_many_st_failures: No devices found in cluster to fence apache-up001.ring0, giving up
> May 18 23:01:42 [2530] apache-up001.itc4u.local crmd: info: do_log: FSA: Input I_TE_SUCCESS from notify_crmd() received in state S_TRANSITION_ENGINE
> May 18 23:01:42 [2530] apache-up001.itc4u.local crmd: notice: do_state_transition: State transition S_TRANSITION_ENGINE -> S_IDLE [ input=I_TE_SUCCESS cause=C_FSA_INTERNAL origin=notify_crmd ]
> May 18 23:01:47 [2525] apache-up001.itc4u.local cib: info: cib_process_ping: Reporting our current digest to apache-up001.ring0: 4e3cec6d26ac0bbe8cd5a3d19776950c for 0.67.5 (0x12fdcb0 0)
> May 18 23:01:48 [2530] apache-up001.itc4u.local crmd: info: crm_procfs_pid_of: Found cib active as process 2525
> May 18 23:01:48 [2530] apache-up001.itc4u.local crmd: info: throttle_send_command: New throttle mode: 0000 (was ffffffff)
> May 18 23:03:48 [2526] apache-up001.itc4u.local stonith-ng: debug: crm_client_new: Connecting 0x1fe0950 for uid=0 gid=0 pid=5872 id=191778ef-2eec-473a-b31a-8363dcf16fef
> May 18 23:03:48 [2526] apache-up001.itc4u.local stonith-ng: debug: handle_new_connection: IPC credentials authenticated (2526-5872-12)
> May 18 23:03:48 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_ipcs_shm_connect: connecting to client [5872]
> May 18 23:03:48 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:131085; real_size:135168; rb->word_size:33792
> May 18 23:03:48 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:131085; real_size:135168; rb->word_size:33792
> May 18 23:03:48 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:131085; real_size:135168; rb->word_size:33792
> May 18 23:03:48 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processing register 1 from stonith_admin.5872 ( 0)
> May 18 23:03:48 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processed register from stonith_admin.5872: OK (0)
> May 18 23:03:48 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processing st_query 2 from stonith_admin.5872 ( 1002)
> May 18 23:03:48 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query <stonith_command __name__="stonith_command" t="stonith-ng" st_async_id="191778ef-2eec-473a-b31a-8363dcf16fef" st_op="st_query" st_callid="2" st_callopt="4098" st_timeout="120" st_clientid="191778ef-2eec-473a-b31a-8363dcf16fef" st_clientname="stonith_admin.5872" st_clientnode="apache-up001.ring0">
> May 18 23:03:48 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query <st_calldata>
> May 18 23:03:48 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query <st_device_id st_origin="stonith_api_query" st_device_action="off"/>
> May 18 23:03:48 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query </st_calldata>
> May 18 23:03:48 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query </stonith_command>
> May 18 23:03:48 [2526] apache-up001.itc4u.local stonith-ng: debug: get_capable_devices: Searching through 1 devices to see what is capable of action (off) for target <anyone>
> May 18 23:03:48 [2526] apache-up001.itc4u.local stonith-ng: debug: search_devices_record_result: Finished Search. 1 devices can perform action (off) on node <anyone>
> May 18 23:03:48 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query_capable_device_cb: 1 devices installed
> May 18 23:03:48 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processed st_query from stonith_admin.5872: OK (0)
> May 18 23:03:48 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_ipcs_dispatch_connection_request: HUP conn (2526-5872-12)
> May 18 23:03:48 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_ipcs_disconnect: qb_ipcs_disconnect(2526-5872-12) state:2
> May 18 23:03:48 [2526] apache-up001.itc4u.local stonith-ng: debug: crm_client_destroy: Destroying 0 events
> May 18 23:03:48 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_close: Free'ing ringbuffer: /dev/shm/qb-stonith-ng-response-2526-5872-12-header
> May 18 23:03:48 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_close: Free'ing ringbuffer: /dev/shm/qb-stonith-ng-event-2526-5872-12-header
> May 18 23:03:48 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_close: Free'ing ringbuffer: /dev/shm/qb-stonith-ng-request-2526-5872-12-header
> May 18 23:04:48 [2530] apache-up001.itc4u.local crmd: info: throttle_send_command: New throttle mode: 0001 (was 0000)
> May 18 23:07:04 [2526] apache-up001.itc4u.local stonith-ng: debug: crm_client_new: Connecting 0x1fe1140 for uid=0 gid=0 pid=5993 id=6a6b2fbe-c78a-4bf2-b703-39b6e3f7d90c
> May 18 23:07:04 [2526] apache-up001.itc4u.local stonith-ng: debug: handle_new_connection: IPC credentials authenticated (2526-5993-12)
> May 18 23:07:04 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_ipcs_shm_connect: connecting to client [5993]
> May 18 23:07:04 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:131085; real_size:135168; rb->word_size:33792
> May 18 23:07:04 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:131085; real_size:135168; rb->word_size:33792
> May 18 23:07:04 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:131085; real_size:135168; rb->word_size:33792
> May 18 23:07:04 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processing register 1 from stonith_admin.5993 ( 0)
> May 18 23:07:04 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processed register from stonith_admin.5993: OK (0)
> May 18 23:07:04 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processing st_query 2 from stonith_admin.5993 ( 1002)
> May 18 23:07:04 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query <stonith_command __name__="stonith_command" t="stonith-ng" st_async_id="6a6b2fbe-c78a-4bf2-b703-39b6e3f7d90c" st_op="st_query" st_callid="2" st_callopt="4098" st_timeout="120" st_clientid="6a6b2fbe-c78a-4bf2-b703-39b6e3f7d90c" st_clientname="stonith_admin.5993" st_clientnode="apache-up001.ring0">
> May 18 23:07:04 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query <st_calldata>
> May 18 23:07:04 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query <st_device_id st_origin="stonith_api_query" st_device_action="off"/>
> May 18 23:07:04 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query </st_calldata>
> May 18 23:07:04 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query </stonith_command>
> May 18 23:07:04 [2526] apache-up001.itc4u.local stonith-ng: debug: get_capable_devices: Searching through 1 devices to see what is capable of action (off) for target <anyone>
> May 18 23:07:04 [2526] apache-up001.itc4u.local stonith-ng: debug: search_devices_record_result: Finished Search. 1 devices can perform action (off) on node <anyone>
> May 18 23:07:04 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query_capable_device_cb: 1 devices installed
> May 18 23:07:04 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processed st_query from stonith_admin.5993: OK (0)
> May 18 23:07:04 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_ipcs_dispatch_connection_request: HUP conn (2526-5993-12)
> May 18 23:07:04 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_ipcs_disconnect: qb_ipcs_disconnect(2526-5993-12) state:2
> May 18 23:07:04 [2526] apache-up001.itc4u.local stonith-ng: debug: crm_client_destroy: Destroying 0 events
> May 18 23:07:04 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_close: Free'ing ringbuffer: /dev/shm/qb-stonith-ng-response-2526-5993-12-header
> May 18 23:07:04 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_close: Free'ing ringbuffer: /dev/shm/qb-stonith-ng-event-2526-5993-12-header
> May 18 23:07:04 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_close: Free'ing ringbuffer: /dev/shm/qb-stonith-ng-request-2526-5993-12-header
> [2410] apache-up001.itc4u.local corosync notice  [TOTEM ] A new membership (192.168.15.7:216) was formed. Members joined: 3
> May 18 23:07:35 [2524] apache-up001.itc4u.local pacemakerd: notice: pcmk_quorum_notification: Membership 204: quorum acquired (1)
> May 18 23:07:35 [2524] apache-up001.itc4u.local pacemakerd: info: crm_reap_unseen_nodes: State of node apache-up003.ring0[3] is still unknown
> May 18 23:07:35 [2524] apache-up001.itc4u.local pacemakerd: info: crm_reap_unseen_nodes: State of node apache-up002.ring0[2] is still unknown
> May 18 23:07:35 [2530] apache-up001.itc4u.local crmd: notice: pcmk_quorum_notification: Membership 204: quorum acquired (1)
> May 18 23:07:35 [2530] apache-up001.itc4u.local crmd: info: crm_reap_unseen_nodes: State of node apache-up002.ring0[2] is still unknown
> May 18 23:07:35 [2530] apache-up001.itc4u.local crmd: info: crm_reap_unseen_nodes: State of node apache-up003.ring0[3] is still unknown
> May 18 23:07:35 [2530] apache-up001.itc4u.local crmd: info: do_update_node_cib: Node update for apache-up002.ring0 cancelled: no state, not seen yet
> May 18 23:07:35 [2530] apache-up001.itc4u.local crmd: info: do_update_node_cib: Node update for apache-up003.ring0 cancelled: no state, not seen yet
> [2410] apache-up001.itc4u.local corosync notice  [QUORUM] This node is within the primary component and will provide service.
> [2410] apache-up001.itc4u.local corosync notice  [QUORUM] Members[1]: 1
> [2410] apache-up001.itc4u.local corosync notice  [QUORUM] Members[2]: 3 1
> [2410] apache-up001.itc4u.local corosync notice  [MAIN  ] Completed service synchronization, ready to provide service.
> May 18 23:07:35 [2530] apache-up001.itc4u.local crmd: info: pcmk_quorum_notification: Membership 216: quorum retained (2)
> May 18 23:07:35 [2530] apache-up001.itc4u.local crmd: notice: crm_update_peer_state_iter: pcmk_quorum_notification: Node apache-up003.ring0[3] - state is now member (was (null))
> May 18 23:07:35 [2530] apache-up001.itc4u.local crmd: info: peer_update_callback: apache-up003.ring0 is now member (was in unknown state)
> May 18 23:07:35 [2530] apache-up001.itc4u.local crmd: info: crm_reap_unseen_nodes: State of node apache-up002.ring0[2] is still unknown
> May 18 23:07:35 [2530] apache-up001.itc4u.local crmd: info: do_update_node_cib: Node update for apache-up002.ring0 cancelled: no state, not seen yet
> May 18 23:07:35 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section cib to master (origin=local/crmd/34)
> May 18 23:07:35 [2524] apache-up001.itc4u.local pacemakerd: info: pcmk_quorum_notification: Membership 216: quorum retained (2)
> May 18 23:07:35 [2524] apache-up001.itc4u.local pacemakerd: notice: crm_update_peer_state_iter: pcmk_quorum_notification: Node apache-up003.ring0[3] - state is now member (was (null))
> May 18 23:07:35 [2524] apache-up001.itc4u.local pacemakerd: info: crm_reap_unseen_nodes: State of node apache-up002.ring0[2] is still unknown
> May 18 23:07:35 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section nodes to master (origin=local/crmd/38)
> May 18 23:07:35 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.67.5 2
> May 18 23:07:35 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.67.6 b6ff870bdbeb2900727d94f33c04bb67
> May 18 23:07:35 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=6, @have-quorum=1
> May 18 23:07:35 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section cib: OK (rc=0, origin=apache-up001.ring0/crmd/34, version=0.67.6)
> May 18 23:07:35 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.67.6 to 0.67.5
> May 18 23:07:35 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section nodes: OK (rc=0, origin=apache-up001.ring0/crmd/38, version=0.67.6)
> May 18 23:07:35 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section status to master (origin=local/crmd/39)
> May 18 23:07:35 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section status to master (origin=local/crmd/40)
> May 18 23:07:35 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section nodes to master (origin=local/crmd/44)
> May 18 23:07:35 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section status to master (origin=local/crmd/45)
> [2410] apache-up001.itc4u.local corosync notice  [TOTEM ] A new membership (192.168.15.7:224) was formed. Members joined: 2
> May 18 23:07:37 [2530] apache-up001.itc4u.local crmd: info: pcmk_quorum_notification: Membership 224: quorum retained (3)
> May 18 23:07:37 [2530] apache-up001.itc4u.local crmd: notice: crm_update_peer_state_iter: pcmk_quorum_notification: Node apache-up002.ring0[2] - state is now member (was (null))
> May 18 23:07:37 [2530] apache-up001.itc4u.local crmd: info: peer_update_callback: apache-up002.ring0 is now member (was in unknown state)
> May 18 23:07:37 [2530] apache-up001.itc4u.local crmd: info: pcmk_cpg_membership: Node 2 joined group crmd (counter=1.0)
> May 18 23:07:37 [2530] apache-up001.itc4u.local crmd: info: pcmk_cpg_membership: Node 1 still member of group crmd (peer=apache-up001.ring0, counter=1.0)
> May 18 23:07:37 [2530] apache-up001.itc4u.local crmd: info: pcmk_cpg_membership: Node 2 still member of group crmd (peer=apache-up002.ring0, counter=1.1)
> May 18 23:07:37 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_proc: pcmk_cpg_membership: Node apache-up002.ring0[2] - corosync-cpg is now online
> May 18 23:07:37 [2530] apache-up001.itc4u.local crmd: info: peer_update_callback: Client apache-up002.ring0/peer now has status [online] (DC=true, changed=4000000)
> May 18 23:07:37 [2530] apache-up001.itc4u.local crmd: notice: do_state_transition: State transition S_IDLE -> S_INTEGRATION [ input=I_NODE_JOIN cause=C_FSA_INTERNAL origin=peer_update_callback ]
> May 18 23:07:37 [2530] apache-up001.itc4u.local crmd: info: do_dc_join_offer_one: An unknown node joined - (re-)offer to any unconfirmed nodes
> May 18 23:07:37 [2530] apache-up001.itc4u.local crmd: info: join_make_offer: Making join offers based on membership 224
> May 18 23:07:37 [2530] apache-up001.itc4u.local crmd: info: join_make_offer: join-2: Sending offer to apache-up002.ring0
> May 18 23:07:37 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_join: join_make_offer: Node apache-up002.ring0[2] - join-2 phase 0 -> 1
> May 18 23:07:37 [2530] apache-up001.itc4u.local crmd: info: join_make_offer: Skipping apache-up001.ring0: already known 4
> May 18 23:07:37 [2530] apache-up001.itc4u.local crmd: info: join_make_offer: Not making an offer to apache-up003.ring0: not active (member)
> May 18 23:07:37 [2530] apache-up001.itc4u.local crmd: info: abort_transition_graph: Transition aborted: Peer Halt (source=do_te_invoke:168, 1)
> May 18 23:07:37 [2530] apache-up001.itc4u.local crmd: info: pcmk_cpg_membership: Node 3 joined group crmd (counter=2.0)
> May 18 23:07:37 [2530] apache-up001.itc4u.local crmd: info: pcmk_cpg_membership: Node 1 still member of group crmd (peer=apache-up001.ring0, counter=2.0)
> May 18 23:07:37 [2530] apache-up001.itc4u.local crmd: info: pcmk_cpg_membership: Node 2 still member of group crmd (peer=apache-up002.ring0, counter=2.1)
> May 18 23:07:37 [2530] apache-up001.itc4u.local crmd: info: pcmk_cpg_membership: Node 3 still member of group crmd (peer=apache-up003.ring0, counter=2.2)
> May 18 23:07:37 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_proc: pcmk_cpg_membership: Node apache-up003.ring0[3] - corosync-cpg is now online
> May 18 23:07:37 [2530] apache-up001.itc4u.local crmd: info: peer_update_callback: Client apache-up003.ring0/peer now has status [online] (DC=true, changed=4000000)
> May 18 23:07:37 [2530] apache-up001.itc4u.local crmd: info: do_dc_join_offer_one: An unknown node joined - (re-)offer to any unconfirmed nodes
> May 18 23:07:37 [2530] apache-up001.itc4u.local crmd: info: join_make_offer: Skipping apache-up002.ring0: already known 1
> May 18 23:07:37 [2530] apache-up001.itc4u.local crmd: info: join_make_offer: Skipping apache-up001.ring0: already known 4
> May 18 23:07:37 [2530] apache-up001.itc4u.local crmd: info: join_make_offer: join-2: Sending offer to apache-up003.ring0
> May 18 23:07:37 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_join: join_make_offer: Node apache-up003.ring0[3] - join-2 phase 0 -> 1
> May 18 23:07:37 [2526] apache-up001.itc4u.local stonith-ng: info: pcmk_cpg_membership: Node 2 joined group stonith-ng (counter=1.0)
> May 18 23:07:37 [2526] apache-up001.itc4u.local stonith-ng: info: pcmk_cpg_membership: Node 1 still member of group stonith-ng (peer=apache-up001.ring0, counter=1.0)
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: pcmk_cpg_membership: Node 2 joined group cib (counter=1.0)
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: pcmk_cpg_membership: Node 1 still member of group cib (peer=apache-up001.ring0, counter=1.0)
> May 18 23:07:37 [2528] apache-up001.itc4u.local attrd: info: pcmk_cpg_membership: Node 2 joined group attrd (counter=1.0)
> May 18 23:07:37 [2528] apache-up001.itc4u.local attrd: info: pcmk_cpg_membership: Node 1 still member of group attrd (peer=apache-up001.ring0, counter=1.0)
> May 18 23:07:37 [2524] apache-up001.itc4u.local pacemakerd: info: pcmk_quorum_notification: Membership 224: quorum retained (3)
> May 18 23:07:37 [2524] apache-up001.itc4u.local pacemakerd: notice: crm_update_peer_state_iter: pcmk_quorum_notification: Node apache-up002.ring0[2] - state is now member (was (null))
> May 18 23:07:37 [2524] apache-up001.itc4u.local pacemakerd: info: pcmk_cpg_membership: Node 2 joined group pacemakerd (counter=1.0)
> May 18 23:07:37 [2524] apache-up001.itc4u.local pacemakerd: info: pcmk_cpg_membership: Node 1 still member of group pacemakerd (peer=apache-up001.ring0, counter=1.0)
> May 18 23:07:37 [2524] apache-up001.itc4u.local pacemakerd: info: pcmk_cpg_membership: Node 2 still member of group pacemakerd (peer=apache-up002.ring0, counter=1.1)
> May 18 23:07:37 [2524] apache-up001.itc4u.local pacemakerd: info: crm_update_peer_proc: pcmk_cpg_membership: Node apache-up002.ring0[2] - corosync-cpg is now online
> May 18 23:07:37 [2524] apache-up001.itc4u.local pacemakerd: info: pcmk_cpg_membership: Node 3 joined group pacemakerd (counter=2.0)
> May 18 23:07:37 [2524] apache-up001.itc4u.local pacemakerd: info: pcmk_cpg_membership: Node 1 still member of group pacemakerd (peer=apache-up001.ring0, counter=2.0)
> May 18 23:07:37 [2524] apache-up001.itc4u.local pacemakerd: info: pcmk_cpg_membership: Node 2 still member of group pacemakerd (peer=apache-up002.ring0, counter=2.1)
> May 18 23:07:37 [2524] apache-up001.itc4u.local pacemakerd: info: pcmk_cpg_membership: Node 3 still member of group pacemakerd (peer=apache-up003.ring0, counter=2.2)
> May 18 23:07:37 [2524] apache-up001.itc4u.local pacemakerd: info: crm_update_peer_proc: pcmk_cpg_membership: Node apache-up003.ring0[3] - corosync-cpg is now online
> May 18 23:07:37 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:1048589; real_size:1052672; rb->word_size:263168
> May 18 23:07:37 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:1048589; real_size:1052672; rb->word_size:263168
> May 18 23:07:37 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:1048589; real_size:1052672; rb->word_size:263168
> [2410] apache-up001.itc4u.local corosync notice  [QUORUM] Members[3]: 3 2 1
> [2410] apache-up001.itc4u.local corosync notice  [MAIN  ] Completed service synchronization, ready to provide service.
> May 18 23:07:37 [2524] apache-up001.itc4u.local pacemakerd: info: mcp_cpg_deliver: Ignoring process list sent by peer for local node
> May 18 23:07:37 [2524] apache-up001.itc4u.local pacemakerd: info: mcp_cpg_deliver: Ignoring process list sent by peer for local node
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: crm_get_peer: Created entry 295bdf8f-6bdf-4880-8bb1-df2de877350a/0x126e090 for node apache-up002.ring0/2 (2 total)
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: crm_get_peer: Node 2 is now known as apache-up002.ring0
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: crm_get_peer: Node 2 has uuid 2
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: pcmk_cpg_membership: Node 2 still member of group cib (peer=apache-up002.ring0, counter=1.1)
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: crm_update_peer_proc: pcmk_cpg_membership: Node apache-up002.ring0[2] - corosync-cpg is now online
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: notice: crm_update_peer_state_iter: crm_update_peer_proc: Node apache-up002.ring0[2] - state is now member (was (null))
> May 18 23:07:37 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_ipcc_disconnect: qb_ipcc_disconnect()
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section nodes: OK (rc=0, origin=apache-up002.ring0/crmd/6, version=0.67.6)
> May 18 23:07:37 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_close: Closing ringbuffer: /dev/shm/qb-cmap-request-2429-2526-30-header
> May 18 23:07:37 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_close: Closing ringbuffer: /dev/shm/qb-cmap-response-2429-2526-30-header
> May 18 23:07:37 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_close: Closing ringbuffer: /dev/shm/qb-cmap-event-2429-2526-30-header
> May 18 23:07:37 [2526] apache-up001.itc4u.local stonith-ng: info: crm_get_peer: Created entry a64c387e-0d2d-4d97-b606-9667f1ebd365/0x1fdd620 for node apache-up002.ring0/2 (2 total)
> May 18 23:07:37 [2526] apache-up001.itc4u.local stonith-ng: info: crm_get_peer: Node 2 is now known as apache-up002.ring0
> May 18 23:07:37 [2526] apache-up001.itc4u.local stonith-ng: debug: st_peer_update_callback: Broadcasting our uname because of node 2
> May 18 23:07:37 [2526] apache-up001.itc4u.local stonith-ng: info: crm_get_peer: Node 2 has uuid 2
> May 18 23:07:37 [2526] apache-up001.itc4u.local stonith-ng: info: pcmk_cpg_membership: Node 2 still member of group stonith-ng (peer=apache-up002.ring0, counter=1.1)
> May 18 23:07:37 [2526] apache-up001.itc4u.local stonith-ng: info: crm_update_peer_proc: pcmk_cpg_membership: Node apache-up002.ring0[2] - corosync-cpg is now online
> May 18 23:07:37 [2526] apache-up001.itc4u.local stonith-ng: notice: crm_update_peer_state_iter: crm_update_peer_proc: Node apache-up002.ring0[2] - state is now member (was (null))
> May 18 23:07:37 [2526] apache-up001.itc4u.local stonith-ng: debug: st_peer_update_callback: Broadcasting our uname because of node 2
> May 18 23:07:37 [2526] apache-up001.itc4u.local stonith-ng: info: pcmk_cpg_membership: Node 3 joined group stonith-ng (counter=2.0)
> May 18 23:07:37 [2526] apache-up001.itc4u.local stonith-ng: info: pcmk_cpg_membership: Node 1 still member of group stonith-ng (peer=apache-up001.ring0, counter=2.0)
> May 18 23:07:37 [2526] apache-up001.itc4u.local stonith-ng: info: pcmk_cpg_membership: Node 2 still member of group stonith-ng (peer=apache-up002.ring0, counter=2.1)
> May 18 23:07:37 [2528] apache-up001.itc4u.local attrd: info: crm_get_peer: Created entry 377b62be-9eac-46ee-b617-2e4d1e087779/0x1596c20 for node apache-up002.ring0/2 (2 total)
> May 18 23:07:37 [2528] apache-up001.itc4u.local attrd: info: crm_get_peer: Node 2 is now known as apache-up002.ring0
> May 18 23:07:37 [2528] apache-up001.itc4u.local attrd: info: crm_get_peer: Node 2 has uuid 2
> May 18 23:07:37 [2528] apache-up001.itc4u.local attrd: info: pcmk_cpg_membership: Node 2 still member of group attrd (peer=apache-up002.ring0, counter=1.1)
> May 18 23:07:37 [2528] apache-up001.itc4u.local attrd: info: crm_update_peer_proc: pcmk_cpg_membership: Node apache-up002.ring0[2] - corosync-cpg is now online
> May 18 23:07:37 [2528] apache-up001.itc4u.local attrd: notice: crm_update_peer_state_iter: crm_update_peer_proc: Node apache-up002.ring0[2] - state is now member (was (null))
> May 18 23:07:37 [2528] apache-up001.itc4u.local attrd: info: pcmk_cpg_membership: Node 3 joined group attrd (counter=2.0)
> May 18 23:07:37 [2528] apache-up001.itc4u.local attrd: info: pcmk_cpg_membership: Node 1 still member of group attrd (peer=apache-up001.ring0, counter=2.0)
> May 18 23:07:37 [2528] apache-up001.itc4u.local attrd: info: pcmk_cpg_membership: Node 2 still member of group attrd (peer=apache-up002.ring0, counter=2.1)
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.67.6 2
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.67.7 (null)
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=7
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib/status/node_state[@id='1']: @crm-debug-origin=post_cache_update
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up001.ring0/crmd/39, version=0.67.7)
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.67.7 2
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.67.8 (null)
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=8
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/status: <node_state id="3" uname="apache-up003.ring0" crmd="offline" crm-debug-origin="peer_update_callback"/>
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up001.ring0/crmd/40, version=0.67.8)
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section nodes: OK (rc=0, origin=apache-up001.ring0/crmd/44, version=0.67.8)
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.67.8 2
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.67.9 (null)
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=9
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib/status/node_state[@id='3']: @crm-debug-origin=post_cache_update, @in_ccm=true
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up001.ring0/crmd/45, version=0.67.9)
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: pcmk_cpg_membership: Node 3 joined group cib (counter=2.0)
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: pcmk_cpg_membership: Node 1 still member of group cib (peer=apache-up001.ring0, counter=2.0)
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: pcmk_cpg_membership: Node 2 still member of group cib (peer=apache-up002.ring0, counter=2.1)
> May 18 23:07:37 [2528] apache-up001.itc4u.local attrd: info: crm_get_peer: Created entry e470c7c2-643a-446d-baa9-c69059604674/0x1594f70 for node apache-up003.ring0/3 (3 total)
> May 18 23:07:37 [2528] apache-up001.itc4u.local attrd: info: crm_get_peer: Node 3 is now known as apache-up003.ring0
> May 18 23:07:37 [2528] apache-up001.itc4u.local attrd: info: crm_get_peer: Node 3 has uuid 3
> May 18 23:07:37 [2528] apache-up001.itc4u.local attrd: info: pcmk_cpg_membership: Node 3 still member of group attrd (peer=apache-up003.ring0, counter=2.2)
> May 18 23:07:37 [2528] apache-up001.itc4u.local attrd: info: crm_update_peer_proc: pcmk_cpg_membership: Node apache-up003.ring0[3] - corosync-cpg is now online
> May 18 23:07:37 [2528] apache-up001.itc4u.local attrd: notice: crm_update_peer_state_iter: crm_update_peer_proc: Node apache-up003.ring0[3] - state is now member (was (null))
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: crm_get_peer: Created entry a4c57ad6-16fd-4348-ab20-40bcc4ed2a19/0x12758d0 for node apache-up003.ring0/3 (3 total)
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: crm_get_peer: Node 3 is now known as apache-up003.ring0
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: crm_get_peer: Node 3 has uuid 3
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: pcmk_cpg_membership: Node 3 still member of group cib (peer=apache-up003.ring0, counter=2.2)
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: crm_update_peer_proc: pcmk_cpg_membership: Node apache-up003.ring0[3] - corosync-cpg is now online
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: notice: crm_update_peer_state_iter: crm_update_peer_proc: Node apache-up003.ring0[3] - state is now member (was (null))
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section nodes: OK (rc=0, origin=apache-up003.ring0/crmd/6, version=0.67.9)
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section status to master (origin=local/crmd/46)
> May 18 23:07:37 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:1048589; real_size:1052672; rb->word_size:263168
> May 18 23:07:37 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:1048589; real_size:1052672; rb->word_size:263168
> May 18 23:07:37 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:1048589; real_size:1052672; rb->word_size:263168
> May 18 23:07:37 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_ipcc_disconnect: qb_ipcc_disconnect()
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section nodes to master (origin=local/crmd/50)
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section status to master (origin=local/crmd/51)
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section status to master (origin=local/crmd/52)
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section status to master (origin=local/crmd/53)
> May 18 23:07:37 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_close: Closing ringbuffer: /dev/shm/qb-cmap-request-2429-2526-30-header
> May 18 23:07:37 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_close: Closing ringbuffer: /dev/shm/qb-cmap-response-2429-2526-30-header
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.67.9 2
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.67.10 (null)
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=10
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/status: <node_state id="2" uname="apache-up002.ring0" crmd="offline" crm-debug-origin="peer_update_callback"/>
> May 18 23:07:37 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_close: Closing ringbuffer: /dev/shm/qb-cmap-event-2429-2526-30-header
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up001.ring0/crmd/46, version=0.67.10)
> May 18 23:07:37 [2526] apache-up001.itc4u.local stonith-ng: info: crm_get_peer: Created entry 64f2a9e2-d594-4eb2-b2df-891ffb8e2707/0x1fa42d0 for node apache-up003.ring0/3 (3 total)
> May 18 23:07:37 [2526] apache-up001.itc4u.local stonith-ng: info: crm_get_peer: Node 3 is now known as apache-up003.ring0
> May 18 23:07:37 [2526] apache-up001.itc4u.local stonith-ng: debug: st_peer_update_callback: Broadcasting our uname because of node 3
> May 18 23:07:37 [2526] apache-up001.itc4u.local stonith-ng: info: crm_get_peer: Node 3 has uuid 3
> May 18 23:07:37 [2526] apache-up001.itc4u.local stonith-ng: info: pcmk_cpg_membership: Node 3 still member of group stonith-ng (peer=apache-up003.ring0, counter=2.2)
> May 18 23:07:37 [2526] apache-up001.itc4u.local stonith-ng: info: crm_update_peer_proc: pcmk_cpg_membership: Node apache-up003.ring0[3] - corosync-cpg is now online
> May 18 23:07:37 [2526] apache-up001.itc4u.local stonith-ng: notice: crm_update_peer_state_iter: crm_update_peer_proc: Node apache-up003.ring0[3] - state is now member (was (null))
> May 18 23:07:37 [2526] apache-up001.itc4u.local stonith-ng: debug: st_peer_update_callback: Broadcasting our uname because of node 3
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section nodes: OK (rc=0, origin=apache-up001.ring0/crmd/50, version=0.67.10)
> May 18 23:07:37 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.67.7 to 0.67.6
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.67.10 2
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.67.11 (null)
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=11
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib/status/node_state[@id='2']: @crm-debug-origin=post_cache_update, @in_ccm=true
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up001.ring0/crmd/51, version=0.67.11)
> May 18 23:07:37 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.67.8 to 0.67.7
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.67.11 2
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.67.12 (null)
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=12
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib/status/node_state[@id='2']: @crmd=online, @crm-debug-origin=peer_update_callback
> May 18 23:07:37 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.67.9 to 0.67.8
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up001.ring0/crmd/52, version=0.67.12)
> May 18 23:07:37 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.67.10 to 0.67.9
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.67.12 2
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.67.13 (null)
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=13
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib/status/node_state[@id='3']: @crmd=online, @crm-debug-origin=peer_update_callback
> May 18 23:07:37 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.67.11 to 0.67.10
> May 18 23:07:37 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up001.ring0/crmd/53, version=0.67.13)
> May 18 23:07:37 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.67.12 to 0.67.11
> May 18 23:07:37 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.67.13 to 0.67.12
> May 18 23:07:38 [2530] apache-up001.itc4u.local crmd: info: do_dc_join_offer_one: join-2: Processing join_announce request from apache-up002.ring0 in state S_INTEGRATION
> May 18 23:07:38 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_join: do_dc_join_offer_one: Node apache-up002.ring0[2] - join-2 phase 1 -> 0
> May 18 23:07:38 [2530] apache-up001.itc4u.local crmd: info: join_make_offer: join-2: Sending offer to apache-up002.ring0
> May 18 23:07:38 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_join: join_make_offer: Node apache-up002.ring0[2] - join-2 phase 0 -> 1
> May 18 23:07:38 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_join: join_make_offer: Node apache-up001.ring0[1] - join-2 phase 4 -> 0
> May 18 23:07:38 [2530] apache-up001.itc4u.local crmd: info: join_make_offer: join-2: Sending offer to apache-up001.ring0
> May 18 23:07:38 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_join: join_make_offer: Node apache-up001.ring0[1] - join-2 phase 0 -> 1
> May 18 23:07:38 [2530] apache-up001.itc4u.local crmd: info: abort_transition_graph: Transition aborted: Node join (source=do_dc_join_offer_one:250, 1)
> May 18 23:07:38 [2530] apache-up001.itc4u.local crmd: info: do_dc_join_offer_one: join-2: Processing join_announce request from apache-up003.ring0 in state S_INTEGRATION
> May 18 23:07:38 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_join: do_dc_join_offer_one: Node apache-up003.ring0[3] - join-2 phase 1 -> 0
> May 18 23:07:38 [2530] apache-up001.itc4u.local crmd: info: join_make_offer: join-2: Sending offer to apache-up003.ring0
> May 18 23:07:38 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_join: join_make_offer: Node apache-up003.ring0[3] - join-2 phase 0 -> 1
> May 18 23:07:38 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_join: join_make_offer: Node apache-up001.ring0[1] - join-2 phase 1 -> 0
> May 18 23:07:38 [2530] apache-up001.itc4u.local crmd: info: join_make_offer: join-2: Sending offer to apache-up001.ring0
> May 18 23:07:38 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_join: join_make_offer: Node apache-up001.ring0[1] - join-2 phase 0 -> 1
> May 18 23:07:38 [2530] apache-up001.itc4u.local crmd: info: abort_transition_graph: Transition aborted: Node join (source=do_dc_join_offer_one:250, 1)
> May 18 23:07:38 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_join: do_dc_join_filter_offer: Node apache-up001.ring0[1] - join-2 phase 1 -> 2
> May 18 23:07:39 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_join: do_dc_join_filter_offer: Node apache-up002.ring0[2] - join-2 phase 1 -> 2
> May 18 23:07:39 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_expected: do_dc_join_filter_offer: Node apache-up002.ring0[2] - expected state is now member (was (null))
> May 18 23:07:39 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_join: do_dc_join_filter_offer: Node apache-up003.ring0[3] - join-2 phase 1 -> 2
> May 18 23:07:39 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_expected: do_dc_join_filter_offer: Node apache-up003.ring0[3] - expected state is now member (was (null))
> May 18 23:07:39 [2530] apache-up001.itc4u.local crmd: info: do_state_transition: State transition S_INTEGRATION -> S_FINALIZE_JOIN [ input=I_INTEGRATED cause=C_FSA_INTERNAL origin=check_join_state ]
> May 18 23:07:39 [2530] apache-up001.itc4u.local crmd: info: crmd_join_phase_log: join-2: apache-up002.ring0=integrated
> May 18 23:07:39 [2530] apache-up001.itc4u.local crmd: info: crmd_join_phase_log: join-2: apache-up001.ring0=integrated
> May 18 23:07:39 [2530] apache-up001.itc4u.local crmd: info: crmd_join_phase_log: join-2: apache-up003.ring0=integrated
> May 18 23:07:39 [2530] apache-up001.itc4u.local crmd: info: do_dc_join_finalize: join-2: Syncing our CIB to the rest of the cluster
> May 18 23:07:39 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_join: finalize_join_for: Node apache-up002.ring0[2] - join-2 phase 2 -> 3
> May 18 23:07:39 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_join: finalize_join_for: Node apache-up001.ring0[1] - join-2 phase 2 -> 3
> May 18 23:07:39 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_join: finalize_join_for: Node apache-up003.ring0[3] - join-2 phase 2 -> 3
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_process_replace: Digest matched on replace from apache-up001.ring0: 6809c68d3245e7b22b27352e6224110e
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_process_replace: Replaced 0.67.13 with 0.67.13 from apache-up001.ring0
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_replace operation for section 'all': OK (rc=0, origin=apache-up001.ring0/crmd/56, version=0.67.13)
> May 18 23:07:39 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_join: do_dc_join_ack: Node apache-up002.ring0[2] - join-2 phase 3 -> 4
> May 18 23:07:39 [2530] apache-up001.itc4u.local crmd: info: do_dc_join_ack: join-2: Updating node state to member for apache-up002.ring0
> May 18 23:07:39 [2530] apache-up001.itc4u.local crmd: info: erase_status_tag: Deleting xpath: //node_state[@uname='apache-up002.ring0']/lrm
> May 18 23:07:39 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_join: do_dc_join_ack: Node apache-up001.ring0[1] - join-2 phase 3 -> 4
> May 18 23:07:39 [2530] apache-up001.itc4u.local crmd: info: do_dc_join_ack: join-2: Updating node state to member for apache-up001.ring0
> May 18 23:07:39 [2530] apache-up001.itc4u.local crmd: info: erase_status_tag: Deleting xpath: //node_state[@uname='apache-up001.ring0']/lrm
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_delete operation for section //node_state[@uname='apache-up002.ring0']/transient_attributes: OK (rc=0, origin=apache-up002.ring0/crmd/11, version=0.67.13)
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section nodes to master (origin=local/crmd/57)
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section nodes to master (origin=local/crmd/58)
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section nodes to master (origin=local/crmd/59)
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_delete operation for section //node_state[@uname='apache-up002.ring0']/lrm to master (origin=local/crmd/60)
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section status to master (origin=local/crmd/61)
> May 18 23:07:39 [2528] apache-up001.itc4u.local attrd: info: attrd_peer_update: Setting shutdown[apache-up002.ring0]: (null) -> 0 from apache-up002.ring0
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section nodes: OK (rc=0, origin=apache-up001.ring0/crmd/57, version=0.67.13)
> May 18 23:07:39 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_join: do_dc_join_ack: Node apache-up003.ring0[3] - join-2 phase 3 -> 4
> May 18 23:07:39 [2530] apache-up001.itc4u.local crmd: info: do_dc_join_ack: join-2: Updating node state to member for apache-up003.ring0
> May 18 23:07:39 [2530] apache-up001.itc4u.local crmd: info: erase_status_tag: Deleting xpath: //node_state[@uname='apache-up003.ring0']/lrm
> May 18 23:07:39 [2528] apache-up001.itc4u.local attrd: info: write_attribute: Sent update 3 with 2 changes for shutdown, id=<n/a>, set=(null)
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section nodes: OK (rc=0, origin=apache-up001.ring0/crmd/58, version=0.67.13)
> May 18 23:07:39 [2528] apache-up001.itc4u.local attrd: info: attrd_peer_update: Setting shutdown[apache-up003.ring0]: (null) -> 0 from apache-up003.ring0
> May 18 23:07:39 [2528] apache-up001.itc4u.local attrd: info: write_attribute: Write out of 'shutdown' delayed: update 3 in progress
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up002.ring0/attrd/2, version=0.67.13)
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.67.13 2
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.67.14 (null)
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=14
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up002.ring0/attrd/3, version=0.67.14)
> May 18 23:07:39 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.67.14 to 0.67.13
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section nodes: OK (rc=0, origin=apache-up001.ring0/crmd/59, version=0.67.14)
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_delete operation for section //node_state[@uname='apache-up002.ring0']/lrm: OK (rc=0, origin=apache-up001.ring0/crmd/60, version=0.67.14)
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.67.14 2
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.67.15 (null)
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=15
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib/status/node_state[@id='2']: @crm-debug-origin=do_lrm_query_internal, @join=member, @expected=member
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/status/node_state[@id='2']: <lrm id="2"/>
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <lrm_resources/>
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </lrm>
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up001.ring0/crmd/61, version=0.67.15)
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_delete operation for section //node_state[@uname='apache-up001.ring0']/lrm to master (origin=local/crmd/62)
> May 18 23:07:39 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.67.15 to 0.67.14
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section status to master (origin=local/crmd/63)
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_delete operation for section //node_state[@uname='apache-up003.ring0']/lrm to master (origin=local/crmd/64)
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section status to master (origin=local/crmd/65)
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section status to master (origin=local/attrd/3)
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.67.15 2
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.67.16 (null)
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: -- /cib/status/node_state[@id='1']/lrm[@id='1']
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=16
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_delete operation for section //node_state[@uname='apache-up001.ring0']/lrm: OK (rc=0, origin=apache-up001.ring0/crmd/62, version=0.67.16)
> May 18 23:07:39 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.67.16 to 0.67.15
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.67.16 2
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.67.17 (null)
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=17
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib/status/node_state[@id='1']: @crm-debug-origin=do_lrm_query_internal
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/status/node_state[@id='1']: <lrm id="1"/>
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <lrm_resources>
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <lrm_resource id="scsi" type="fence_scsi" class="stonith">
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <lrm_rsc_op id="scsi_last_0" operation_key="scsi_monitor_0" operation="monitor" crm-debug-origin="build_active_RAs" crm_feature_set="3.0.10" transition-key="2:0:7:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d" transition-magic="0:7;2:0:7:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d" on_node="apache-up001.ring0" call-id="5" rc-code="7" op-status="0" interval="0" last-run="1463605301" last-rc-chang
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </lrm_resource>
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </lrm_resources>
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </lrm>
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up001.ring0/crmd/63, version=0.67.17)
> May 18 23:07:39 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.67.17 to 0.67.16
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_delete operation for section //node_state[@uname='apache-up003.ring0']/lrm: OK (rc=0, origin=apache-up001.ring0/crmd/64, version=0.67.17)
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.67.17 2
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.67.18 (null)
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=18
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib/status/node_state[@id='3']: @crm-debug-origin=do_lrm_query_internal, @join=member, @expected=member
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/status/node_state[@id='3']: <lrm id="3"/>
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <lrm_resources/>
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </lrm>
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up001.ring0/crmd/65, version=0.67.18)
> May 18 23:07:39 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.67.18 to 0.67.17
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.67.18 2
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.67.19 (null)
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=19
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/status/node_state[@id='2']: <transient_attributes id="2"/>
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <instance_attributes id="status-2">
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <nvpair id="status-2-shutdown" name="shutdown" value="0"/>
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </instance_attributes>
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </transient_attributes>
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up001.ring0/attrd/3, version=0.67.19)
> May 18 23:07:39 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.67.19 to 0.67.18
> May 18 23:07:39 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 3 for shutdown: OK (0)
> May 18 23:07:39 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 3 for shutdown[apache-up001.ring0]=0: OK (0)
> May 18 23:07:39 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 3 for shutdown[apache-up002.ring0]=0: OK (0)
> May 18 23:07:39 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 3 for shutdown[apache-up003.ring0]=(null): OK (0)
> May 18 23:07:39 [2528] apache-up001.itc4u.local attrd: info: write_attribute: Sent update 4 with 3 changes for shutdown, id=<n/a>, set=(null)
> May 18 23:07:39 [2530] apache-up001.itc4u.local crmd: info: do_state_transition: State transition S_FINALIZE_JOIN -> S_POLICY_ENGINE [ input=I_FINALIZED cause=C_FSA_INTERNAL origin=check_join_state ]
> May 18 23:07:39 [2530] apache-up001.itc4u.local crmd: info: abort_transition_graph: Transition aborted: Peer Cancelled (source=do_te_invoke:161, 1)
> May 18 23:07:39 [2528] apache-up001.itc4u.local attrd: info: attrd_client_refresh: Updating all attributes
> May 18 23:07:39 [2528] apache-up001.itc4u.local attrd: info: write_attribute: Write out of 'shutdown' delayed: update 4 in progress
> May 18 23:07:39 [2528] apache-up001.itc4u.local attrd: info: write_attribute: Sent update 5 with 3 changes for terminate, id=<n/a>, set=(null)
> May 18 23:07:39 [2530] apache-up001.itc4u.local crmd: info: abort_transition_graph: Transition aborted by transient_attributes.2 'create': Transient attribute change (cib=0.67.19, source=abort_unless_down:329, path=/cib/status/node_state[@id='2'], 1)
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section nodes to master (origin=local/crmd/69)
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section status to master (origin=local/crmd/70)
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section status to master (origin=local/attrd/4)
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section status to master (origin=local/attrd/5)
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section cib to master (origin=local/crmd/71)
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section nodes: OK (rc=0, origin=apache-up001.ring0/crmd/69, version=0.67.19)
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.67.19 2
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.67.20 (null)
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=20
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib/status/node_state[@id='1']: @crm-debug-origin=do_state_transition
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib/status/node_state[@id='3']: @crm-debug-origin=do_state_transition
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib/status/node_state[@id='2']: @crm-debug-origin=do_state_transition
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up001.ring0/crmd/70, version=0.67.20)
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.67.20 2
> May 18 23:07:39 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.67.20 to 0.67.19
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.67.21 (null)
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=21
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/status/node_state[@id='3']: <transient_attributes id="3"/>
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <instance_attributes id="status-3">
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <nvpair id="status-3-shutdown" name="shutdown" value="0"/>
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </instance_attributes>
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </transient_attributes>
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up001.ring0/attrd/4, version=0.67.21)
> May 18 23:07:39 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.67.21 to 0.67.20
> May 18 23:07:39 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 4 for shutdown: OK (0)
> May 18 23:07:39 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 4 for shutdown[apache-up001.ring0]=0: OK (0)
> May 18 23:07:39 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 4 for shutdown[apache-up002.ring0]=0: OK (0)
> May 18 23:07:39 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 4 for shutdown[apache-up003.ring0]=0: OK (0)
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.67.21 2
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.67.22 (null)
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=22
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up001.ring0/attrd/5, version=0.67.22)
> May 18 23:07:39 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.67.22 to 0.67.21
> May 18 23:07:39 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 5 for terminate: OK (0)
> May 18 23:07:39 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 5 for terminate[apache-up001.ring0]=(null): OK (0)
> May 18 23:07:39 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 5 for terminate[apache-up002.ring0]=(null): OK (0)
> May 18 23:07:39 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 5 for terminate[apache-up003.ring0]=(null): OK (0)
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section cib: OK (rc=0, origin=apache-up001.ring0/crmd/71, version=0.67.22)
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_file_backup: Archived previous version as /var/lib/pacemaker/cib/cib-5.raw
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_file_write_with_digest: Wrote version 0.67.0 of the CIB to disk (digest: 9adf3399131073db5439f0a2a9aead6f)
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up003.ring0/attrd/2, version=0.67.22)
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.67.22 2
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.67.23 (null)
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=23
> May 18 23:07:39 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.67.23 to 0.67.22
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_file_write_with_digest: Reading cluster configuration file /var/lib/pacemaker/cib/cib.4ioZip (digest: /var/lib/pacemaker/cib/cib.3csJ1Y)
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up003.ring0/attrd/3, version=0.67.23)
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.67.23 2
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.67.24 (null)
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: -- /cib/status/node_state[@id='3']/transient_attributes[@id='3']
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=24
> May 18 23:07:39 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_delete operation for section //node_state[@uname='apache-up003.ring0']/transient_attributes: OK (rc=0, origin=apache-up003.ring0/crmd/11, version=0.67.24)
> May 18 23:07:39 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.67.24 to 0.67.23
> May 18 23:07:40 [2530] apache-up001.itc4u.local crmd: info: abort_transition_graph: Transition aborted by transient_attributes.3 'create': Transient attribute change (cib=0.67.21, source=abort_unless_down:329, path=/cib/status/node_state[@id='3'], 1)
> May 18 23:07:40 [2530] apache-up001.itc4u.local crmd: warning: match_down_event: No match for shutdown action on 3
> May 18 23:07:40 [2530] apache-up001.itc4u.local crmd: info: abort_transition_graph: Transition aborted by deletion of transient_attributes[@id='3']: Transient attribute change (cib=0.67.24, source=abort_unless_down:343, path=/cib/status/node_state[@id='3']/transient_attributes[@id='3'], 1)
> May 18 23:07:40 [2529] apache-up001.itc4u.local pengine: info: determine_online_status: Node apache-up001.ring0 is online
> May 18 23:07:40 [2529] apache-up001.itc4u.local pengine: info: determine_online_status: Node apache-up003.ring0 is online
> May 18 23:07:40 [2529] apache-up001.itc4u.local pengine: info: determine_online_status: Node apache-up002.ring0 is online
> May 18 23:07:40 [2529] apache-up001.itc4u.local pengine: info: native_print: scsi (stonith:fence_scsi): Stopped
> May 18 23:07:40 [2529] apache-up001.itc4u.local pengine: notice: trigger_unfencing: Unfencing apache-up002.ring0: node discovery
> May 18 23:07:40 [2529] apache-up001.itc4u.local pengine: notice: trigger_unfencing: Unfencing apache-up003.ring0: node discovery
> May 18 23:07:40 [2529] apache-up001.itc4u.local pengine: info: RecurringOp: Start recurring monitor (60s) for scsi on apache-up001.ring0
> May 18 23:07:40 [2529] apache-up001.itc4u.local pengine: notice: LogActions: Start scsi (apache-up001.ring0)
> May 18 23:07:40 [2529] apache-up001.itc4u.local pengine: notice: process_pe_message: Calculated Transition 1: /var/lib/pacemaker/pengine/pe-input-123.bz2
> May 18 23:07:40 [2530] apache-up001.itc4u.local crmd: info: do_state_transition: State transition S_POLICY_ENGINE -> S_TRANSITION_ENGINE [ input=I_PE_SUCCESS cause=C_IPC_MESSAGE origin=handle_response ]
> May 18 23:07:40 [2530] apache-up001.itc4u.local crmd: info: do_te_invoke: Processing graph 1 (ref=pe_calc-dc-1463605660-28) derived from /var/lib/pacemaker/pengine/pe-input-123.bz2
> May 18 23:07:40 [2530] apache-up001.itc4u.local crmd: notice: te_fence_node: Executing on fencing operation (5) on apache-up003.ring0 (timeout=60000)
> May 18 23:07:40 [2530] apache-up001.itc4u.local crmd: notice: te_fence_node: Executing on fencing operation (3) on apache-up002.ring0 (timeout=60000)
> May 18 23:07:40 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processing st_fence 94 from crmd.2530 ( 0)
> May 18 23:07:40 [2526] apache-up001.itc4u.local stonith-ng: notice: handle_request: Client crmd.2530.78870da2 wants to fence (on) 'apache-up003.ring0' with device '(any)'
> May 18 23:07:40 [2526] apache-up001.itc4u.local stonith-ng: notice: initiate_remote_stonith_op: Initiating remote operation on for apache-up003.ring0: a495d11f-c17f-4d81-8907-7225015a1f8a (0)
> May 18 23:07:40 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processed st_fence from crmd.2530: Operation now in progress (-115)
> May 18 23:07:40 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processing st_fence 95 from crmd.2530 ( 0)
> May 18 23:07:40 [2526] apache-up001.itc4u.local stonith-ng: notice: handle_request: Client crmd.2530.78870da2 wants to fence (on) 'apache-up002.ring0' with device '(any)'
> May 18 23:07:40 [2526] apache-up001.itc4u.local stonith-ng: notice: initiate_remote_stonith_op: Initiating remote operation on for apache-up002.ring0: 059132ba-c7ea-4430-9973-32c5ab2a42fc (0)
> May 18 23:07:40 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processed st_fence from crmd.2530: Operation now in progress (-115)
> May 18 23:07:40 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processing st_query 0 from apache-up001.ring0 ( 0)
> May 18 23:07:40 [2526] apache-up001.itc4u.local stonith-ng: debug: create_remote_stonith_op: a495d11f-c17f-4d81-8907-7225015a1f8a already exists
> May 18 23:07:40 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query <stonith_command __name__="stonith_command" t="stonith-ng" st_async_id="a495d11f-c17f-4d81-8907-7225015a1f8a" st_op="st_query" st_callid="3" st_callopt="0" st_remote_op="a495d11f-c17f-4d81-8907-7225015a1f8a" st_target="apache-up003.ring0" st_device_action="on" st_origin="apache-up001.ring0" st_clientid="78870da2-902c-4786-8f46-788a0476f5ae" st_clientname="crmd.2530" st_timeout="60" src="apache-up001.ring0"/>
> May 18 23:07:40 [2526] apache-up001.itc4u.local stonith-ng: debug: get_capable_devices: Searching through 1 devices to see what is capable of action (on) for target apache-up003.ring0
> May 18 23:07:40 [2526] apache-up001.itc4u.local stonith-ng: debug: search_devices_record_result: Finished Search. 0 devices can perform action (on) on node apache-up003.ring0
> May 18 23:07:40 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query_capable_device_cb: Found 0 matching devices for 'apache-up003.ring0'
> May 18 23:07:40 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processed st_query from apache-up001.ring0: OK (0)
> May 18 23:07:40 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processing st_query 0 from apache-up001.ring0 ( 0)
> May 18 23:07:40 [2526] apache-up001.itc4u.local stonith-ng: debug: create_remote_stonith_op: 059132ba-c7ea-4430-9973-32c5ab2a42fc already exists
> May 18 23:07:40 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query <stonith_command __name__="stonith_command" t="stonith-ng" st_async_id="059132ba-c7ea-4430-9973-32c5ab2a42fc" st_op="st_query" st_callid="4" st_callopt="0" st_remote_op="059132ba-c7ea-4430-9973-32c5ab2a42fc" st_target="apache-up002.ring0" st_device_action="on" st_origin="apache-up001.ring0" st_clientid="78870da2-902c-4786-8f46-788a0476f5ae" st_clientname="crmd.2530" st_timeout="60" src="apache-up001.ring0"/>
> May 18 23:07:40 [2526] apache-up001.itc4u.local stonith-ng: debug: get_capable_devices: Searching through 1 devices to see what is capable of action (on) for target apache-up002.ring0
> May 18 23:07:40 [2526] apache-up001.itc4u.local stonith-ng: debug: search_devices_record_result: Finished Search. 0 devices can perform action (on) on node apache-up002.ring0
> May 18 23:07:40 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query_capable_device_cb: Found 0 matching devices for 'apache-up002.ring0'
> May 18 23:07:40 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processed st_query from apache-up001.ring0: OK (0)
> May 18 23:07:40 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processing st_query reply 0 from apache-up002.ring0 ( 0)
> May 18 23:07:40 [2526] apache-up001.itc4u.local stonith-ng: info: process_remote_stonith_query: Query result 1 of 3 from apache-up002.ring0 for apache-up003.ring0/on (0 devices) a495d11f-c17f-4d81-8907-7225015a1f8a
> May 18 23:07:40 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processed st_query reply from apache-up002.ring0: OK (0)
> May 18 23:07:40 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processing st_query reply 0 from apache-up001.ring0 ( 0)
> May 18 23:07:40 [2526] apache-up001.itc4u.local stonith-ng: info: process_remote_stonith_query: Query result 2 of 3 from apache-up001.ring0 for apache-up003.ring0/on (0 devices) a495d11f-c17f-4d81-8907-7225015a1f8a
> May 18 23:07:40 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processed st_query reply from apache-up001.ring0: OK (0)
> May 18 23:07:40 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processing st_query reply 0 from apache-up001.ring0 ( 0)
> May 18 23:07:40 [2526] apache-up001.itc4u.local stonith-ng: info: process_remote_stonith_query: Query result 1 of 3 from apache-up001.ring0 for apache-up002.ring0/on (0 devices) 059132ba-c7ea-4430-9973-32c5ab2a42fc
> May 18 23:07:40 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processed st_query reply from apache-up001.ring0: OK (0)
> May 18 23:07:40 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processing st_query reply 0 from apache-up003.ring0 ( 0)
> May 18 23:07:40 [2526] apache-up001.itc4u.local stonith-ng: info: process_remote_stonith_query: Query result 2 of 3 from apache-up003.ring0 for apache-up002.ring0/on (0 devices) 059132ba-c7ea-4430-9973-32c5ab2a42fc
> May 18 23:07:40 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processed st_query reply from apache-up003.ring0: OK (0)
> May 18 23:07:41 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processing st_query reply 0 from apache-up002.ring0 ( 0)
> May 18 23:07:41 [2526] apache-up001.itc4u.local stonith-ng: info: process_remote_stonith_query: Query result 3 of 3 from apache-up002.ring0 for apache-up002.ring0/on (0 devices) 059132ba-c7ea-4430-9973-32c5ab2a42fc
> May 18 23:07:41 [2526] apache-up001.itc4u.local stonith-ng: info: process_remote_stonith_query: All query replies have arrived, continuing (3 expected/3 received)
> May 18 23:07:41 [2526] apache-up001.itc4u.local stonith-ng: notice: stonith_choose_peer: Couldn't find anyone to fence (on) apache-up002.ring0 with any device
> May 18 23:07:41 [2526] apache-up001.itc4u.local stonith-ng: info: call_remote_stonith: Total remote op timeout set to 60 for fencing of node apache-up002.ring0 for crmd.2530.059132ba
> May 18 23:07:41 [2526] apache-up001.itc4u.local stonith-ng: info: call_remote_stonith: None of the 3 peers have devices capable of fencing (on) apache-up002.ring0 for crmd.2530 (0)
> May 18 23:07:41 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processed st_query reply from apache-up002.ring0: OK (0)
> May 18 23:07:41 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processing st_notify reply 0 from apache-up001.ring0 ( 0)
> May 18 23:07:41 [2526] apache-up001.itc4u.local stonith-ng: debug: process_remote_stonith_exec: Marking call to on for apache-up002.ring0 on behalf of crmd.2530 at 059132ba-c7ea-4430-9973-32c5ab2a42fc.apache-u: No such device (-19)
> May 18 23:07:41 [2526] apache-up001.itc4u.local stonith-ng: error: remote_op_done: Operation on of apache-up002.ring0 by <no-one> for crmd.2530 at apache-up001.ring0.059132ba: No such device
> May 18 23:07:41 [2530] apache-up001.itc4u.local crmd: notice: tengine_stonith_callback: Stonith operation 4/3:1:0:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d: No such device (-19)
> May 18 23:07:41 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processed st_notify reply from apache-up001.ring0: OK (0)
> May 18 23:07:41 [2530] apache-up001.itc4u.local crmd: notice: tengine_stonith_callback: Stonith operation 4 for apache-up002.ring0 failed (No such device): aborting transition.
> May 18 23:07:41 [2530] apache-up001.itc4u.local crmd: notice: abort_transition_graph: Transition aborted: Stonith failed (source=tengine_stonith_callback:748, 0)
> May 18 23:07:41 [2530] apache-up001.itc4u.local crmd: error: tengine_stonith_notify: Unfencing of apache-up002.ring0 by <anyone> failed: No such device (-19)
> May 18 23:07:41 [2530] apache-up001.itc4u.local crmd: notice: te_rsc_command: Initiating action 2: monitor scsi_monitor_0 on apache-up002.ring0
> May 18 23:07:41 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processing st_query reply 0 from apache-up003.ring0 ( 0)
> May 18 23:07:41 [2526] apache-up001.itc4u.local stonith-ng: info: process_remote_stonith_query: Query result 3 of 3 from apache-up003.ring0 for apache-up003.ring0/on (0 devices) a495d11f-c17f-4d81-8907-7225015a1f8a
> May 18 23:07:41 [2526] apache-up001.itc4u.local stonith-ng: info: process_remote_stonith_query: All query replies have arrived, continuing (3 expected/3 received)
> May 18 23:07:41 [2526] apache-up001.itc4u.local stonith-ng: notice: stonith_choose_peer: Couldn't find anyone to fence (on) apache-up003.ring0 with any device
> May 18 23:07:41 [2526] apache-up001.itc4u.local stonith-ng: info: call_remote_stonith: Total remote op timeout set to 60 for fencing of node apache-up003.ring0 for crmd.2530.a495d11f
> May 18 23:07:41 [2526] apache-up001.itc4u.local stonith-ng: info: call_remote_stonith: None of the 3 peers have devices capable of fencing (on) apache-up003.ring0 for crmd.2530 (0)
> May 18 23:07:41 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processed st_query reply from apache-up003.ring0: OK (0)
> May 18 23:07:41 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processing st_notify reply 0 from apache-up001.ring0 ( 0)
> May 18 23:07:41 [2526] apache-up001.itc4u.local stonith-ng: debug: process_remote_stonith_exec: Marking call to on for apache-up003.ring0 on behalf of crmd.2530 at a495d11f-c17f-4d81-8907-7225015a1f8a.apache-u: No such device (-19)
> May 18 23:07:41 [2526] apache-up001.itc4u.local stonith-ng: error: remote_op_done: Operation on of apache-up003.ring0 by <no-one> for crmd.2530 at apache-up001.ring0.a495d11f: No such device
> May 18 23:07:41 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processed st_notify reply from apache-up001.ring0: OK (0)
> May 18 23:07:41 [2530] apache-up001.itc4u.local crmd: notice: tengine_stonith_callback: Stonith operation 3/5:1:0:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d: No such device (-19)
> May 18 23:07:41 [2530] apache-up001.itc4u.local crmd: notice: tengine_stonith_callback: Stonith operation 3 for apache-up003.ring0 failed (No such device): aborting transition.
> May 18 23:07:41 [2530] apache-up001.itc4u.local crmd: info: abort_transition_graph: Transition aborted: Stonith failed (source=tengine_stonith_callback:748, 0)
> May 18 23:07:41 [2530] apache-up001.itc4u.local crmd: error: tengine_stonith_notify: Unfencing of apache-up003.ring0 by <anyone> failed: No such device (-19)
> May 18 23:07:41 [2530] apache-up001.itc4u.local crmd: notice: te_rsc_command: Initiating action 4: monitor scsi_monitor_0 on apache-up003.ring0
> May 18 23:07:42 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.67.24 2
> May 18 23:07:42 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.67.25 (null)
> May 18 23:07:42 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=25
> May 18 23:07:42 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib/status/node_state[@id='2']: @crm-debug-origin=do_update_resource
> May 18 23:07:42 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/status/node_state[@id='2']/lrm[@id='2']/lrm_resources: <lrm_resource id="scsi" type="fence_scsi" class="stonith"/>
> May 18 23:07:42 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <lrm_rsc_op id="scsi_last_0" operation_key="scsi_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.10" transition-key="2:1:7:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d" transition-magic="0:7;2:1:7:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d" on_node="apache-up002.ring0" call-id="5" rc-code="7" op-status="0" interval="0" last-run="
> May 18 23:07:42 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </lrm_resource>
> May 18 23:07:42 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up002.ring0/crmd/12, version=0.67.25)
> May 18 23:07:42 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.67.25 to 0.67.24
> May 18 23:07:42 [2530] apache-up001.itc4u.local crmd: info: match_graph_event: Action scsi_monitor_0 (2) confirmed on apache-up002.ring0 (rc=7)
> May 18 23:07:42 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.67.25 2
> May 18 23:07:42 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.67.26 (null)
> May 18 23:07:42 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=26
> May 18 23:07:42 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib/status/node_state[@id='3']: @crm-debug-origin=do_update_resource
> May 18 23:07:42 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/status/node_state[@id='3']/lrm[@id='3']/lrm_resources: <lrm_resource id="scsi" type="fence_scsi" class="stonith"/>
> May 18 23:07:42 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <lrm_rsc_op id="scsi_last_0" operation_key="scsi_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.10" transition-key="4:1:7:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d" transition-magic="0:7;4:1:7:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d" on_node="apache-up003.ring0" call-id="5" rc-code="7" op-status="0" interval="0" last-run="
> May 18 23:07:42 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </lrm_resource>
> May 18 23:07:42 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up003.ring0/crmd/12, version=0.67.26)
> May 18 23:07:42 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.67.26 to 0.67.25
> May 18 23:07:42 [2530] apache-up001.itc4u.local crmd: info: match_graph_event: Action scsi_monitor_0 (4) confirmed on apache-up003.ring0 (rc=7)
> May 18 23:07:42 [2530] apache-up001.itc4u.local crmd: notice: run_graph: Transition 1 (Complete=4, Pending=0, Fired=0, Skipped=1, Incomplete=2, Source=/var/lib/pacemaker/pengine/pe-input-123.bz2): Stopped
> May 18 23:07:42 [2530] apache-up001.itc4u.local crmd: notice: too_many_st_failures: No devices found in cluster to fence apache-up001.ring0, giving up
> May 18 23:07:42 [2530] apache-up001.itc4u.local crmd: info: do_log: FSA: Input I_TE_SUCCESS from notify_crmd() received in state S_TRANSITION_ENGINE
> May 18 23:07:42 [2530] apache-up001.itc4u.local crmd: notice: do_state_transition: State transition S_TRANSITION_ENGINE -> S_IDLE [ input=I_TE_SUCCESS cause=C_FSA_INTERNAL origin=notify_crmd ]
> May 18 23:07:47 [2525] apache-up001.itc4u.local cib: info: cib_process_ping: Reporting our current digest to apache-up001.ring0: 2b2a1e31a51d91d93d47fcf9f05779db for 0.67.26 (0x1081330 0)
> May 18 23:09:04 [2526] apache-up001.itc4u.local stonith-ng: debug: crm_client_new: Connecting 0x2009860 for uid=0 gid=0 pid=6162 id=09628ef2-5c00-4469-bd93-b7bdd0fc0040
> May 18 23:09:04 [2526] apache-up001.itc4u.local stonith-ng: debug: handle_new_connection: IPC credentials authenticated (2526-6162-12)
> May 18 23:09:04 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_ipcs_shm_connect: connecting to client [6162]
> May 18 23:09:04 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:131085; real_size:135168; rb->word_size:33792
> May 18 23:09:04 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:131085; real_size:135168; rb->word_size:33792
> May 18 23:09:04 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:131085; real_size:135168; rb->word_size:33792
> May 18 23:09:04 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processing register 1 from stonith_admin.6162 ( 0)
> May 18 23:09:04 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processed register from stonith_admin.6162: OK (0)
> May 18 23:09:04 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processing st_query 2 from stonith_admin.6162 ( 1002)
> May 18 23:09:04 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query <stonith_command __name__="stonith_command" t="stonith-ng" st_async_id="09628ef2-5c00-4469-bd93-b7bdd0fc0040" st_op="st_query" st_callid="2" st_callopt="4098" st_timeout="120" st_clientid="09628ef2-5c00-4469-bd93-b7bdd0fc0040" st_clientname="stonith_admin.6162" st_clientnode="apache-up001.ring0">
> May 18 23:09:04 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query <st_calldata>
> May 18 23:09:04 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query <st_device_id st_origin="stonith_api_query" st_target="apache-up003.node0" st_device_action="off"/>
> May 18 23:09:04 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query </st_calldata>
> May 18 23:09:04 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query </stonith_command>
> May 18 23:09:04 [2526] apache-up001.itc4u.local stonith-ng: debug: get_capable_devices: Searching through 1 devices to see what is capable of action (off) for target apache-up003.node0
> May 18 23:09:04 [2526] apache-up001.itc4u.local stonith-ng: debug: schedule_stonith_command: Scheduling status on scsi for stonith-ng (timeout=120s)
> May 18 23:09:04 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processed st_query from stonith_admin.6162: OK (0)
> May 18 23:09:04 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_action_create: Initiating action status for agent fence_scsi (target=apache-up003.node0)
> May 18 23:09:04 [2526] apache-up001.itc4u.local stonith-ng: debug: make_args: Performing status action for node 'apache-up003.node0' as 'port=apache-up003.node0'
> May 18 23:09:04 [2526] apache-up001.itc4u.local stonith-ng: debug: internal_stonith_action_execute: forking
> May 18 23:09:04 [2526] apache-up001.itc4u.local stonith-ng: debug: internal_stonith_action_execute: sending args
> May 18 23:09:04 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_device_execute: Operation status for node apache-up003.node0 on scsi now running with pid=6163, timeout=120s
> May 18 23:09:04 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_action_async_done: Child process 6163 performing action 'status' exited with rc 1
> May 18 23:09:04 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[6163] stderr: [ WARNING:root:Parse error: Ignoring unknown option 'port=apache-up003.node0' ]
> May 18 23:09:04 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[6163] stderr: [ ]
> May 18 23:09:04 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[6163] stderr: [ ERROR:root:Failed: unable to parse output of corosync-cmapctl or node does not exist ]
> May 18 23:09:04 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[6163] stderr: [ ]
> May 18 23:09:04 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[6163] stderr: [ Failed: unable to parse output of corosync-cmapctl or node does not exist ]
> May 18 23:09:04 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[6163] stderr: [ ]
> May 18 23:09:04 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[6163] stderr: [ ERROR:root:Please use '-h' for usage ]
> May 18 23:09:04 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[6163] stderr: [ ]
> May 18 23:09:04 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[6163] stderr: [ Please use '-h' for usage ]
> May 18 23:09:04 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[6163] stderr: [ ]
> May 18 23:09:04 [2526] apache-up001.itc4u.local stonith-ng: info: internal_stonith_action_execute: Attempt 2 to execute fence_scsi (status). remaining timeout is 120
> May 18 23:09:04 [2526] apache-up001.itc4u.local stonith-ng: debug: internal_stonith_action_execute: forking
> May 18 23:09:04 [2526] apache-up001.itc4u.local stonith-ng: debug: internal_stonith_action_execute: sending args
> May 18 23:09:04 [2526] apache-up001.itc4u.local stonith-ng: debug: child_waitpid: wait(6172) = 0: Success (0)
> May 18 23:09:05 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_action_async_done: Child process 6172 performing action 'status' exited with rc 1
> May 18 23:09:05 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[6172] stderr: [ WARNING:root:Parse error: Ignoring unknown option 'port=apache-up003.node0' ]
> May 18 23:09:05 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[6172] stderr: [ ]
> May 18 23:09:05 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[6172] stderr: [ ERROR:root:Failed: unable to parse output of corosync-cmapctl or node does not exist ]
> May 18 23:09:05 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[6172] stderr: [ ]
> May 18 23:09:05 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[6172] stderr: [ Failed: unable to parse output of corosync-cmapctl or node does not exist ]
> May 18 23:09:05 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[6172] stderr: [ ]
> May 18 23:09:05 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[6172] stderr: [ ERROR:root:Please use '-h' for usage ]
> May 18 23:09:05 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[6172] stderr: [ ]
> May 18 23:09:05 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[6172] stderr: [ Please use '-h' for usage ]
> May 18 23:09:05 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[6172] stderr: [ ]
> May 18 23:09:05 [2526] apache-up001.itc4u.local stonith-ng: info: update_remaining_timeout: Attempted to execute agent fence_scsi (status) the maximum number of times (2) allowed
> May 18 23:09:05 [2526] apache-up001.itc4u.local stonith-ng: notice: status_search_cb: Unknown result when testing if scsi can fence apache-up003.node0: rc=-201
> May 18 23:09:05 [2526] apache-up001.itc4u.local stonith-ng: debug: search_devices_record_result: Finished Search. 0 devices can perform action (off) on node apache-up003.node0
> May 18 23:09:05 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query_capable_device_cb: Found 0 matching devices for 'apache-up003.node0'
> May 18 23:09:05 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_ipcs_dispatch_connection_request: HUP conn (2526-6162-12)
> May 18 23:09:05 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_ipcs_disconnect: qb_ipcs_disconnect(2526-6162-12) state:2
> May 18 23:09:05 [2526] apache-up001.itc4u.local stonith-ng: debug: crm_client_destroy: Destroying 0 events
> May 18 23:09:05 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_close: Free'ing ringbuffer: /dev/shm/qb-stonith-ng-response-2526-6162-12-header
> May 18 23:09:05 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_close: Free'ing ringbuffer: /dev/shm/qb-stonith-ng-event-2526-6162-12-header
> May 18 23:09:05 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_close: Free'ing ringbuffer: /dev/shm/qb-stonith-ng-request-2526-6162-12-header
> May 18 23:11:12 [2526] apache-up001.itc4u.local stonith-ng: debug: crm_client_new: Connecting 0x2009860 for uid=0 gid=0 pid=6255 id=bdd9828a-3d55-451d-8916-57ca46c3da05
> May 18 23:11:12 [2526] apache-up001.itc4u.local stonith-ng: debug: handle_new_connection: IPC credentials authenticated (2526-6255-12)
> May 18 23:11:12 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_ipcs_shm_connect: connecting to client [6255]
> May 18 23:11:12 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:131085; real_size:135168; rb->word_size:33792
> May 18 23:11:13 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:131085; real_size:135168; rb->word_size:33792
> May 18 23:11:13 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:131085; real_size:135168; rb->word_size:33792
> May 18 23:11:13 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processing register 1 from stonith_admin.6255 ( 0)
> May 18 23:11:13 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processed register from stonith_admin.6255: OK (0)
> May 18 23:11:13 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processing st_query 2 from stonith_admin.6255 ( 1002)
> May 18 23:11:13 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query <stonith_command __name__="stonith_command" t="stonith-ng" st_async_id="bdd9828a-3d55-451d-8916-57ca46c3da05" st_op="st_query" st_callid="2" st_callopt="4098" st_timeout="120" st_clientid="bdd9828a-3d55-451d-8916-57ca46c3da05" st_clientname="stonith_admin.6255" st_clientnode="apache-up001.ring0">
> May 18 23:11:13 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query <st_calldata>
> May 18 23:11:13 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query <st_device_id st_origin="stonith_api_query" st_device_action="off"/>
> May 18 23:11:13 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query </st_calldata>
> May 18 23:11:13 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query </stonith_command>
> May 18 23:11:13 [2526] apache-up001.itc4u.local stonith-ng: debug: get_capable_devices: Searching through 1 devices to see what is capable of action (off) for target <anyone>
> May 18 23:11:13 [2526] apache-up001.itc4u.local stonith-ng: debug: search_devices_record_result: Finished Search. 1 devices can perform action (off) on node <anyone>
> May 18 23:11:13 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query_capable_device_cb: 1 devices installed
> May 18 23:11:13 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processed st_query from stonith_admin.6255: OK (0)
> May 18 23:11:13 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_ipcs_dispatch_connection_request: HUP conn (2526-6255-12)
> May 18 23:11:13 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_ipcs_disconnect: qb_ipcs_disconnect(2526-6255-12) state:2
> May 18 23:11:13 [2526] apache-up001.itc4u.local stonith-ng: debug: crm_client_destroy: Destroying 0 events
> May 18 23:11:13 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_close: Free'ing ringbuffer: /dev/shm/qb-stonith-ng-response-2526-6255-12-header
> May 18 23:11:13 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_close: Free'ing ringbuffer: /dev/shm/qb-stonith-ng-event-2526-6255-12-header
> May 18 23:11:13 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_close: Free'ing ringbuffer: /dev/shm/qb-stonith-ng-request-2526-6255-12-header
> May 18 23:12:23 [2526] apache-up001.itc4u.local stonith-ng: debug: crm_client_new: Connecting 0x2009860 for uid=0 gid=0 pid=6302 id=a15e24c7-4419-4e75-ab92-86e3e1d55992
> May 18 23:12:23 [2526] apache-up001.itc4u.local stonith-ng: debug: handle_new_connection: IPC credentials authenticated (2526-6302-12)
> May 18 23:12:23 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_ipcs_shm_connect: connecting to client [6302]
> May 18 23:12:23 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:131085; real_size:135168; rb->word_size:33792
> May 18 23:12:23 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:131085; real_size:135168; rb->word_size:33792
> May 18 23:12:23 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:131085; real_size:135168; rb->word_size:33792
> May 18 23:12:23 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processing register 1 from stonith_admin.6302 ( 0)
> May 18 23:12:23 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processed register from stonith_admin.6302: OK (0)
> May 18 23:12:23 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processing st_query 2 from stonith_admin.6302 ( 1002)
> May 18 23:12:23 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query <stonith_command __name__="stonith_command" t="stonith-ng" st_async_id="a15e24c7-4419-4e75-ab92-86e3e1d55992" st_op="st_query" st_callid="2" st_callopt="4098" st_timeout="120" st_clientid="a15e24c7-4419-4e75-ab92-86e3e1d55992" st_clientname="stonith_admin.6302" st_clientnode="apache-up001.ring0">
> May 18 23:12:23 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query <st_calldata>
> May 18 23:12:23 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query <st_device_id st_origin="stonith_api_query" st_target="apache-up003.ring0" st_device_action="off"/>
> May 18 23:12:23 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query </st_calldata>
> May 18 23:12:23 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query </stonith_command>
> May 18 23:12:23 [2526] apache-up001.itc4u.local stonith-ng: debug: get_capable_devices: Searching through 1 devices to see what is capable of action (off) for target apache-up003.ring0
> May 18 23:12:23 [2526] apache-up001.itc4u.local stonith-ng: debug: schedule_stonith_command: Scheduling status on scsi for stonith-ng (timeout=120s)
> May 18 23:12:23 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processed st_query from stonith_admin.6302: OK (0)
> May 18 23:12:23 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_action_create: Initiating action status for agent fence_scsi (target=apache-up003.ring0)
> May 18 23:12:23 [2526] apache-up001.itc4u.local stonith-ng: debug: make_args: Performing status action for node 'apache-up003.ring0' as 'port=apache-up003.ring0'
> May 18 23:12:23 [2526] apache-up001.itc4u.local stonith-ng: debug: internal_stonith_action_execute: forking
> May 18 23:12:23 [2526] apache-up001.itc4u.local stonith-ng: debug: internal_stonith_action_execute: sending args
> May 18 23:12:23 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_device_execute: Operation status for node apache-up003.ring0 on scsi now running with pid=6303, timeout=120s
> May 18 23:12:23 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_action_async_done: Child process 6303 performing action 'status' exited with rc 0
> May 18 23:12:23 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[6303] stderr: [ WARNING:root:Parse error: Ignoring unknown option 'port=apache-up003.ring0' ]
> May 18 23:12:23 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[6303] stderr: [ ]
> May 18 23:12:23 [2526] apache-up001.itc4u.local stonith-ng: debug: search_devices_record_result: Finished Search. 1 devices can perform action (off) on node apache-up003.ring0
> May 18 23:12:23 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query_capable_device_cb: Found 1 matching devices for 'apache-up003.ring0'
> May 18 23:12:23 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_ipcs_dispatch_connection_request: HUP conn (2526-6302-12)
> May 18 23:12:23 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_ipcs_disconnect: qb_ipcs_disconnect(2526-6302-12) state:2
> May 18 23:12:23 [2526] apache-up001.itc4u.local stonith-ng: debug: crm_client_destroy: Destroying 0 events
> May 18 23:12:23 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_close: Free'ing ringbuffer: /dev/shm/qb-stonith-ng-response-2526-6302-12-header
> May 18 23:12:23 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_close: Free'ing ringbuffer: /dev/shm/qb-stonith-ng-event-2526-6302-12-header
> May 18 23:12:23 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_close: Free'ing ringbuffer: /dev/shm/qb-stonith-ng-request-2526-6302-12-header
> May 18 23:12:42 [2526] apache-up001.itc4u.local stonith-ng: debug: crm_client_new: Connecting 0x2009860 for uid=0 gid=0 pid=6333 id=1a181520-ad8b-4a2f-a77f-1b5c0b03561a
> May 18 23:12:42 [2526] apache-up001.itc4u.local stonith-ng: debug: handle_new_connection: IPC credentials authenticated (2526-6333-12)
> May 18 23:12:42 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_ipcs_shm_connect: connecting to client [6333]
> May 18 23:12:42 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:131085; real_size:135168; rb->word_size:33792
> May 18 23:12:42 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:131085; real_size:135168; rb->word_size:33792
> May 18 23:12:42 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:131085; real_size:135168; rb->word_size:33792
> May 18 23:12:42 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processing register 1 from stonith_admin.6333 ( 0)
> May 18 23:12:42 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processed register from stonith_admin.6333: OK (0)
> May 18 23:12:42 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processing st_query 2 from stonith_admin.6333 ( 1002)
> May 18 23:12:42 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query <stonith_command __name__="stonith_command" t="stonith-ng" st_async_id="1a181520-ad8b-4a2f-a77f-1b5c0b03561a" st_op="st_query" st_callid="2" st_callopt="4098" st_timeout="120" st_clientid="1a181520-ad8b-4a2f-a77f-1b5c0b03561a" st_clientname="stonith_admin.6333" st_clientnode="apache-up001.ring0">
> May 18 23:12:42 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query <st_calldata>
> May 18 23:12:42 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query <st_device_id st_origin="stonith_api_query" st_target="apache-up002.ring0" st_device_action="off"/>
> May 18 23:12:42 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query </st_calldata>
> May 18 23:12:42 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query </stonith_command>
> May 18 23:12:42 [2526] apache-up001.itc4u.local stonith-ng: debug: get_capable_devices: Searching through 1 devices to see what is capable of action (off) for target apache-up002.ring0
> May 18 23:12:42 [2526] apache-up001.itc4u.local stonith-ng: debug: schedule_stonith_command: Scheduling status on scsi for stonith-ng (timeout=120s)
> May 18 23:12:42 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processed st_query from stonith_admin.6333: OK (0)
> May 18 23:12:42 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_action_create: Initiating action status for agent fence_scsi (target=apache-up002.ring0)
> May 18 23:12:42 [2526] apache-up001.itc4u.local stonith-ng: debug: make_args: Performing status action for node 'apache-up002.ring0' as 'port=apache-up002.ring0'
> May 18 23:12:42 [2526] apache-up001.itc4u.local stonith-ng: debug: internal_stonith_action_execute: forking
> May 18 23:12:42 [2526] apache-up001.itc4u.local stonith-ng: debug: internal_stonith_action_execute: sending args
> May 18 23:12:42 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_device_execute: Operation status for node apache-up002.ring0 on scsi now running with pid=6334, timeout=120s
> May 18 23:12:42 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_action_async_done: Child process 6334 performing action 'status' exited with rc 0
> May 18 23:12:42 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[6334] stderr: [ WARNING:root:Parse error: Ignoring unknown option 'port=apache-up002.ring0' ]
> May 18 23:12:42 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[6334] stderr: [ ]
> May 18 23:12:42 [2526] apache-up001.itc4u.local stonith-ng: debug: search_devices_record_result: Finished Search. 1 devices can perform action (off) on node apache-up002.ring0
> May 18 23:12:42 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query_capable_device_cb: Found 1 matching devices for 'apache-up002.ring0'
> May 18 23:12:42 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_ipcs_dispatch_connection_request: HUP conn (2526-6333-12)
> May 18 23:12:42 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_ipcs_disconnect: qb_ipcs_disconnect(2526-6333-12) state:2
> May 18 23:12:42 [2526] apache-up001.itc4u.local stonith-ng: debug: crm_client_destroy: Destroying 0 events
> May 18 23:12:42 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_close: Free'ing ringbuffer: /dev/shm/qb-stonith-ng-response-2526-6333-12-header
> May 18 23:12:42 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_close: Free'ing ringbuffer: /dev/shm/qb-stonith-ng-event-2526-6333-12-header
> May 18 23:12:42 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_close: Free'ing ringbuffer: /dev/shm/qb-stonith-ng-request-2526-6333-12-header
> May 18 23:12:54 [2526] apache-up001.itc4u.local stonith-ng: debug: crm_client_new: Connecting 0x2009860 for uid=0 gid=0 pid=6354 id=4d687d0f-cbc3-43b9-afc3-e65d4a7a0ca5
> May 18 23:12:54 [2526] apache-up001.itc4u.local stonith-ng: debug: handle_new_connection: IPC credentials authenticated (2526-6354-12)
> May 18 23:12:54 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_ipcs_shm_connect: connecting to client [6354]
> May 18 23:12:54 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:131085; real_size:135168; rb->word_size:33792
> May 18 23:12:54 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:131085; real_size:135168; rb->word_size:33792
> May 18 23:12:54 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:131085; real_size:135168; rb->word_size:33792
> May 18 23:12:54 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processing register 1 from stonith_admin.6354 ( 0)
> May 18 23:12:54 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processed register from stonith_admin.6354: OK (0)
> May 18 23:12:54 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processing st_query 2 from stonith_admin.6354 ( 1002)
> May 18 23:12:54 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query <stonith_command __name__="stonith_command" t="stonith-ng" st_async_id="4d687d0f-cbc3-43b9-afc3-e65d4a7a0ca5" st_op="st_query" st_callid="2" st_callopt="4098" st_timeout="120" st_clientid="4d687d0f-cbc3-43b9-afc3-e65d4a7a0ca5" st_clientname="stonith_admin.6354" st_clientnode="apache-up001.ring0">
> May 18 23:12:54 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query <st_calldata>
> May 18 23:12:54 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query <st_device_id st_origin="stonith_api_query" st_target="apache-up001.ring0" st_device_action="off"/>
> May 18 23:12:54 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query </st_calldata>
> May 18 23:12:54 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query </stonith_command>
> May 18 23:12:54 [2526] apache-up001.itc4u.local stonith-ng: debug: get_capable_devices: Searching through 1 devices to see what is capable of action (off) for target apache-up001.ring0
> May 18 23:12:54 [2526] apache-up001.itc4u.local stonith-ng: debug: schedule_stonith_command: Scheduling status on scsi for stonith-ng (timeout=120s)
> May 18 23:12:54 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processed st_query from stonith_admin.6354: OK (0)
> May 18 23:12:54 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_action_create: Initiating action status for agent fence_scsi (target=apache-up001.ring0)
> May 18 23:12:54 [2526] apache-up001.itc4u.local stonith-ng: debug: make_args: Performing status action for node 'apache-up001.ring0' as 'port=apache-up001.ring0'
> May 18 23:12:54 [2526] apache-up001.itc4u.local stonith-ng: debug: internal_stonith_action_execute: forking
> May 18 23:12:54 [2526] apache-up001.itc4u.local stonith-ng: debug: internal_stonith_action_execute: sending args
> May 18 23:12:54 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_device_execute: Operation status for node apache-up001.ring0 on scsi now running with pid=6355, timeout=120s
> May 18 23:12:54 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_action_async_done: Child process 6355 performing action 'status' exited with rc 0
> May 18 23:12:54 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[6355] stderr: [ WARNING:root:Parse error: Ignoring unknown option 'port=apache-up001.ring0' ]
> May 18 23:12:54 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[6355] stderr: [ ]
> May 18 23:12:54 [2526] apache-up001.itc4u.local stonith-ng: debug: search_devices_record_result: Finished Search. 1 devices can perform action (off) on node apache-up001.ring0
> May 18 23:12:54 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query_capable_device_cb: Found 1 matching devices for 'apache-up001.ring0'
> May 18 23:12:54 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_ipcs_dispatch_connection_request: HUP conn (2526-6354-12)
> May 18 23:12:54 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_ipcs_disconnect: qb_ipcs_disconnect(2526-6354-12) state:2
> May 18 23:12:54 [2526] apache-up001.itc4u.local stonith-ng: debug: crm_client_destroy: Destroying 0 events
> May 18 23:12:54 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_close: Free'ing ringbuffer: /dev/shm/qb-stonith-ng-response-2526-6354-12-header
> May 18 23:12:54 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_close: Free'ing ringbuffer: /dev/shm/qb-stonith-ng-event-2526-6354-12-header
> May 18 23:12:54 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_close: Free'ing ringbuffer: /dev/shm/qb-stonith-ng-request-2526-6354-12-header
> May 18 23:13:02 [2526] apache-up001.itc4u.local stonith-ng: debug: crm_client_new: Connecting 0x2009860 for uid=0 gid=0 pid=6372 id=f3cf1963-7229-4eec-adf9-835f595a4c28
> May 18 23:13:02 [2526] apache-up001.itc4u.local stonith-ng: debug: handle_new_connection: IPC credentials authenticated (2526-6372-12)
> May 18 23:13:02 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_ipcs_shm_connect: connecting to client [6372]
> May 18 23:13:02 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:131085; real_size:135168; rb->word_size:33792
> May 18 23:13:02 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:131085; real_size:135168; rb->word_size:33792
> May 18 23:13:02 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:131085; real_size:135168; rb->word_size:33792
> May 18 23:13:02 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processing register 1 from stonith_admin.6372 ( 0)
> May 18 23:13:02 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processed register from stonith_admin.6372: OK (0)
> May 18 23:13:02 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processing st_query 2 from stonith_admin.6372 ( 1002)
> May 18 23:13:02 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query <stonith_command __name__="stonith_command" t="stonith-ng" st_async_id="f3cf1963-7229-4eec-adf9-835f595a4c28" st_op="st_query" st_callid="2" st_callopt="4098" st_timeout="120" st_clientid="f3cf1963-7229-4eec-adf9-835f595a4c28" st_clientname="stonith_admin.6372" st_clientnode="apache-up001.ring0">
> May 18 23:13:02 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query <st_calldata>
> May 18 23:13:02 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query <st_device_id st_origin="stonith_api_query" st_target="apache-up001.ring1" st_device_action="off"/>
> May 18 23:13:02 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query </st_calldata>
> May 18 23:13:02 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query </stonith_command>
> May 18 23:13:02 [2526] apache-up001.itc4u.local stonith-ng: debug: get_capable_devices: Searching through 1 devices to see what is capable of action (off) for target apache-up001.ring1
> May 18 23:13:02 [2526] apache-up001.itc4u.local stonith-ng: debug: schedule_stonith_command: Scheduling status on scsi for stonith-ng (timeout=120s)
> May 18 23:13:02 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processed st_query from stonith_admin.6372: OK (0)
> May 18 23:13:02 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_action_create: Initiating action status for agent fence_scsi (target=apache-up001.ring1)
> May 18 23:13:02 [2526] apache-up001.itc4u.local stonith-ng: debug: make_args: Performing status action for node 'apache-up001.ring1' as 'port=apache-up001.ring1'
> May 18 23:13:02 [2526] apache-up001.itc4u.local stonith-ng: debug: internal_stonith_action_execute: forking
> May 18 23:13:02 [2526] apache-up001.itc4u.local stonith-ng: debug: internal_stonith_action_execute: sending args
> May 18 23:13:02 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_device_execute: Operation status for node apache-up001.ring1 on scsi now running with pid=6373, timeout=120s
> May 18 23:13:02 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_action_async_done: Child process 6373 performing action 'status' exited with rc 0
> May 18 23:13:02 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[6373] stderr: [ WARNING:root:Parse error: Ignoring unknown option 'port=apache-up001.ring1' ]
> May 18 23:13:02 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[6373] stderr: [ ]
> May 18 23:13:02 [2526] apache-up001.itc4u.local stonith-ng: debug: search_devices_record_result: Finished Search. 1 devices can perform action (off) on node apache-up001.ring1
> May 18 23:13:02 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query_capable_device_cb: Found 1 matching devices for 'apache-up001.ring1'
> May 18 23:13:02 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_ipcs_dispatch_connection_request: HUP conn (2526-6372-12)
> May 18 23:13:02 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_ipcs_disconnect: qb_ipcs_disconnect(2526-6372-12) state:2
> May 18 23:13:02 [2526] apache-up001.itc4u.local stonith-ng: debug: crm_client_destroy: Destroying 0 events
> May 18 23:13:02 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_close: Free'ing ringbuffer: /dev/shm/qb-stonith-ng-response-2526-6372-12-header
> May 18 23:13:02 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_close: Free'ing ringbuffer: /dev/shm/qb-stonith-ng-event-2526-6372-12-header
> May 18 23:13:02 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_close: Free'ing ringbuffer: /dev/shm/qb-stonith-ng-request-2526-6372-12-header
> May 18 23:13:17 [2526] apache-up001.itc4u.local stonith-ng: debug: crm_client_new: Connecting 0x2009860 for uid=0 gid=0 pid=6391 id=1e54d665-4079-4168-a6fc-494b40377d93
> May 18 23:13:17 [2526] apache-up001.itc4u.local stonith-ng: debug: handle_new_connection: IPC credentials authenticated (2526-6391-12)
> May 18 23:13:17 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_ipcs_shm_connect: connecting to client [6391]
> May 18 23:13:17 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:131085; real_size:135168; rb->word_size:33792
> May 18 23:13:17 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:131085; real_size:135168; rb->word_size:33792
> May 18 23:13:17 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:131085; real_size:135168; rb->word_size:33792
> May 18 23:13:17 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processing register 1 from stonith_admin.6391 ( 0)
> May 18 23:13:17 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processed register from stonith_admin.6391: OK (0)
> May 18 23:13:17 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processing st_query 2 from stonith_admin.6391 ( 1002)
> May 18 23:13:17 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query <stonith_command __name__="stonith_command" t="stonith-ng" st_async_id="1e54d665-4079-4168-a6fc-494b40377d93" st_op="st_query" st_callid="2" st_callopt="4098" st_timeout="120" st_clientid="1e54d665-4079-4168-a6fc-494b40377d93" st_clientname="stonith_admin.6391" st_clientnode="apache-up001.ring0">
> May 18 23:13:17 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query <st_calldata>
> May 18 23:13:17 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query <st_device_id st_origin="stonith_api_query" st_target="apache-up002.ring1" st_device_action="off"/>
> May 18 23:13:17 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query </st_calldata>
> May 18 23:13:17 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query </stonith_command>
> May 18 23:13:17 [2526] apache-up001.itc4u.local stonith-ng: debug: get_capable_devices: Searching through 1 devices to see what is capable of action (off) for target apache-up002.ring1
> May 18 23:13:17 [2526] apache-up001.itc4u.local stonith-ng: debug: schedule_stonith_command: Scheduling status on scsi for stonith-ng (timeout=120s)
> May 18 23:13:17 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processed st_query from stonith_admin.6391: OK (0)
> May 18 23:13:17 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_action_create: Initiating action status for agent fence_scsi (target=apache-up002.ring1)
> May 18 23:13:17 [2526] apache-up001.itc4u.local stonith-ng: debug: make_args: Performing status action for node 'apache-up002.ring1' as 'port=apache-up002.ring1'
> May 18 23:13:17 [2526] apache-up001.itc4u.local stonith-ng: debug: internal_stonith_action_execute: forking
> May 18 23:13:17 [2526] apache-up001.itc4u.local stonith-ng: debug: internal_stonith_action_execute: sending args
> May 18 23:13:17 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_device_execute: Operation status for node apache-up002.ring1 on scsi now running with pid=6392, timeout=120s
> May 18 23:13:17 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_action_async_done: Child process 6392 performing action 'status' exited with rc 0
> May 18 23:13:17 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[6392] stderr: [ WARNING:root:Parse error: Ignoring unknown option 'port=apache-up002.ring1' ]
> May 18 23:13:17 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[6392] stderr: [ ]
> May 18 23:13:17 [2526] apache-up001.itc4u.local stonith-ng: debug: search_devices_record_result: Finished Search. 1 devices can perform action (off) on node apache-up002.ring1
> May 18 23:13:17 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query_capable_device_cb: Found 1 matching devices for 'apache-up002.ring1'
> May 18 23:13:17 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_ipcs_dispatch_connection_request: HUP conn (2526-6391-12)
> May 18 23:13:17 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_ipcs_disconnect: qb_ipcs_disconnect(2526-6391-12) state:2
> May 18 23:13:17 [2526] apache-up001.itc4u.local stonith-ng: debug: crm_client_destroy: Destroying 0 events
> May 18 23:13:17 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_close: Free'ing ringbuffer: /dev/shm/qb-stonith-ng-response-2526-6391-12-header
> May 18 23:13:17 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_close: Free'ing ringbuffer: /dev/shm/qb-stonith-ng-event-2526-6391-12-header
> May 18 23:13:17 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_close: Free'ing ringbuffer: /dev/shm/qb-stonith-ng-request-2526-6391-12-header
> May 18 23:13:30 [2526] apache-up001.itc4u.local stonith-ng: debug: crm_client_new: Connecting 0x2009860 for uid=0 gid=0 pid=6424 id=63ab800f-9b9d-413b-8588-80657ce2411d
> May 18 23:13:30 [2526] apache-up001.itc4u.local stonith-ng: debug: handle_new_connection: IPC credentials authenticated (2526-6424-12)
> May 18 23:13:30 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_ipcs_shm_connect: connecting to client [6424]
> May 18 23:13:30 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:131085; real_size:135168; rb->word_size:33792
> May 18 23:13:30 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:131085; real_size:135168; rb->word_size:33792
> May 18 23:13:30 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_open_2: shm size:131085; real_size:135168; rb->word_size:33792
> May 18 23:13:30 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processing register 1 from stonith_admin.6424 ( 0)
> May 18 23:13:30 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processed register from stonith_admin.6424: OK (0)
> May 18 23:13:30 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processing st_query 2 from stonith_admin.6424 ( 1002)
> May 18 23:13:30 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query <stonith_command __name__="stonith_command" t="stonith-ng" st_async_id="63ab800f-9b9d-413b-8588-80657ce2411d" st_op="st_query" st_callid="2" st_callopt="4098" st_timeout="120" st_clientid="63ab800f-9b9d-413b-8588-80657ce2411d" st_clientname="stonith_admin.6424" st_clientnode="apache-up001.ring0">
> May 18 23:13:30 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query <st_calldata>
> May 18 23:13:30 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query <st_device_id st_origin="stonith_api_query" st_target="apache-up003.ring1" st_device_action="off"/>
> May 18 23:13:30 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query </st_calldata>
> May 18 23:13:30 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query: Query </stonith_command>
> May 18 23:13:30 [2526] apache-up001.itc4u.local stonith-ng: debug: get_capable_devices: Searching through 1 devices to see what is capable of action (off) for target apache-up003.ring1
> May 18 23:13:30 [2526] apache-up001.itc4u.local stonith-ng: debug: schedule_stonith_command: Scheduling status on scsi for stonith-ng (timeout=120s)
> May 18 23:13:30 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processed st_query from stonith_admin.6424: OK (0)
> May 18 23:13:30 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_action_create: Initiating action status for agent fence_scsi (target=apache-up003.ring1)
> May 18 23:13:30 [2526] apache-up001.itc4u.local stonith-ng: debug: make_args: Performing status action for node 'apache-up003.ring1' as 'port=apache-up003.ring1'
> May 18 23:13:30 [2526] apache-up001.itc4u.local stonith-ng: debug: internal_stonith_action_execute: forking
> May 18 23:13:30 [2526] apache-up001.itc4u.local stonith-ng: debug: internal_stonith_action_execute: sending args
> May 18 23:13:30 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_device_execute: Operation status for node apache-up003.ring1 on scsi now running with pid=6425, timeout=120s
> May 18 23:13:30 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_action_async_done: Child process 6425 performing action 'status' exited with rc 0
> May 18 23:13:30 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[6425] stderr: [ WARNING:root:Parse error: Ignoring unknown option 'port=apache-up003.ring1' ]
> May 18 23:13:30 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[6425] stderr: [ ]
> May 18 23:13:30 [2526] apache-up001.itc4u.local stonith-ng: debug: search_devices_record_result: Finished Search. 1 devices can perform action (off) on node apache-up003.ring1
> May 18 23:13:30 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_query_capable_device_cb: Found 1 matching devices for 'apache-up003.ring1'
> May 18 23:13:30 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_ipcs_dispatch_connection_request: HUP conn (2526-6424-12)
> May 18 23:13:30 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_ipcs_disconnect: qb_ipcs_disconnect(2526-6424-12) state:2
> May 18 23:13:30 [2526] apache-up001.itc4u.local stonith-ng: debug: crm_client_destroy: Destroying 0 events
> May 18 23:13:30 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_close: Free'ing ringbuffer: /dev/shm/qb-stonith-ng-response-2526-6424-12-header
> May 18 23:13:30 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_close: Free'ing ringbuffer: /dev/shm/qb-stonith-ng-event-2526-6424-12-header
> May 18 23:13:30 [2526] apache-up001.itc4u.local stonith-ng: debug: qb_rb_close: Free'ing ringbuffer: /dev/shm/qb-stonith-ng-request-2526-6424-12-header
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_replace operation for section 'all' to master (origin=local/cibadmin/2)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_replace: Replaced 0.67.26 with 0.71.0 from apache-up001.ring0
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.67.26 2
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.72.0 76bcac36d315cbcb11a682b93b01d023
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @epoch=72, @num_updates=0
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/configuration/resources: <clone id="dlm-clone"/>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <primitive class="ocf" id="dlm" provider="pacemaker" type="controld">
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <instance_attributes id="dlm-instance_attributes"/>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <operations>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <op id="dlm-start-interval-0s" interval="0s" name="start" timeout="90"/>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <op id="dlm-stop-interval-0s" interval="0s" name="stop" timeout="100"/>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <op id="dlm-monitor-interval-120s" interval="120s" name="monitor" on-fail="fence"/>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </operations>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </primitive>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <meta_attributes id="dlm-clone-meta_attributes">
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <nvpair id="dlm-interleave" name="interleave" value="true"/>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <nvpair id="dlm-ordered" name="ordered" value="true"/>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </meta_attributes>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </clone>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/configuration/resources: <clone id="clvmd-clone"/>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <primitive class="ocf" id="clvmd" provider="heartbeat" type="clvm">
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <instance_attributes id="clvmd-instance_attributes"/>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <operations>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <op id="clvmd-start-interval-0s" interval="0s" name="start" timeout="90"/>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <op id="clvmd-stop-interval-0s" interval="0s" name="stop" timeout="90"/>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <op id="clvmd-monitor-interval-120s" interval="120s" name="monitor" on-fail="fence"/>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </operations>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </primitive>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <meta_attributes id="clvmd-clone-meta_attributes">
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <nvpair id="clvmd-interleave" name="interleave" value="true"/>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <nvpair id="clvmd-ordered" name="ordered" value="true"/>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </meta_attributes>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </clone>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/configuration/constraints: <rsc_order first="dlm-clone" first-action="start" id="order-dlm-clone-clvmd-clone-mandatory" then="clvmd-clone" then-action="start"/>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/configuration/constraints: <rsc_colocation id="colocation-clvmd-clone-dlm-clone-INFINITY" rsc="clvmd-clone" score="INFINITY" with-rsc="dlm-clone"/>
> May 18 23:17:34 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.72.0 to 0.67.26
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_replace_notify: Replaced: 0.67.26 -> 0.72.0 from apache-up001.ring0
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_replace operation for section 'all': OK (rc=0, origin=apache-up001.ring0/cibadmin/2, version=0.72.0)
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: abort_transition_graph: Transition aborted by clone.dlm-clone 'create': Non-status change (cib=0.72.0, source=te_update_diff:436, path=/cib/configuration/resources, 1)
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: notice: do_state_transition: State transition S_IDLE -> S_POLICY_ENGINE [ input=I_PE_CALC cause=C_FSA_INTERNAL origin=abort_transition_graph ]
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: do_state_transition: State transition S_POLICY_ENGINE -> S_ELECTION [ input=I_ELECTION cause=C_FSA_INTERNAL origin=do_cib_replaced ]
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: update_dc: Unset DC. Was apache-up001.ring0
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section nodes to master (origin=local/crmd/79)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section status to master (origin=local/crmd/80)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section nodes: OK (rc=0, origin=apache-up001.ring0/crmd/79, version=0.72.0)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.72.0 2
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.72.1 (null)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=1
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib/status/node_state[@id='1']: @crm-debug-origin=do_cib_replaced
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib/status/node_state[@id='3']: @crm-debug-origin=do_cib_replaced
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib/status/node_state[@id='2']: @crm-debug-origin=do_cib_replaced
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up001.ring0/crmd/80, version=0.72.1)
> May 18 23:17:34 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.72.1 to 0.72.0
> May 18 23:17:34 [2528] apache-up001.itc4u.local attrd: notice: attrd_cib_replaced_cb: Updating all attributes after cib_refresh_notify event
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: election_complete: Election election-0 complete
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: election_timeout_popped: Election failed: Declaring ourselves the winner
> May 18 23:17:34 [2528] apache-up001.itc4u.local attrd: info: write_attribute: Sent update 6 with 3 changes for shutdown, id=<n/a>, set=(null)
> May 18 23:17:34 [2528] apache-up001.itc4u.local attrd: info: write_attribute: Sent update 7 with 3 changes for terminate, id=<n/a>, set=(null)
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: do_log: FSA: Input I_ELECTION_DC from election_timeout_popped() received in state S_ELECTION
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: notice: do_state_transition: State transition S_ELECTION -> S_INTEGRATION [ input=I_ELECTION_DC cause=C_TIMER_POPPED origin=election_timeout_popped ]
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: do_dc_takeover: Taking over DC status for this partition
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_master operation for section 'all': OK (rc=0, origin=local/crmd/82, version=0.72.1)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section cib to master (origin=local/crmd/83)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section status to master (origin=local/attrd/6)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section status to master (origin=local/attrd/7)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section crm_config to master (origin=local/crmd/85)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section cib: OK (rc=0, origin=apache-up001.ring0/crmd/83, version=0.72.1)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.72.1 2
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.72.2 (null)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=2
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/status/node_state[@id='3']: <transient_attributes id="3"/>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <instance_attributes id="status-3">
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <nvpair id="status-3-shutdown" name="shutdown" value="0"/>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </instance_attributes>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </transient_attributes>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up001.ring0/attrd/6, version=0.72.2)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.72.2 2
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.72.3 (null)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=3
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up001.ring0/attrd/7, version=0.72.3)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section crm_config: OK (rc=0, origin=apache-up001.ring0/crmd/85, version=0.72.3)
> May 18 23:17:34 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.72.2 to 0.72.1
> May 18 23:17:34 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.72.3 to 0.72.2
> May 18 23:17:34 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 6 for shutdown: OK (0)
> May 18 23:17:34 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 6 for shutdown[apache-up001.ring0]=0: OK (0)
> May 18 23:17:34 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 6 for shutdown[apache-up002.ring0]=0: OK (0)
> May 18 23:17:34 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 6 for shutdown[apache-up003.ring0]=0: OK (0)
> May 18 23:17:34 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 7 for terminate: OK (0)
> May 18 23:17:34 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 7 for terminate[apache-up001.ring0]=(null): OK (0)
> May 18 23:17:34 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 7 for terminate[apache-up002.ring0]=(null): OK (0)
> May 18 23:17:34 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 7 for terminate[apache-up003.ring0]=(null): OK (0)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section crm_config to master (origin=local/crmd/87)
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_join: initialize_join: Node apache-up002.ring0[2] - join-3 phase 4 -> 0
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_join: initialize_join: Node apache-up001.ring0[1] - join-3 phase 4 -> 0
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_join: initialize_join: Node apache-up003.ring0[3] - join-3 phase 4 -> 0
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: join_make_offer: join-3: Sending offer to apache-up002.ring0
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_join: join_make_offer: Node apache-up002.ring0[2] - join-3 phase 0 -> 1
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: join_make_offer: join-3: Sending offer to apache-up001.ring0
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_join: join_make_offer: Node apache-up001.ring0[1] - join-3 phase 0 -> 1
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: join_make_offer: join-3: Sending offer to apache-up003.ring0
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_join: join_make_offer: Node apache-up003.ring0[3] - join-3 phase 0 -> 1
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: do_dc_join_offer_all: join-3: Waiting on 3 outstanding join acks
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: warning: do_log: FSA: Input I_ELECTION_DC from do_election_check() received in state S_INTEGRATION
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_join: initialize_join: Node apache-up002.ring0[2] - join-4 phase 1 -> 0
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_join: initialize_join: Node apache-up001.ring0[1] - join-4 phase 1 -> 0
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_join: initialize_join: Node apache-up003.ring0[3] - join-4 phase 1 -> 0
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: join_make_offer: join-4: Sending offer to apache-up002.ring0
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_join: join_make_offer: Node apache-up002.ring0[2] - join-4 phase 0 -> 1
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: join_make_offer: join-4: Sending offer to apache-up001.ring0
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_join: join_make_offer: Node apache-up001.ring0[1] - join-4 phase 0 -> 1
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: join_make_offer: join-4: Sending offer to apache-up003.ring0
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_join: join_make_offer: Node apache-up003.ring0[3] - join-4 phase 0 -> 1
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: do_dc_join_offer_all: join-4: Waiting on 3 outstanding join acks
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: update_dc: Set DC to apache-up001.ring0 (3.0.10)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section crm_config: OK (rc=0, origin=apache-up001.ring0/crmd/87, version=0.72.3)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section crm_config to master (origin=local/crmd/89)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section crm_config: OK (rc=0, origin=apache-up001.ring0/crmd/89, version=0.72.3)
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_join: do_dc_join_filter_offer: Node apache-up002.ring0[2] - join-4 phase 1 -> 2
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_join: do_dc_join_filter_offer: Node apache-up001.ring0[1] - join-4 phase 1 -> 2
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_file_backup: Archived previous version as /var/lib/pacemaker/cib/cib-6.raw
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_join: do_dc_join_filter_offer: Node apache-up003.ring0[3] - join-4 phase 1 -> 2
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: do_state_transition: State transition S_INTEGRATION -> S_FINALIZE_JOIN [ input=I_INTEGRATED cause=C_FSA_INTERNAL origin=check_join_state ]
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: crmd_join_phase_log: join-4: apache-up002.ring0=integrated
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: crmd_join_phase_log: join-4: apache-up001.ring0=integrated
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: crmd_join_phase_log: join-4: apache-up003.ring0=integrated
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: do_dc_join_finalize: join-4: Syncing our CIB to the rest of the cluster
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_join: finalize_join_for: Node apache-up002.ring0[2] - join-4 phase 2 -> 3
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_join: finalize_join_for: Node apache-up001.ring0[1] - join-4 phase 2 -> 3
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_join: finalize_join_for: Node apache-up003.ring0[3] - join-4 phase 2 -> 3
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_replace: Digest matched on replace from apache-up001.ring0: 671872397ac3b8c8c3ae6321bcfd1033
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_replace: Replaced 0.72.3 with 0.72.3 from apache-up001.ring0
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_replace operation for section 'all': OK (rc=0, origin=apache-up001.ring0/crmd/93, version=0.72.3)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section nodes to master (origin=local/crmd/94)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section nodes to master (origin=local/crmd/95)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section nodes to master (origin=local/crmd/96)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section nodes: OK (rc=0, origin=apache-up001.ring0/crmd/94, version=0.72.3)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section nodes: OK (rc=0, origin=apache-up001.ring0/crmd/95, version=0.72.3)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section nodes: OK (rc=0, origin=apache-up001.ring0/crmd/96, version=0.72.3)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up003.ring0/attrd/4, version=0.72.3)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.72.3 2
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.72.4 (null)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=4
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up003.ring0/attrd/5, version=0.72.4)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up002.ring0/attrd/4, version=0.72.4)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.72.4 2
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.72.5 (null)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=5
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up002.ring0/attrd/5, version=0.72.5)
> May 18 23:17:34 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.72.4 to 0.72.3
> May 18 23:17:34 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.72.5 to 0.72.4
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_join: do_dc_join_ack: Node apache-up003.ring0[3] - join-4 phase 3 -> 4
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: do_dc_join_ack: join-4: Updating node state to member for apache-up003.ring0
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: erase_status_tag: Deleting xpath: //node_state[@uname='apache-up003.ring0']/lrm
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_delete operation for section //node_state[@uname='apache-up003.ring0']/lrm to master (origin=local/crmd/97)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section status to master (origin=local/crmd/98)
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_join: do_dc_join_ack: Node apache-up002.ring0[2] - join-4 phase 3 -> 4
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: do_dc_join_ack: join-4: Updating node state to member for apache-up002.ring0
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: erase_status_tag: Deleting xpath: //node_state[@uname='apache-up002.ring0']/lrm
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.72.5 2
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.72.6 (null)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: -- /cib/status/node_state[@id='3']/lrm[@id='3']
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=6
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: crm_update_peer_join: do_dc_join_ack: Node apache-up001.ring0[1] - join-4 phase 3 -> 4
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: do_dc_join_ack: join-4: Updating node state to member for apache-up001.ring0
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: erase_status_tag: Deleting xpath: //node_state[@uname='apache-up001.ring0']/lrm
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_delete operation for section //node_state[@uname='apache-up003.ring0']/lrm: OK (rc=0, origin=apache-up001.ring0/crmd/97, version=0.72.6)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_file_write_with_digest: Wrote version 0.72.0 of the CIB to disk (digest: 564dd34a1d240aada3f793d621e91aff)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.72.6 2
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.72.7 (null)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=7
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib/status/node_state[@id='3']: @crm-debug-origin=do_lrm_query_internal
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/status/node_state[@id='3']: <lrm id="3"/>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <lrm_resources>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <lrm_resource id="scsi" type="fence_scsi" class="stonith">
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <lrm_rsc_op id="scsi_last_0" operation_key="scsi_monitor_0" operation="monitor" crm-debug-origin="build_active_RAs" crm_feature_set="3.0.10" transition-key="4:1:7:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d" transition-magic="0:7;4:1:7:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d" on_node="apache-up003.ring0" call-id="5" rc-code="7" op-status="0" interval="0" last-run="1463605661" last-rc-chang
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </lrm_resource>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </lrm_resources>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </lrm>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up001.ring0/crmd/98, version=0.72.7)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_delete operation for section //node_state[@uname='apache-up002.ring0']/lrm to master (origin=local/crmd/99)
> May 18 23:17:34 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.72.6 to 0.72.5
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section status to master (origin=local/crmd/100)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_delete operation for section //node_state[@uname='apache-up001.ring0']/lrm to master (origin=local/crmd/101)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section status to master (origin=local/crmd/102)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.72.7 2
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.72.8 (null)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: -- /cib/status/node_state[@id='2']/lrm[@id='2']
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=8
> May 18 23:17:34 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.72.7 to 0.72.6
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_delete operation for section //node_state[@uname='apache-up002.ring0']/lrm: OK (rc=0, origin=apache-up001.ring0/crmd/99, version=0.72.8)
> May 18 23:17:34 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.72.8 to 0.72.7
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.72.8 2
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.72.9 (null)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=9
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib/status/node_state[@id='2']: @crm-debug-origin=do_lrm_query_internal
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/status/node_state[@id='2']: <lrm id="2"/>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <lrm_resources>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <lrm_resource id="scsi" type="fence_scsi" class="stonith">
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <lrm_rsc_op id="scsi_last_0" operation_key="scsi_monitor_0" operation="monitor" crm-debug-origin="build_active_RAs" crm_feature_set="3.0.10" transition-key="2:1:7:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d" transition-magic="0:7;2:1:7:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d" on_node="apache-up002.ring0" call-id="5" rc-code="7" op-status="0" interval="0" last-run="1463605661" last-rc-chang
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </lrm_resource>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </lrm_resources>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </lrm>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_file_write_with_digest: Reading cluster configuration file /var/lib/pacemaker/cib/cib.6auAXf (digest: /var/lib/pacemaker/cib/cib.NFuvlT)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up001.ring0/crmd/100, version=0.72.9)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.72.9 2
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.72.10 (null)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: -- /cib/status/node_state[@id='1']/lrm[@id='1']
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=10
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_delete operation for section //node_state[@uname='apache-up001.ring0']/lrm: OK (rc=0, origin=apache-up001.ring0/crmd/101, version=0.72.10)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.72.10 2
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.72.11 (null)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=11
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib/status/node_state[@id='1']: @crm-debug-origin=do_lrm_query_internal
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/status/node_state[@id='1']: <lrm id="1"/>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <lrm_resources>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <lrm_resource id="scsi" type="fence_scsi" class="stonith">
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <lrm_rsc_op id="scsi_last_0" operation_key="scsi_monitor_0" operation="monitor" crm-debug-origin="build_active_RAs" crm_feature_set="3.0.10" transition-key="2:0:7:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d" transition-magic="0:7;2:0:7:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d" on_node="apache-up001.ring0" call-id="5" rc-code="7" op-status="0" interval="0" last-run="1463605301" last-rc-chang
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </lrm_resource>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </lrm_resources>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </lrm>
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: do_state_transition: State transition S_FINALIZE_JOIN -> S_POLICY_ENGINE [ input=I_FINALIZED cause=C_FSA_INTERNAL origin=check_join_state ]
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up001.ring0/crmd/102, version=0.72.11)
> May 18 23:17:34 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.72.9 to 0.72.8
> May 18 23:17:34 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.72.10 to 0.72.9
> May 18 23:17:34 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.72.11 to 0.72.10
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: abort_transition_graph: Transition aborted: Peer Cancelled (source=do_te_invoke:161, 1)
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: abort_transition_graph: Transition aborted by scsi_monitor_0 'create' on apache-up001.ring0: Old event (magic=0:7;2:0:7:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d, cib=0.72.11, source=process_graph_event:605, 1)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section nodes to master (origin=local/crmd/106)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section status to master (origin=local/crmd/107)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section cib to master (origin=local/crmd/108)
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: process_graph_event: Detected action (0.2) scsi_monitor_0.5=not running: arrived really late
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section nodes: OK (rc=0, origin=apache-up001.ring0/crmd/106, version=0.72.11)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.72.11 2
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.72.12 (null)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=12
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib/status/node_state[@id='1']: @crm-debug-origin=do_state_transition
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib/status/node_state[@id='3']: @crm-debug-origin=do_state_transition
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib/status/node_state[@id='2']: @crm-debug-origin=do_state_transition
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up001.ring0/crmd/107, version=0.72.12)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section cib: OK (rc=0, origin=apache-up001.ring0/crmd/108, version=0.72.12)
> May 18 23:17:34 [2528] apache-up001.itc4u.local attrd: info: attrd_client_refresh: Updating all attributes
> May 18 23:17:34 [2528] apache-up001.itc4u.local attrd: info: write_attribute: Sent update 8 with 3 changes for shutdown, id=<n/a>, set=(null)
> May 18 23:17:34 [2528] apache-up001.itc4u.local attrd: info: write_attribute: Sent update 9 with 3 changes for terminate, id=<n/a>, set=(null)
> May 18 23:17:34 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.72.12 to 0.72.11
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section status to master (origin=local/attrd/8)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section status to master (origin=local/attrd/9)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up001.ring0/attrd/8, version=0.72.12)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.72.12 2
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.72.13 (null)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=13
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up001.ring0/attrd/9, version=0.72.13)
> May 18 23:17:34 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 8 for shutdown: OK (0)
> May 18 23:17:34 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 8 for shutdown[apache-up001.ring0]=0: OK (0)
> May 18 23:17:34 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 8 for shutdown[apache-up002.ring0]=0: OK (0)
> May 18 23:17:34 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 8 for shutdown[apache-up003.ring0]=0: OK (0)
> May 18 23:17:34 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 9 for terminate: OK (0)
> May 18 23:17:34 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 9 for terminate[apache-up001.ring0]=(null): OK (0)
> May 18 23:17:34 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 9 for terminate[apache-up002.ring0]=(null): OK (0)
> May 18 23:17:34 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 9 for terminate[apache-up003.ring0]=(null): OK (0)
> May 18 23:17:34 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.72.13 to 0.72.12
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_file_backup: Archived previous version as /var/lib/pacemaker/cib/cib-7.raw
> May 18 23:17:34 [2529] apache-up001.itc4u.local pengine: warning: common_unpack: dlm-clone requires (un)fencing but fencing is disabled
> May 18 23:17:34 [2529] apache-up001.itc4u.local pengine: warning: common_unpack: dlm:0 requires (un)fencing but fencing is disabled
> May 18 23:17:34 [2529] apache-up001.itc4u.local pengine: warning: common_unpack: dlm:1 requires (un)fencing but fencing is disabled
> May 18 23:17:34 [2529] apache-up001.itc4u.local pengine: warning: common_unpack: dlm:2 requires (un)fencing but fencing is disabled
> May 18 23:17:34 [2529] apache-up001.itc4u.local pengine: warning: common_unpack: clvmd-clone requires (un)fencing but fencing is disabled
> May 18 23:17:34 [2529] apache-up001.itc4u.local pengine: warning: common_unpack: clvmd:0 requires (un)fencing but fencing is disabled
> May 18 23:17:34 [2529] apache-up001.itc4u.local pengine: warning: common_unpack: clvmd:1 requires (un)fencing but fencing is disabled
> May 18 23:17:34 [2529] apache-up001.itc4u.local pengine: warning: common_unpack: clvmd:2 requires (un)fencing but fencing is disabled
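[Editor's note: the warnings above show the cluster still had stonith-enabled=false when this transition was computed. With fencing disabled, the scsi device gets registered but the "on" (unfence) action that writes the reservation key is never scheduled, so everything that "requires (un)fencing" stays broken. A minimal check/fix, assuming the pcs CLI is in use as elsewhere in this thread:]

```shell
# Check the current value of the cluster-wide fencing switch
pcs property show stonith-enabled

# If it reports stonith-enabled=false, re-enable it so resources
# marked "requires unfencing" (dlm, clvmd, the scsi device) can start
pcs property set stonith-enabled=true
```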
> May 18 23:17:34 [2529] apache-up001.itc4u.local pengine: info: determine_online_status: Node apache-up001.ring0 is online
> May 18 23:17:34 [2529] apache-up001.itc4u.local pengine: info: determine_online_status: Node apache-up003.ring0 is online
> May 18 23:17:34 [2529] apache-up001.itc4u.local pengine: info: determine_online_status: Node apache-up002.ring0 is online
> May 18 23:17:34 [2529] apache-up001.itc4u.local pengine: info: native_print: scsi (stonith:fence_scsi): Stopped
> May 18 23:17:34 [2529] apache-up001.itc4u.local pengine: info: clone_print: Clone Set: dlm-clone [dlm]
> May 18 23:17:34 [2529] apache-up001.itc4u.local pengine: info: short_print: Stopped: [ apache-up001.ring0 apache-up002.ring0 apache-up003.ring0 ]
> May 18 23:17:34 [2529] apache-up001.itc4u.local pengine: info: clone_print: Clone Set: clvmd-clone [clvmd]
> May 18 23:17:34 [2529] apache-up001.itc4u.local pengine: info: short_print: Stopped: [ apache-up001.ring0 apache-up002.ring0 apache-up003.ring0 ]
> May 18 23:17:34 [2529] apache-up001.itc4u.local pengine: info: RecurringOp: Start recurring monitor (60s) for scsi on apache-up001.ring0
> May 18 23:17:34 [2529] apache-up001.itc4u.local pengine: error: unpack_operation: Specifying on_fail=fence and stonith-enabled=false makes no sense
> May 18 23:17:34 [2529] apache-up001.itc4u.local pengine: info: RecurringOp: Start recurring monitor (120s) for dlm:0 on apache-up002.ring0
> May 18 23:17:34 [2529] apache-up001.itc4u.local pengine: error: unpack_operation: Specifying on_fail=fence and stonith-enabled=false makes no sense
> May 18 23:17:34 [2529] apache-up001.itc4u.local pengine: info: RecurringOp: Start recurring monitor (120s) for dlm:1 on apache-up003.ring0
> May 18 23:17:34 [2529] apache-up001.itc4u.local pengine: error: unpack_operation: Specifying on_fail=fence and stonith-enabled=false makes no sense
> May 18 23:17:34 [2529] apache-up001.itc4u.local pengine: info: RecurringOp: Start recurring monitor (120s) for dlm:2 on apache-up001.ring0
> May 18 23:17:34 [2529] apache-up001.itc4u.local pengine: error: unpack_operation: Specifying on_fail=fence and stonith-enabled=false makes no sense
> May 18 23:17:34 [2529] apache-up001.itc4u.local pengine: info: RecurringOp: Start recurring monitor (120s) for clvmd:0 on apache-up002.ring0
> May 18 23:17:34 [2529] apache-up001.itc4u.local pengine: error: unpack_operation: Specifying on_fail=fence and stonith-enabled=false makes no sense
> May 18 23:17:34 [2529] apache-up001.itc4u.local pengine: info: RecurringOp: Start recurring monitor (120s) for clvmd:1 on apache-up003.ring0
> May 18 23:17:34 [2529] apache-up001.itc4u.local pengine: error: unpack_operation: Specifying on_fail=fence and stonith-enabled=false makes no sense
> May 18 23:17:34 [2529] apache-up001.itc4u.local pengine: info: RecurringOp: Start recurring monitor (120s) for clvmd:2 on apache-up001.ring0
> May 18 23:17:34 [2529] apache-up001.itc4u.local pengine: notice: LogActions: Start scsi (apache-up001.ring0)
> May 18 23:17:34 [2529] apache-up001.itc4u.local pengine: notice: LogActions: Start dlm:0 (apache-up002.ring0)
> May 18 23:17:34 [2529] apache-up001.itc4u.local pengine: notice: LogActions: Start dlm:1 (apache-up003.ring0)
> May 18 23:17:34 [2529] apache-up001.itc4u.local pengine: notice: LogActions: Start dlm:2 (apache-up001.ring0)
> May 18 23:17:34 [2529] apache-up001.itc4u.local pengine: notice: LogActions: Start clvmd:0 (apache-up002.ring0)
> May 18 23:17:34 [2529] apache-up001.itc4u.local pengine: notice: LogActions: Start clvmd:1 (apache-up003.ring0)
> May 18 23:17:34 [2529] apache-up001.itc4u.local pengine: notice: LogActions: Start clvmd:2 (apache-up001.ring0)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_file_write_with_digest: Wrote version 0.72.0 of the CIB to disk (digest: 564dd34a1d240aada3f793d621e91aff)
> May 18 23:17:34 [2529] apache-up001.itc4u.local pengine: notice: process_pe_message: Calculated Transition 2: /var/lib/pacemaker/pengine/pe-input-124.bz2
> May 18 23:17:34 [2529] apache-up001.itc4u.local pengine: notice: process_pe_message: Configuration ERRORs found during PE processing. Please run "crm_verify -L" to identify issues.
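[Editor's note: the policy engine's own suggestion above is worth following; a verbose run prints each configuration error with context, for example:]

```shell
# Validate the live CIB; -V adds detail for each error found
crm_verify -L -V
```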
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: do_state_transition: State transition S_POLICY_ENGINE -> S_TRANSITION_ENGINE [ input=I_PE_SUCCESS cause=C_IPC_MESSAGE origin=handle_response ]
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: do_te_invoke: Processing graph 2 (ref=pe_calc-dc-1463606254-44) derived from /var/lib/pacemaker/pengine/pe-input-124.bz2
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: notice: te_rsc_command: Initiating action 8: start scsi_start_0 on apache-up001.ring0 (local)
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: do_lrm_rsc_op: Performing key=8:2:0:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d op=scsi_start_0
> May 18 23:17:34 [2527] apache-up001.itc4u.local lrmd: info: log_execute: executing - rsc:scsi action:start call_id:6
> May 18 23:17:34 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processing st_device_register 3 from lrmd.2527 ( 1000)
> May 18 23:17:34 [2526] apache-up001.itc4u.local stonith-ng: info: build_device_from_xml: The fencing device 'scsi' requires unfencing
> May 18 23:17:34 [2526] apache-up001.itc4u.local stonith-ng: info: build_device_from_xml: The fencing device 'scsi' requires actions (on) to be executed on the target node
> May 18 23:17:34 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_device_register: Device 'scsi' already existed in device list (1 active devices)
> May 18 23:17:34 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processed st_device_register from lrmd.2527: OK (0)
> May 18 23:17:34 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processing st_execute 4 from lrmd.2527 ( 0)
> May 18 23:17:34 [2526] apache-up001.itc4u.local stonith-ng: debug: schedule_stonith_command: Scheduling monitor on scsi for 2ae02530-4f1c-4c84-8ad5-a40e1acdc866 (timeout=20s)
> May 18 23:17:34 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processed st_execute from lrmd.2527: Operation now in progress (-115)
> May 18 23:17:34 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_action_create: Initiating action monitor for agent fence_scsi (target=(null))
> May 18 23:17:34 [2526] apache-up001.itc4u.local stonith-ng: debug: internal_stonith_action_execute: forking
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: notice: te_rsc_command: Initiating action 4: monitor dlm:0_monitor_0 on apache-up002.ring0
> May 18 23:17:34 [2526] apache-up001.itc4u.local stonith-ng: debug: internal_stonith_action_execute: sending args
> May 18 23:17:34 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_device_execute: Operation monitor on scsi now running with pid=6643, timeout=20s
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_file_write_with_digest: Reading cluster configuration file /var/lib/pacemaker/cib/cib.Wu4Rhk (digest: /var/lib/pacemaker/cib/cib.4bJcIX)
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: notice: te_rsc_command: Initiating action 6: monitor dlm:1_monitor_0 on apache-up003.ring0
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: notice: te_rsc_command: Initiating action 2: monitor dlm:2_monitor_0 on apache-up001.ring0 (local)
> May 18 23:17:34 [2527] apache-up001.itc4u.local lrmd: info: process_lrmd_get_rsc_info: Resource 'dlm' not found (1 active resources)
> May 18 23:17:34 [2527] apache-up001.itc4u.local lrmd: info: process_lrmd_get_rsc_info: Resource 'dlm:2' not found (1 active resources)
> May 18 23:17:34 [2527] apache-up001.itc4u.local lrmd: info: process_lrmd_rsc_register: Added 'dlm' to the rsc list (2 active resources)
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: do_lrm_rsc_op: Performing key=2:2:7:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d op=dlm_monitor_0
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: notice: te_rsc_command: Initiating action 5: monitor clvmd:0_monitor_0 on apache-up002.ring0
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: notice: te_rsc_command: Initiating action 7: monitor clvmd:1_monitor_0 on apache-up003.ring0
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.72.13 2
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.72.14 (null)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=14
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib/status/node_state[@id='3']: @crm-debug-origin=do_update_resource
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/status/node_state[@id='3']/lrm[@id='3']/lrm_resources: <lrm_resource id="dlm" type="controld" class="ocf" provider="pacemaker"/>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <lrm_rsc_op id="dlm_last_0" operation_key="dlm_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.10" transition-key="6:2:7:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d" transition-magic="0:7;6:2:7:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d" on_node="apache-up003.ring0" call-id="10" rc-code="7" op-status="0" interval="0" last-run="1
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </lrm_resource>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up003.ring0/crmd/16, version=0.72.14)
> May 18 23:17:34 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.72.14 to 0.72.13
> May 18 23:17:34 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_action_async_done: Child process 6643 performing action 'monitor' exited with rc 1
> May 18 23:17:34 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[6643] stderr: [ Failed: Cannot open file "/var/run/cluster/fence_scsi.key" ]
> May 18 23:17:34 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[6643] stderr: [ ]
> May 18 23:17:34 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[6643] stderr: [ Please use '-h' for usage ]
> May 18 23:17:34 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[6643] stderr: [ ]
> May 18 23:17:34 [2526] apache-up001.itc4u.local stonith-ng: info: internal_stonith_action_execute: Attempt 2 to execute fence_scsi (monitor). remaining timeout is 20
> May 18 23:17:34 [2526] apache-up001.itc4u.local stonith-ng: debug: internal_stonith_action_execute: forking
> May 18 23:17:34 [2526] apache-up001.itc4u.local stonith-ng: debug: internal_stonith_action_execute: sending args
> May 18 23:17:34 [2526] apache-up001.itc4u.local stonith-ng: debug: child_waitpid: wait(6673) = 0: Success (0)
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: action_synced_wait: Managed controld_meta-data_0 process 6666 exited with rc=0
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section status to master (origin=local/crmd/111)
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: notice: process_lrm_event: Operation dlm_monitor_0: not running (node=apache-up001.ring0, call=11, rc=7, cib-update=111, confirmed=true)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.72.14 2
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.72.15 (null)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=15
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib/status/node_state[@id='1']: @crm-debug-origin=do_update_resource
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/status/node_state[@id='1']/lrm[@id='1']/lrm_resources: <lrm_resource id="dlm" type="controld" class="ocf" provider="pacemaker"/>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <lrm_rsc_op id="dlm_last_0" operation_key="dlm_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.10" transition-key="2:2:7:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d" transition-magic="0:7;2:2:7:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d" on_node="apache-up001.ring0" call-id="11" rc-code="7" op-status="0" interval="0" last-run="1
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </lrm_resource>
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: match_graph_event: Action dlm_monitor_0 (6) confirmed on apache-up003.ring0 (rc=7)
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: match_graph_event: Action dlm_monitor_0 (2) confirmed on apache-up001.ring0 (rc=7)
> May 18 23:17:34 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.72.15 to 0.72.14
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: notice: te_rsc_command: Initiating action 3: monitor clvmd:2_monitor_0 on apache-up001.ring0 (local)
> May 18 23:17:34 [2527] apache-up001.itc4u.local lrmd: info: process_lrmd_get_rsc_info: Resource 'clvmd' not found (2 active resources)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up001.ring0/crmd/111, version=0.72.15)
> May 18 23:17:34 [2527] apache-up001.itc4u.local lrmd: info: process_lrmd_get_rsc_info: Resource 'clvmd:2' not found (2 active resources)
> May 18 23:17:34 [2527] apache-up001.itc4u.local lrmd: info: process_lrmd_rsc_register: Added 'clvmd' to the rsc list (3 active resources)
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: do_lrm_rsc_op: Performing key=3:2:7:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d op=clvmd_monitor_0
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.72.15 2
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.72.16 (null)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=16
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib/status/node_state[@id='2']: @crm-debug-origin=do_update_resource
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/status/node_state[@id='2']/lrm[@id='2']/lrm_resources: <lrm_resource id="dlm" type="controld" class="ocf" provider="pacemaker"/>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <lrm_rsc_op id="dlm_last_0" operation_key="dlm_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.10" transition-key="4:2:7:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d" transition-magic="0:7;4:2:7:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d" on_node="apache-up002.ring0" call-id="10" rc-code="7" op-status="0" interval="0" last-run="1
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </lrm_resource>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up002.ring0/crmd/16, version=0.72.16)
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: match_graph_event: Action dlm_monitor_0 (4) confirmed on apache-up002.ring0 (rc=7)
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: notice: te_rsc_command: Initiating action 10: start dlm:0_start_0 on apache-up002.ring0
> May 18 23:17:34 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.72.16 to 0.72.15
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.72.16 2
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.72.17 (null)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=17
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/status/node_state[@id='3']/lrm[@id='3']/lrm_resources: <lrm_resource id="clvmd" type="clvm" class="ocf" provider="heartbeat"/>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <lrm_rsc_op id="clvmd_last_0" operation_key="clvmd_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.10" transition-key="7:2:7:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d" transition-magic="0:7;7:2:7:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d" on_node="apache-up003.ring0" call-id="15" rc-code="7" op-status="0" interval="0" last-ru
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </lrm_resource>
> May 18 23:17:34 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.72.17 to 0.72.16
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: match_graph_event: Action clvmd_monitor_0 (7) confirmed on apache-up003.ring0 (rc=7)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up003.ring0/crmd/17, version=0.72.17)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.72.17 2
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.72.18 (null)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=18
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/status/node_state[@id='2']/lrm[@id='2']/lrm_resources: <lrm_resource id="clvmd" type="clvm" class="ocf" provider="heartbeat"/>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <lrm_rsc_op id="clvmd_last_0" operation_key="clvmd_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.10" transition-key="5:2:7:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d" transition-magic="0:7;5:2:7:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d" on_node="apache-up002.ring0" call-id="15" rc-code="7" op-status="0" interval="0" last-ru
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </lrm_resource>
> May 18 23:17:34 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.72.18 to 0.72.17
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up002.ring0/crmd/17, version=0.72.18)
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: match_graph_event: Action clvmd_monitor_0 (5) confirmed on apache-up002.ring0 (rc=7)
> clvm(clvmd)[6674]: 2016/05/18_23:17:34 INFO: clvmd is not running
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: action_synced_wait: Managed clvm_meta-data_0 process 6703 exited with rc=0
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: notice: process_lrm_event: Operation clvmd_monitor_0: not running (node=apache-up001.ring0, call=16, rc=7, cib-update=112, confirmed=true)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section status to master (origin=local/crmd/112)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.72.18 2
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.72.19 (null)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=19
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/status/node_state[@id='1']/lrm[@id='1']/lrm_resources: <lrm_resource id="clvmd" type="clvm" class="ocf" provider="heartbeat"/>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <lrm_rsc_op id="clvmd_last_0" operation_key="clvmd_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.10" transition-key="3:2:7:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d" transition-magic="0:7;3:2:7:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d" on_node="apache-up001.ring0" call-id="16" rc-code="7" op-status="0" interval="0" last-ru
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </lrm_resource>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up001.ring0/crmd/112, version=0.72.19)
> May 18 23:17:34 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.72.19 to 0.72.18
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: match_graph_event: Action clvmd_monitor_0 (3) confirmed on apache-up001.ring0 (rc=7)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.72.19 2
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.72.20 (null)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=20
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib/status/node_state[@id='2']/lrm[@id='2']/lrm_resources/lrm_resource[@id='dlm']/lrm_rsc_op[@id='dlm_last_0']: @operation_key=dlm_start_0, @operation=start, @transition-key=10:2:0:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d, @transition-magic=0:6;10:2:0:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d, @call-id=16, @rc-code=6, @exec-time=102
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/status/node_state[@id='2']/lrm[@id='2']/lrm_resources/lrm_resource[@id='dlm']: <lrm_rsc_op id="dlm_last_failure_0" operation_key="dlm_start_0" operation="start" crm-debug-origin="do_update_resource" crm_feature_set="3.0.10" transition-key="10:2:0:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d" transition-magic="0:6;10:2:0:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d" on_node="apache-up002.ring0" call-id="16" rc-code="6" op-status=
> May 18 23:17:34 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.72.20 to 0.72.19
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: warning: status_from_rc: Action 10 (dlm:0_start_0) on apache-up002.ring0 failed (target: 0 vs. rc: 6): Error
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: notice: abort_transition_graph: Transition aborted by dlm_start_0 'modify' on apache-up002.ring0: Event failed (magic=0:6;10:2:0:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d, cib=0.72.20, source=match_graph_event:381, 0)
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: match_graph_event: Action dlm_start_0 (10) confirmed on apache-up002.ring0 (rc=6)
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: update_failcount: Updating failcount for dlm on apache-up002.ring0 after failed start: rc=6 (update=INFINITY, time=1463606254)
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: process_graph_event: Detected action (2.10) dlm_start_0.16=not configured: failed
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: warning: status_from_rc: Action 10 (dlm:0_start_0) on apache-up002.ring0 failed (target: 0 vs. rc: 6): Error
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: abort_transition_graph: Transition aborted by dlm_start_0 'create' on apache-up002.ring0: Event failed (magic=0:6;10:2:0:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d, cib=0.72.20, source=match_graph_event:381, 0)
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: match_graph_event: Action dlm_start_0 (10) confirmed on apache-up002.ring0 (rc=6)
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: update_failcount: Updating failcount for dlm on apache-up002.ring0 after failed start: rc=6 (update=INFINITY, time=1463606254)
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: process_graph_event: Detected action (2.10) dlm_start_0.16=not configured: failed
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up002.ring0/crmd/18, version=0.72.20)
> May 18 23:17:34 [2528] apache-up001.itc4u.local attrd: info: attrd_peer_update: Setting fail-count-dlm[apache-up002.ring0]: (null) -> INFINITY from apache-up001.ring0
> May 18 23:17:34 [2528] apache-up001.itc4u.local attrd: info: write_attribute: Sent update 10 with 1 changes for fail-count-dlm, id=<n/a>, set=(null)
> May 18 23:17:34 [2528] apache-up001.itc4u.local attrd: info: attrd_peer_update: Setting last-failure-dlm[apache-up002.ring0]: (null) -> 1463606254 from apache-up001.ring0
> May 18 23:17:34 [2528] apache-up001.itc4u.local attrd: info: write_attribute: Sent update 11 with 1 changes for last-failure-dlm, id=<n/a>, set=(null)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section status to master (origin=local/attrd/10)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section status to master (origin=local/attrd/11)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.72.20 2
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.72.21 (null)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=21
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/status/node_state[@id='2']/transient_attributes[@id='2']/instance_attributes[@id='status-2']: <nvpair id="status-2-fail-count-dlm" name="fail-count-dlm" value="INFINITY"/>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up001.ring0/attrd/10, version=0.72.21)
> May 18 23:17:34 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.72.21 to 0.72.20
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.72.21 2
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.72.22 (null)
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=22
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/status/node_state[@id='2']/transient_attributes[@id='2']/instance_attributes[@id='status-2']: <nvpair id="status-2-last-failure-dlm" name="last-failure-dlm" value="1463606254"/>
> May 18 23:17:34 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up001.ring0/attrd/11, version=0.72.22)
> May 18 23:17:34 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 10 for fail-count-dlm: OK (0)
> May 18 23:17:34 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 10 for fail-count-dlm[apache-up002.ring0]=INFINITY: OK (0)
> May 18 23:17:34 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 11 for last-failure-dlm: OK (0)
> May 18 23:17:34 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 11 for last-failure-dlm[apache-up002.ring0]=1463606254: OK (0)
> May 18 23:17:34 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.72.22 to 0.72.21
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: notice: abort_transition_graph: Transition aborted by status-2-fail-count-dlm, fail-count-dlm=INFINITY: Transient attribute change (create cib=0.72.21, source=abort_unless_down:329, path=/cib/status/node_state[@id='2']/transient_attributes[@id='2']/instance_attributes[@id='status-2'], 0)
> May 18 23:17:34 [2530] apache-up001.itc4u.local crmd: info: abort_transition_graph: Transition aborted by status-2-last-failure-dlm, last-failure-dlm=1463606254: Transient attribute change (create cib=0.72.22, source=abort_unless_down:329, path=/cib/status/node_state[@id='2']/transient_attributes[@id='2']/instance_attributes[@id='status-2'], 0)
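[Editor's note: dlm's start returned rc=6, "not configured" — the controld agent refuses to start while fencing is disabled — and the fail-count was pushed to INFINITY, which bans dlm from that node. After the fencing configuration is fixed, the recorded failure has to be cleared before the cluster will retry; a sketch, again assuming pcs:]

```shell
# Clear the recorded start failure and failcount so dlm is retried
pcs resource cleanup dlm
```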
> May 18 23:17:35 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_action_async_done: Child process 6673 performing action 'monitor' exited with rc 1
> May 18 23:17:35 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[6673] stderr: [ Failed: Cannot open file "/var/run/cluster/fence_scsi.key" ]
> May 18 23:17:35 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[6673] stderr: [ ]
> May 18 23:17:35 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[6673] stderr: [ Please use '-h' for usage ]
> May 18 23:17:35 [2526] apache-up001.itc4u.local stonith-ng: warning: log_action: fence_scsi[6673] stderr: [ ]
> May 18 23:17:35 [2526] apache-up001.itc4u.local stonith-ng: info: update_remaining_timeout: Attempted to execute agent fence_scsi (monitor) the maximum number of times (2) allowed
> May 18 23:17:35 [2526] apache-up001.itc4u.local stonith-ng: debug: st_child_done: Operation 'monitor' on 'scsi' completed with rc=-201 (0 remaining)
> May 18 23:17:35 [2526] apache-up001.itc4u.local stonith-ng: notice: log_operation: Operation 'monitor' [6673] for device 'scsi' returned: -201 (Generic Pacemaker error)
> May 18 23:17:35 [2527] apache-up001.itc4u.local lrmd: info: log_finished: finished - rsc:scsi action:start call_id:6 exit-code:1 exec-time:1129ms queue-time:0ms
> May 18 23:17:36 [2530] apache-up001.itc4u.local crmd: error: process_lrm_event: Operation scsi_start_0 (node=apache-up001.ring0, call=6, status=4, cib-update=113, confirmed=true) Error
> May 18 23:17:36 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section status to master (origin=local/crmd/113)
> May 18 23:17:36 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.72.22 2
> May 18 23:17:36 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.72.23 (null)
> May 18 23:17:36 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=23
> May 18 23:17:36 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib/status/node_state[@id='1']/lrm[@id='1']/lrm_resources/lrm_resource[@id='scsi']/lrm_rsc_op[@id='scsi_last_0']: @operation_key=scsi_start_0, @operation=start, @crm-debug-origin=do_update_resource, @transition-key=8:2:0:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d, @transition-magic=4:1;8:2:0:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d, @call-id=6, @rc-code=1, @op-status=4, @last-run=1463606254, @last-rc-change=1463606254, @exec-t
> May 18 23:17:36 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/status/node_state[@id='1']/lrm[@id='1']/lrm_resources/lrm_resource[@id='scsi']: <lrm_rsc_op id="scsi_last_failure_0" operation_key="scsi_start_0" operation="start" crm-debug-origin="do_update_resource" crm_feature_set="3.0.10" transition-key="8:2:0:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d" transition-magic="4:1;8:2:0:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d" on_node="apache-up001.ring0" call-id="6" rc-code="1" op-status=
> May 18 23:17:36 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up001.ring0/crmd/113, version=0.72.23)
> May 18 23:17:36 [2530] apache-up001.itc4u.local crmd: warning: status_from_rc: Action 8 (scsi_start_0) on apache-up001.ring0 failed (target: 0 vs. rc: 1): Error
> May 18 23:17:36 [2530] apache-up001.itc4u.local crmd: info: abort_transition_graph: Transition aborted by scsi_start_0 'modify' on apache-up001.ring0: Event failed (magic=4:1;8:2:0:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d, cib=0.72.23, source=match_graph_event:381, 0)
> May 18 23:17:36 [2530] apache-up001.itc4u.local crmd: info: match_graph_event: Action scsi_start_0 (8) confirmed on apache-up001.ring0 (rc=1)
> May 18 23:17:36 [2530] apache-up001.itc4u.local crmd: info: update_failcount: Updating failcount for scsi on apache-up001.ring0 after failed start: rc=1 (update=INFINITY, time=1463606256)
> May 18 23:17:36 [2530] apache-up001.itc4u.local crmd: info: process_graph_event: Detected action (2.8) scsi_start_0.6=unknown error: failed
> May 18 23:17:36 [2530] apache-up001.itc4u.local crmd: warning: status_from_rc: Action 8 (scsi_start_0) on apache-up001.ring0 failed (target: 0 vs. rc: 1): Error
> May 18 23:17:36 [2530] apache-up001.itc4u.local crmd: info: abort_transition_graph: Transition aborted by scsi_start_0 'create' on apache-up001.ring0: Event failed (magic=4:1;8:2:0:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d, cib=0.72.23, source=match_graph_event:381, 0)
> May 18 23:17:36 [2530] apache-up001.itc4u.local crmd: info: match_graph_event: Action scsi_start_0 (8) confirmed on apache-up001.ring0 (rc=1)
> May 18 23:17:36 [2530] apache-up001.itc4u.local crmd: info: update_failcount: Updating failcount for scsi on apache-up001.ring0 after failed start: rc=1 (update=INFINITY, time=1463606256)
> May 18 23:17:36 [2530] apache-up001.itc4u.local crmd: info: process_graph_event: Detected action (2.8) scsi_start_0.6=unknown error: failed
> May 18 23:17:36 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.72.23 to 0.72.22
> May 18 23:17:36 [2530] apache-up001.itc4u.local crmd: notice: run_graph: Transition 2 (Complete=9, Pending=0, Fired=0, Skipped=0, Incomplete=15, Source=/var/lib/pacemaker/pengine/pe-input-124.bz2): Complete
> May 18 23:17:36 [2530] apache-up001.itc4u.local crmd: notice: too_many_st_failures: No devices found in cluster to fence apache-up001.ring0, giving up
> May 18 23:17:36 [2530] apache-up001.itc4u.local crmd: info: do_log: FSA: Input I_TE_SUCCESS from notify_crmd() received in state S_TRANSITION_ENGINE
> May 18 23:17:36 [2530] apache-up001.itc4u.local crmd: notice: do_state_transition: State transition S_TRANSITION_ENGINE -> S_IDLE [ input=I_TE_SUCCESS cause=C_FSA_INTERNAL origin=notify_crmd ]
> May 18 23:17:36 [2528] apache-up001.itc4u.local attrd: info: attrd_peer_update: Setting fail-count-scsi[apache-up001.ring0]: (null) -> INFINITY from apache-up001.ring0
> May 18 23:17:36 [2528] apache-up001.itc4u.local attrd: info: write_attribute: Sent update 12 with 1 changes for fail-count-scsi, id=<n/a>, set=(null)
> May 18 23:17:36 [2528] apache-up001.itc4u.local attrd: info: attrd_peer_update: Setting last-failure-scsi[apache-up001.ring0]: (null) -> 1463606256 from apache-up001.ring0
> May 18 23:17:36 [2528] apache-up001.itc4u.local attrd: info: write_attribute: Sent update 13 with 1 changes for last-failure-scsi, id=<n/a>, set=(null)
> May 18 23:17:36 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section status to master (origin=local/attrd/12)
> May 18 23:17:36 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section status to master (origin=local/attrd/13)
> May 18 23:17:36 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.72.23 2
> May 18 23:17:36 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.72.24 (null)
> May 18 23:17:36 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=24
> May 18 23:17:36 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/status/node_state[@id='1']/transient_attributes[@id='1']/instance_attributes[@id='status-1']: <nvpair id="status-1-fail-count-scsi" name="fail-count-scsi" value="INFINITY"/>
> May 18 23:17:36 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up001.ring0/attrd/12, version=0.72.24)
> May 18 23:17:36 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.72.24 2
> May 18 23:17:36 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.72.25 (null)
> May 18 23:17:36 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=25
> May 18 23:17:36 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/status/node_state[@id='1']/transient_attributes[@id='1']/instance_attributes[@id='status-1']: <nvpair id="status-1-last-failure-scsi" name="last-failure-scsi" value="1463606256"/>
> May 18 23:17:36 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.72.24 to 0.72.23
> May 18 23:17:36 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up001.ring0/attrd/13, version=0.72.25)
> May 18 23:17:36 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 12 for fail-count-scsi: OK (0)
> May 18 23:17:36 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 12 for fail-count-scsi[apache-up001.ring0]=INFINITY: OK (0)
> May 18 23:17:36 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 13 for last-failure-scsi: OK (0)
> May 18 23:17:36 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 13 for last-failure-scsi[apache-up001.ring0]=1463606256: OK (0)
> May 18 23:17:36 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.72.25 to 0.72.24
> May 18 23:17:36 [2530] apache-up001.itc4u.local crmd: info: abort_transition_graph: Transition aborted by status-1-fail-count-scsi, fail-count-scsi=INFINITY: Transient attribute change (create cib=0.72.24, source=abort_unless_down:329, path=/cib/status/node_state[@id='1']/transient_attributes[@id='1']/instance_attributes[@id='status-1'], 1)
> May 18 23:17:36 [2530] apache-up001.itc4u.local crmd: info: abort_transition_graph: Transition aborted by status-1-last-failure-scsi, last-failure-scsi=1463606256: Transient attribute change (create cib=0.72.25, source=abort_unless_down:329, path=/cib/status/node_state[@id='1']/transient_attributes[@id='1']/instance_attributes[@id='status-1'], 1)
> May 18 23:17:36 [2530] apache-up001.itc4u.local crmd: notice: do_state_transition: State transition S_IDLE -> S_POLICY_ENGINE [ input=I_PE_CALC cause=C_FSA_INTERNAL origin=abort_transition_graph ]
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: warning: common_unpack: dlm-clone requires (un)fencing but fencing is disabled
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: warning: common_unpack: dlm:0 requires (un)fencing but fencing is disabled
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: warning: common_unpack: dlm:1 requires (un)fencing but fencing is disabled
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: warning: common_unpack: dlm:2 requires (un)fencing but fencing is disabled
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: warning: common_unpack: clvmd-clone requires (un)fencing but fencing is disabled
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: warning: common_unpack: clvmd:0 requires (un)fencing but fencing is disabled
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: warning: common_unpack: clvmd:1 requires (un)fencing but fencing is disabled
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: warning: common_unpack: clvmd:2 requires (un)fencing but fencing is disabled
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: info: determine_online_status: Node apache-up001.ring0 is online
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: info: determine_online_status: Node apache-up003.ring0 is online
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: info: determine_online_status: Node apache-up002.ring0 is online
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: warning: unpack_rsc_op_failure: Processing failed op start for scsi on apache-up001.ring0: unknown error (1)
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: warning: unpack_rsc_op_failure: Processing failed op start for scsi on apache-up001.ring0: unknown error (1)
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: warning: unpack_rsc_op_failure: Processing failed op start for dlm:0 on apache-up002.ring0: not configured (6)
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: error: unpack_rsc_op: Preventing dlm-clone from re-starting anywhere: operation start failed 'not configured' (6)
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: warning: unpack_rsc_op_failure: Processing failed op start for dlm:0 on apache-up002.ring0: not configured (6)
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: error: unpack_rsc_op: Preventing dlm-clone from re-starting anywhere: operation start failed 'not configured' (6)
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: info: native_print: scsi (stonith:fence_scsi): FAILED apache-up001.ring0
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: info: clone_print: Clone Set: dlm-clone [dlm]
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: info: native_print: dlm (ocf::pacemaker:controld): FAILED apache-up002.ring0
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: info: short_print: Stopped: [ apache-up001.ring0 apache-up003.ring0 ]
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: info: clone_print: Clone Set: clvmd-clone [clvmd]
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: info: short_print: Stopped: [ apache-up001.ring0 apache-up002.ring0 apache-up003.ring0 ]
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: info: get_failcount_full: scsi has failed INFINITY times on apache-up001.ring0
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: warning: common_apply_stickiness: Forcing scsi away from apache-up001.ring0 after 1000000 failures (max=1000000)
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: info: get_failcount_full: dlm:0 has failed INFINITY times on apache-up002.ring0
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: warning: common_apply_stickiness: Forcing dlm-clone away from apache-up002.ring0 after 1000000 failures (max=1000000)
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: info: get_failcount_full: dlm-clone has failed INFINITY times on apache-up002.ring0
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: warning: common_apply_stickiness: Forcing dlm-clone away from apache-up002.ring0 after 1000000 failures (max=1000000)
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: info: get_failcount_full: dlm-clone has failed INFINITY times on apache-up002.ring0
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: warning: common_apply_stickiness: Forcing dlm-clone away from apache-up002.ring0 after 1000000 failures (max=1000000)
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: info: rsc_merge_weights: dlm-clone: Rolling back scores from clvmd-clone
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: info: native_color: Resource dlm:1 cannot run anywhere
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: info: native_color: Resource dlm:2 cannot run anywhere
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: info: native_color: Resource dlm:0 cannot run anywhere
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: notice: clone_rsc_colocation_rh: Cannot pair clvmd:0 with instance of dlm-clone
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: info: native_color: Resource clvmd:0 cannot run anywhere
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: notice: clone_rsc_colocation_rh: Cannot pair clvmd:1 with instance of dlm-clone
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: info: native_color: Resource clvmd:1 cannot run anywhere
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: notice: clone_rsc_colocation_rh: Cannot pair clvmd:2 with instance of dlm-clone
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: info: native_color: Resource clvmd:2 cannot run anywhere
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: info: RecurringOp: Start recurring monitor (60s) for scsi on apache-up002.ring0
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: info: clone_update_actions_interleave: Inhibiting clvmd:0 from being active
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: info: clone_update_actions_interleave: Inhibiting clvmd:1 from being active
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: info: clone_update_actions_interleave: Inhibiting clvmd:2 from being active
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: info: clone_update_actions_interleave: Inhibiting clvmd:0 from being active
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: info: clone_update_actions_interleave: Inhibiting clvmd:1 from being active
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: info: clone_update_actions_interleave: Inhibiting clvmd:2 from being active
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: info: clone_update_actions_interleave: Inhibiting clvmd:0 from being active
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: info: clone_update_actions_interleave: Inhibiting clvmd:1 from being active
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: info: clone_update_actions_interleave: Inhibiting clvmd:2 from being active
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: info: clone_update_actions_interleave: Inhibiting clvmd:0 from being active
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: info: clone_update_actions_interleave: Inhibiting clvmd:1 from being active
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: info: clone_update_actions_interleave: Inhibiting clvmd:2 from being active
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: notice: LogActions: Recover scsi (Started apache-up001.ring0 -> apache-up002.ring0)
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: notice: LogActions: Stop dlm:0 (apache-up002.ring0)
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: info: LogActions: Leave dlm:1 (Stopped)
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: info: LogActions: Leave dlm:2 (Stopped)
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: info: LogActions: Leave clvmd:0 (Stopped)
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: info: LogActions: Leave clvmd:1 (Stopped)
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: info: LogActions: Leave clvmd:2 (Stopped)
> May 18 23:17:36 [2529] apache-up001.itc4u.local pengine: notice: process_pe_message: Calculated Transition 3: /var/lib/pacemaker/pengine/pe-input-125.bz2
> May 18 23:17:36 [2530] apache-up001.itc4u.local crmd: info: do_state_transition: State transition S_POLICY_ENGINE -> S_TRANSITION_ENGINE [ input=I_PE_SUCCESS cause=C_IPC_MESSAGE origin=handle_response ]
> May 18 23:17:36 [2530] apache-up001.itc4u.local crmd: info: do_te_invoke: Processing graph 3 (ref=pe_calc-dc-1463606256-53) derived from /var/lib/pacemaker/pengine/pe-input-125.bz2
> May 18 23:17:36 [2530] apache-up001.itc4u.local crmd: notice: te_rsc_command: Initiating action 1: stop scsi_stop_0 on apache-up001.ring0 (local)
> May 18 23:17:36 [2530] apache-up001.itc4u.local crmd: info: do_lrm_rsc_op: Performing key=1:3:0:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d op=scsi_stop_0
> May 18 23:17:36 [2527] apache-up001.itc4u.local lrmd: info: log_execute: executing - rsc:scsi action:stop call_id:17
> May 18 23:17:36 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processing st_device_remove 5 from lrmd.2527 ( 1000)
> May 18 23:17:36 [2530] apache-up001.itc4u.local crmd: notice: te_rsc_command: Initiating action 2: stop dlm_stop_0 on apache-up002.ring0
> May 18 23:17:36 [2526] apache-up001.itc4u.local stonith-ng: debug: stonith_command: Processed st_device_remove from lrmd.2527: OK (0)
> May 18 23:17:36 [2527] apache-up001.itc4u.local lrmd: info: log_finished: finished - rsc:scsi action:stop call_id:17 exit-code:0 exec-time:0ms queue-time:0ms
> May 18 23:17:36 [2530] apache-up001.itc4u.local crmd: notice: process_lrm_event: Operation scsi_stop_0: ok (node=apache-up001.ring0, call=17, rc=0, cib-update=116, confirmed=true)
> May 18 23:17:36 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section status to master (origin=local/crmd/116)
> May 18 23:17:36 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.72.25 2
> May 18 23:17:36 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.72.26 (null)
> May 18 23:17:36 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=26
> May 18 23:17:36 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib/status/node_state[@id='1']/lrm[@id='1']/lrm_resources/lrm_resource[@id='scsi']/lrm_rsc_op[@id='scsi_last_0']: @operation_key=scsi_stop_0, @operation=stop, @transition-key=1:3:0:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d, @transition-magic=0:0;1:3:0:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d, @call-id=17, @rc-code=0, @op-status=0, @last-run=1463606256, @last-rc-change=1463606256, @exec-time=0
> May 18 23:17:36 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.72.26 to 0.72.25
> May 18 23:17:36 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up001.ring0/crmd/116, version=0.72.26)
> May 18 23:17:36 [2530] apache-up001.itc4u.local crmd: info: match_graph_event: Action scsi_stop_0 (1) confirmed on apache-up001.ring0 (rc=0)
> May 18 23:17:36 [2530] apache-up001.itc4u.local crmd: notice: te_rsc_command: Initiating action 4: start scsi_start_0 on apache-up002.ring0
> May 18 23:17:36 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.72.26 2
> May 18 23:17:36 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.72.27 (null)
> May 18 23:17:36 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=27
> May 18 23:17:36 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib/status/node_state[@id='2']/lrm[@id='2']/lrm_resources/lrm_resource[@id='dlm']/lrm_rsc_op[@id='dlm_last_0']: @operation_key=dlm_stop_0, @operation=stop, @transition-key=2:3:0:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d, @transition-magic=0:0;2:3:0:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d, @call-id=17, @rc-code=0, @last-run=1463606256, @last-rc-change=1463606256, @exec-time=20
> May 18 23:17:36 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.72.27 to 0.72.26
> May 18 23:17:36 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up002.ring0/crmd/19, version=0.72.27)
> May 18 23:17:36 [2530] apache-up001.itc4u.local crmd: info: match_graph_event: Action dlm_stop_0 (2) confirmed on apache-up002.ring0 (rc=0)
> May 18 23:17:38 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.72.27 2
> May 18 23:17:38 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.72.28 (null)
> May 18 23:17:38 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=28
> May 18 23:17:38 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib/status/node_state[@id='2']/lrm[@id='2']/lrm_resources/lrm_resource[@id='scsi']/lrm_rsc_op[@id='scsi_last_0']: @operation_key=scsi_start_0, @operation=start, @crm-debug-origin=do_update_resource, @transition-key=4:3:0:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d, @transition-magic=4:1;4:3:0:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d, @call-id=18, @rc-code=1, @op-status=4, @last-run=1463606256, @last-rc-change=1463606256, @exec-
> May 18 23:17:38 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/status/node_state[@id='2']/lrm[@id='2']/lrm_resources/lrm_resource[@id='scsi']: <lrm_rsc_op id="scsi_last_failure_0" operation_key="scsi_start_0" operation="start" crm-debug-origin="do_update_resource" crm_feature_set="3.0.10" transition-key="4:3:0:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d" transition-magic="4:1;4:3:0:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d" on_node="apache-up002.ring0" call-id="18" rc-code="1" op-status
> May 18 23:17:38 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.72.28 to 0.72.27
> May 18 23:17:38 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up002.ring0/crmd/20, version=0.72.28)
> May 18 23:17:38 [2530] apache-up001.itc4u.local crmd: warning: status_from_rc: Action 4 (scsi_start_0) on apache-up002.ring0 failed (target: 0 vs. rc: 1): Error
> May 18 23:17:38 [2530] apache-up001.itc4u.local crmd: notice: abort_transition_graph: Transition aborted by scsi_start_0 'modify' on apache-up002.ring0: Event failed (magic=4:1;4:3:0:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d, cib=0.72.28, source=match_graph_event:381, 0)
> May 18 23:17:38 [2530] apache-up001.itc4u.local crmd: info: match_graph_event: Action scsi_start_0 (4) confirmed on apache-up002.ring0 (rc=1)
> May 18 23:17:38 [2530] apache-up001.itc4u.local crmd: info: update_failcount: Updating failcount for scsi on apache-up002.ring0 after failed start: rc=1 (update=INFINITY, time=1463606258)
> May 18 23:17:38 [2530] apache-up001.itc4u.local crmd: info: process_graph_event: Detected action (3.4) scsi_start_0.18=unknown error: failed
> May 18 23:17:38 [2530] apache-up001.itc4u.local crmd: warning: status_from_rc: Action 4 (scsi_start_0) on apache-up002.ring0 failed (target: 0 vs. rc: 1): Error
> May 18 23:17:38 [2530] apache-up001.itc4u.local crmd: info: abort_transition_graph: Transition aborted by scsi_start_0 'create' on apache-up002.ring0: Event failed (magic=4:1;4:3:0:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d, cib=0.72.28, source=match_graph_event:381, 0)
> May 18 23:17:38 [2530] apache-up001.itc4u.local crmd: info: match_graph_event: Action scsi_start_0 (4) confirmed on apache-up002.ring0 (rc=1)
> May 18 23:17:38 [2530] apache-up001.itc4u.local crmd: info: update_failcount: Updating failcount for scsi on apache-up002.ring0 after failed start: rc=1 (update=INFINITY, time=1463606258)
> May 18 23:17:38 [2530] apache-up001.itc4u.local crmd: info: process_graph_event: Detected action (3.4) scsi_start_0.18=unknown error: failed
> May 18 23:17:38 [2530] apache-up001.itc4u.local crmd: notice: run_graph: Transition 3 (Complete=6, Pending=0, Fired=0, Skipped=0, Incomplete=1, Source=/var/lib/pacemaker/pengine/pe-input-125.bz2): Complete
> May 18 23:17:38 [2530] apache-up001.itc4u.local crmd: notice: too_many_st_failures: No devices found in cluster to fence apache-up001.ring0, giving up
> May 18 23:17:38 [2530] apache-up001.itc4u.local crmd: info: do_log: FSA: Input I_TE_SUCCESS from notify_crmd() received in state S_TRANSITION_ENGINE
> May 18 23:17:38 [2530] apache-up001.itc4u.local crmd: notice: do_state_transition: State transition S_TRANSITION_ENGINE -> S_IDLE [ input=I_TE_SUCCESS cause=C_FSA_INTERNAL origin=notify_crmd ]
> May 18 23:17:38 [2528] apache-up001.itc4u.local attrd: info: attrd_peer_update: Setting fail-count-scsi[apache-up002.ring0]: (null) -> INFINITY from apache-up001.ring0
> May 18 23:17:38 [2528] apache-up001.itc4u.local attrd: info: write_attribute: Sent update 14 with 2 changes for fail-count-scsi, id=<n/a>, set=(null)
> May 18 23:17:38 [2528] apache-up001.itc4u.local attrd: info: attrd_peer_update: Setting last-failure-scsi[apache-up002.ring0]: (null) -> 1463606258 from apache-up001.ring0
> May 18 23:17:38 [2528] apache-up001.itc4u.local attrd: info: write_attribute: Sent update 15 with 2 changes for last-failure-scsi, id=<n/a>, set=(null)
> May 18 23:17:38 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section status to master (origin=local/attrd/14)
> May 18 23:17:38 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section status to master (origin=local/attrd/15)
> May 18 23:17:38 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.72.28 2
> May 18 23:17:38 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.72.29 (null)
> May 18 23:17:38 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=29
> May 18 23:17:38 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/status/node_state[@id='2']/transient_attributes[@id='2']/instance_attributes[@id='status-2']: <nvpair id="status-2-fail-count-scsi" name="fail-count-scsi" value="INFINITY"/>
> May 18 23:17:38 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up001.ring0/attrd/14, version=0.72.29)
> May 18 23:17:38 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 14 for fail-count-scsi: OK (0)
> May 18 23:17:38 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 14 for fail-count-scsi[apache-up001.ring0]=INFINITY: OK (0)
> May 18 23:17:38 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 14 for fail-count-scsi[apache-up002.ring0]=INFINITY: OK (0)
> May 18 23:17:38 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.72.29 to 0.72.28
> May 18 23:17:38 [2530] apache-up001.itc4u.local crmd: info: abort_transition_graph: Transition aborted by status-2-fail-count-scsi, fail-count-scsi=INFINITY: Transient attribute change (create cib=0.72.29, source=abort_unless_down:329, path=/cib/status/node_state[@id='2']/transient_attributes[@id='2']/instance_attributes[@id='status-2'], 1)
> May 18 23:17:38 [2530] apache-up001.itc4u.local crmd: notice: do_state_transition: State transition S_IDLE -> S_POLICY_ENGINE [ input=I_PE_CALC cause=C_FSA_INTERNAL origin=abort_transition_graph ]
> May 18 23:17:38 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.72.29 2
> May 18 23:17:38 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.72.30 (null)
> May 18 23:17:38 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=30
> May 18 23:17:38 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/status/node_state[@id='2']/transient_attributes[@id='2']/instance_attributes[@id='status-2']: <nvpair id="status-2-last-failure-scsi" name="last-failure-scsi" value="1463606258"/>
> May 18 23:17:38 [2530] apache-up001.itc4u.local crmd: info: abort_transition_graph: Transition aborted by status-2-last-failure-scsi, last-failure-scsi=1463606258: Transient attribute change (create cib=0.72.30, source=abort_unless_down:329, path=/cib/status/node_state[@id='2']/transient_attributes[@id='2']/instance_attributes[@id='status-2'], 1)
> May 18 23:17:38 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.72.30 to 0.72.29
> May 18 23:17:38 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up001.ring0/attrd/15, version=0.72.30)
> May 18 23:17:38 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 15 for last-failure-scsi: OK (0)
> May 18 23:17:38 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 15 for last-failure-scsi[apache-up001.ring0]=1463606256: OK (0)
> May 18 23:17:38 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 15 for last-failure-scsi[apache-up002.ring0]=1463606258: OK (0)
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: warning: common_unpack: dlm-clone requires (un)fencing but fencing is disabled
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: warning: common_unpack: dlm:0 requires (un)fencing but fencing is disabled
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: warning: common_unpack: dlm:1 requires (un)fencing but fencing is disabled
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: warning: common_unpack: dlm:2 requires (un)fencing but fencing is disabled
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: warning: common_unpack: clvmd-clone requires (un)fencing but fencing is disabled
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: warning: common_unpack: clvmd:0 requires (un)fencing but fencing is disabled
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: warning: common_unpack: clvmd:1 requires (un)fencing but fencing is disabled
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: warning: common_unpack: clvmd:2 requires (un)fencing but fencing is disabled
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: info: determine_online_status: Node apache-up001.ring0 is online
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: info: determine_online_status: Node apache-up003.ring0 is online
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: info: determine_online_status: Node apache-up002.ring0 is online
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: warning: unpack_rsc_op_failure: Processing failed op start for scsi on apache-up001.ring0: unknown error (1)
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: warning: unpack_rsc_op_failure: Processing failed op start for scsi on apache-up002.ring0: unknown error (1)
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: warning: unpack_rsc_op_failure: Processing failed op start for scsi on apache-up002.ring0: unknown error (1)
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: warning: unpack_rsc_op_failure: Processing failed op start for dlm:0 on apache-up002.ring0: not configured (6)
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: error: unpack_rsc_op: Preventing dlm-clone from re-starting anywhere: operation start failed 'not configured' (6)
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: info: native_print: scsi (stonith:fence_scsi): FAILED apache-up002.ring0
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: info: clone_print: Clone Set: dlm-clone [dlm]
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: info: short_print: Stopped: [ apache-up001.ring0 apache-up002.ring0 apache-up003.ring0 ]
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: info: clone_print: Clone Set: clvmd-clone [clvmd]
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: info: short_print: Stopped: [ apache-up001.ring0 apache-up002.ring0 apache-up003.ring0 ]
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: info: get_failcount_full: scsi has failed INFINITY times on apache-up001.ring0
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: warning: common_apply_stickiness: Forcing scsi away from apache-up001.ring0 after 1000000 failures (max=1000000)
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: info: get_failcount_full: scsi has failed INFINITY times on apache-up002.ring0
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: warning: common_apply_stickiness: Forcing scsi away from apache-up002.ring0 after 1000000 failures (max=1000000)
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: info: get_failcount_full: dlm:0 has failed INFINITY times on apache-up002.ring0
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: warning: common_apply_stickiness: Forcing dlm-clone away from apache-up002.ring0 after 1000000 failures (max=1000000)
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: info: get_failcount_full: dlm-clone has failed INFINITY times on apache-up002.ring0
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: warning: common_apply_stickiness: Forcing dlm-clone away from apache-up002.ring0 after 1000000 failures (max=1000000)
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: info: get_failcount_full: dlm-clone has failed INFINITY times on apache-up002.ring0
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: warning: common_apply_stickiness: Forcing dlm-clone away from apache-up002.ring0 after 1000000 failures (max=1000000)
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: info: rsc_merge_weights: dlm-clone: Rolling back scores from clvmd-clone
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: info: native_color: Resource dlm:0 cannot run anywhere
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: info: native_color: Resource dlm:1 cannot run anywhere
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: info: native_color: Resource dlm:2 cannot run anywhere
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: notice: clone_rsc_colocation_rh: Cannot pair clvmd:0 with instance of dlm-clone
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: info: native_color: Resource clvmd:0 cannot run anywhere
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: notice: clone_rsc_colocation_rh: Cannot pair clvmd:1 with instance of dlm-clone
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: info: native_color: Resource clvmd:1 cannot run anywhere
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: notice: clone_rsc_colocation_rh: Cannot pair clvmd:2 with instance of dlm-clone
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: info: native_color: Resource clvmd:2 cannot run anywhere
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: info: RecurringOp: Start recurring monitor (60s) for scsi on apache-up003.ring0
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: info: clone_update_actions_interleave: Inhibiting clvmd:0 from being active
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: info: clone_update_actions_interleave: Inhibiting clvmd:1 from being active
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: info: clone_update_actions_interleave: Inhibiting clvmd:2 from being active
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: info: clone_update_actions_interleave: Inhibiting clvmd:0 from being active
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: info: clone_update_actions_interleave: Inhibiting clvmd:1 from being active
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: info: clone_update_actions_interleave: Inhibiting clvmd:2 from being active
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: info: clone_update_actions_interleave: Inhibiting clvmd:0 from being active
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: info: clone_update_actions_interleave: Inhibiting clvmd:1 from being active
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: info: clone_update_actions_interleave: Inhibiting clvmd:2 from being active
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: info: clone_update_actions_interleave: Inhibiting clvmd:0 from being active
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: info: clone_update_actions_interleave: Inhibiting clvmd:1 from being active
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: info: clone_update_actions_interleave: Inhibiting clvmd:2 from being active
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: notice: LogActions: Recover scsi (Started apache-up002.ring0 -> apache-up003.ring0)
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: info: LogActions: Leave dlm:0 (Stopped)
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: info: LogActions: Leave dlm:1 (Stopped)
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: info: LogActions: Leave dlm:2 (Stopped)
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: info: LogActions: Leave clvmd:0 (Stopped)
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: info: LogActions: Leave clvmd:1 (Stopped)
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: info: LogActions: Leave clvmd:2 (Stopped)
> May 18 23:17:38 [2529] apache-up001.itc4u.local pengine: notice: process_pe_message: Calculated Transition 4: /var/lib/pacemaker/pengine/pe-input-126.bz2
> May 18 23:17:38 [2530] apache-up001.itc4u.local crmd: info: do_state_transition: State transition S_POLICY_ENGINE -> S_TRANSITION_ENGINE [ input=I_PE_SUCCESS cause=C_IPC_MESSAGE origin=handle_response ]
> May 18 23:17:38 [2530] apache-up001.itc4u.local crmd: info: do_te_invoke: Processing graph 4 (ref=pe_calc-dc-1463606258-57) derived from /var/lib/pacemaker/pengine/pe-input-126.bz2
> May 18 23:17:38 [2530] apache-up001.itc4u.local crmd: notice: te_rsc_command: Initiating action 1: stop scsi_stop_0 on apache-up002.ring0
> May 18 23:17:38 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.72.30 2
> May 18 23:17:38 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.72.31 (null)
> May 18 23:17:38 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=31
> May 18 23:17:38 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib/status/node_state[@id='2']/lrm[@id='2']/lrm_resources/lrm_resource[@id='scsi']/lrm_rsc_op[@id='scsi_last_0']: @operation_key=scsi_stop_0, @operation=stop, @transition-key=1:4:0:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d, @transition-magic=0:0;1:4:0:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d, @call-id=19, @rc-code=0, @op-status=0, @last-run=1463606258, @last-rc-change=1463606258, @exec-time=0
> May 18 23:17:38 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.72.31 to 0.72.30
> May 18 23:17:38 [2530] apache-up001.itc4u.local crmd: info: match_graph_event: Action scsi_stop_0 (1) confirmed on apache-up002.ring0 (rc=0)
> May 18 23:17:38 [2530] apache-up001.itc4u.local crmd: notice: te_rsc_command: Initiating action 3: start scsi_start_0 on apache-up003.ring0
> May 18 23:17:38 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up002.ring0/crmd/21, version=0.72.31)
> May 18 23:17:40 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.72.31 2
> May 18 23:17:40 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.72.32 (null)
> May 18 23:17:40 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=32
> May 18 23:17:40 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib/status/node_state[@id='3']/lrm[@id='3']/lrm_resources/lrm_resource[@id='scsi']/lrm_rsc_op[@id='scsi_last_0']: @operation_key=scsi_start_0, @operation=start, @crm-debug-origin=do_update_resource, @transition-key=3:4:0:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d, @transition-magic=4:1;3:4:0:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d, @call-id=16, @rc-code=1, @op-status=4, @last-run=1463606258, @last-rc-change=1463606258, @exec-
> May 18 23:17:40 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/status/node_state[@id='3']/lrm[@id='3']/lrm_resources/lrm_resource[@id='scsi']: <lrm_rsc_op id="scsi_last_failure_0" operation_key="scsi_start_0" operation="start" crm-debug-origin="do_update_resource" crm_feature_set="3.0.10" transition-key="3:4:0:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d" transition-magic="4:1;3:4:0:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d" on_node="apache-up003.ring0" call-id="16" rc-code="1" op-status
> May 18 23:17:40 [2530] apache-up001.itc4u.local crmd: warning: status_from_rc: Action 3 (scsi_start_0) on apache-up003.ring0 failed (target: 0 vs. rc: 1): Error
> May 18 23:17:40 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.72.32 to 0.72.31
> May 18 23:17:40 [2530] apache-up001.itc4u.local crmd: notice: abort_transition_graph: Transition aborted by scsi_start_0 'modify' on apache-up003.ring0: Event failed (magic=4:1;3:4:0:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d, cib=0.72.32, source=match_graph_event:381, 0)
> May 18 23:17:40 [2530] apache-up001.itc4u.local crmd: info: match_graph_event: Action scsi_start_0 (3) confirmed on apache-up003.ring0 (rc=1)
> May 18 23:17:40 [2530] apache-up001.itc4u.local crmd: info: update_failcount: Updating failcount for scsi on apache-up003.ring0 after failed start: rc=1 (update=INFINITY, time=1463606260)
> May 18 23:17:40 [2530] apache-up001.itc4u.local crmd: info: process_graph_event: Detected action (4.3) scsi_start_0.16=unknown error: failed
> May 18 23:17:40 [2530] apache-up001.itc4u.local crmd: warning: status_from_rc: Action 3 (scsi_start_0) on apache-up003.ring0 failed (target: 0 vs. rc: 1): Error
> May 18 23:17:40 [2530] apache-up001.itc4u.local crmd: info: abort_transition_graph: Transition aborted by scsi_start_0 'create' on apache-up003.ring0: Event failed (magic=4:1;3:4:0:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d, cib=0.72.32, source=match_graph_event:381, 0)
> May 18 23:17:40 [2530] apache-up001.itc4u.local crmd: info: match_graph_event: Action scsi_start_0 (3) confirmed on apache-up003.ring0 (rc=1)
> May 18 23:17:40 [2530] apache-up001.itc4u.local crmd: info: update_failcount: Updating failcount for scsi on apache-up003.ring0 after failed start: rc=1 (update=INFINITY, time=1463606260)
> May 18 23:17:40 [2530] apache-up001.itc4u.local crmd: info: process_graph_event: Detected action (4.3) scsi_start_0.16=unknown error: failed
> May 18 23:17:40 [2530] apache-up001.itc4u.local crmd: notice: run_graph: Transition 4 (Complete=3, Pending=0, Fired=0, Skipped=0, Incomplete=1, Source=/var/lib/pacemaker/pengine/pe-input-126.bz2): Complete
> May 18 23:17:40 [2530] apache-up001.itc4u.local crmd: notice: too_many_st_failures: No devices found in cluster to fence apache-up001.ring0, giving up
> May 18 23:17:40 [2530] apache-up001.itc4u.local crmd: info: do_log: FSA: Input I_TE_SUCCESS from notify_crmd() received in state S_TRANSITION_ENGINE
> May 18 23:17:40 [2530] apache-up001.itc4u.local crmd: notice: do_state_transition: State transition S_TRANSITION_ENGINE -> S_IDLE [ input=I_TE_SUCCESS cause=C_FSA_INTERNAL origin=notify_crmd ]
> May 18 23:17:40 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up003.ring0/crmd/18, version=0.72.32)
> May 18 23:17:40 [2528] apache-up001.itc4u.local attrd: info: attrd_peer_update: Setting fail-count-scsi[apache-up003.ring0]: (null) -> INFINITY from apache-up001.ring0
> May 18 23:17:40 [2528] apache-up001.itc4u.local attrd: info: write_attribute: Sent update 16 with 3 changes for fail-count-scsi, id=<n/a>, set=(null)
> May 18 23:17:40 [2528] apache-up001.itc4u.local attrd: info: attrd_peer_update: Setting last-failure-scsi[apache-up003.ring0]: (null) -> 1463606260 from apache-up001.ring0
> May 18 23:17:40 [2528] apache-up001.itc4u.local attrd: info: write_attribute: Sent update 17 with 3 changes for last-failure-scsi, id=<n/a>, set=(null)
> May 18 23:17:40 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section status to master (origin=local/attrd/16)
> May 18 23:17:40 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section status to master (origin=local/attrd/17)
> May 18 23:17:40 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.72.32 2
> May 18 23:17:40 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.72.33 (null)
> May 18 23:17:40 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=33
> May 18 23:17:40 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/status/node_state[@id='3']/transient_attributes[@id='3']/instance_attributes[@id='status-3']: <nvpair id="status-3-fail-count-scsi" name="fail-count-scsi" value="INFINITY"/>
> May 18 23:17:40 [2530] apache-up001.itc4u.local crmd: info: abort_transition_graph: Transition aborted by status-3-fail-count-scsi, fail-count-scsi=INFINITY: Transient attribute change (create cib=0.72.33, source=abort_unless_down:329, path=/cib/status/node_state[@id='3']/transient_attributes[@id='3']/instance_attributes[@id='status-3'], 1)
> May 18 23:17:40 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.72.33 to 0.72.32
> May 18 23:17:40 [2530] apache-up001.itc4u.local crmd: notice: do_state_transition: State transition S_IDLE -> S_POLICY_ENGINE [ input=I_PE_CALC cause=C_FSA_INTERNAL origin=abort_transition_graph ]
> May 18 23:17:40 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up001.ring0/attrd/16, version=0.72.33)
> May 18 23:17:40 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 16 for fail-count-scsi: OK (0)
> May 18 23:17:40 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 16 for fail-count-scsi[apache-up001.ring0]=INFINITY: OK (0)
> May 18 23:17:40 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 16 for fail-count-scsi[apache-up002.ring0]=INFINITY: OK (0)
> May 18 23:17:40 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 16 for fail-count-scsi[apache-up003.ring0]=INFINITY: OK (0)
> May 18 23:17:40 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.72.33 2
> May 18 23:17:40 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.72.34 (null)
> May 18 23:17:40 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=34
> May 18 23:17:40 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/status/node_state[@id='3']/transient_attributes[@id='3']/instance_attributes[@id='status-3']: <nvpair id="status-3-last-failure-scsi" name="last-failure-scsi" value="1463606260"/>
> May 18 23:17:40 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.72.34 to 0.72.33
> May 18 23:17:40 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up001.ring0/attrd/17, version=0.72.34)
> May 18 23:17:40 [2530] apache-up001.itc4u.local crmd: info: abort_transition_graph: Transition aborted by status-3-last-failure-scsi, last-failure-scsi=1463606260: Transient attribute change (create cib=0.72.34, source=abort_unless_down:329, path=/cib/status/node_state[@id='3']/transient_attributes[@id='3']/instance_attributes[@id='status-3'], 1)
> May 18 23:17:40 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 17 for last-failure-scsi: OK (0)
> May 18 23:17:40 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 17 for last-failure-scsi[apache-up001.ring0]=1463606256: OK (0)
> May 18 23:17:40 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 17 for last-failure-scsi[apache-up002.ring0]=1463606258: OK (0)
> May 18 23:17:40 [2528] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 17 for last-failure-scsi[apache-up003.ring0]=1463606260: OK (0)
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: warning: common_unpack: dlm-clone requires (un)fencing but fencing is disabled
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: warning: common_unpack: dlm:0 requires (un)fencing but fencing is disabled
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: warning: common_unpack: dlm:1 requires (un)fencing but fencing is disabled
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: warning: common_unpack: dlm:2 requires (un)fencing but fencing is disabled
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: warning: common_unpack: clvmd-clone requires (un)fencing but fencing is disabled
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: warning: common_unpack: clvmd:0 requires (un)fencing but fencing is disabled
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: warning: common_unpack: clvmd:1 requires (un)fencing but fencing is disabled
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: warning: common_unpack: clvmd:2 requires (un)fencing but fencing is disabled
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: info: determine_online_status: Node apache-up001.ring0 is online
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: info: determine_online_status: Node apache-up003.ring0 is online
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: info: determine_online_status: Node apache-up002.ring0 is online
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: warning: unpack_rsc_op_failure: Processing failed op start for scsi on apache-up001.ring0: unknown error (1)
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: warning: unpack_rsc_op_failure: Processing failed op start for scsi on apache-up003.ring0: unknown error (1)
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: warning: unpack_rsc_op_failure: Processing failed op start for scsi on apache-up003.ring0: unknown error (1)
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: warning: unpack_rsc_op_failure: Processing failed op start for scsi on apache-up002.ring0: unknown error (1)
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: warning: unpack_rsc_op_failure: Processing failed op start for dlm:0 on apache-up002.ring0: not configured (6)
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: error: unpack_rsc_op: Preventing dlm-clone from re-starting anywhere: operation start failed 'not configured' (6)
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: info: native_print: scsi (stonith:fence_scsi): FAILED apache-up003.ring0
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: info: clone_print: Clone Set: dlm-clone [dlm]
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: info: short_print: Stopped: [ apache-up001.ring0 apache-up002.ring0 apache-up003.ring0 ]
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: info: clone_print: Clone Set: clvmd-clone [clvmd]
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: info: short_print: Stopped: [ apache-up001.ring0 apache-up002.ring0 apache-up003.ring0 ]
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: info: get_failcount_full: scsi has failed INFINITY times on apache-up001.ring0
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: warning: common_apply_stickiness: Forcing scsi away from apache-up001.ring0 after 1000000 failures (max=1000000)
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: info: get_failcount_full: scsi has failed INFINITY times on apache-up002.ring0
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: warning: common_apply_stickiness: Forcing scsi away from apache-up002.ring0 after 1000000 failures (max=1000000)
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: info: get_failcount_full: dlm:0 has failed INFINITY times on apache-up002.ring0
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: warning: common_apply_stickiness: Forcing dlm-clone away from apache-up002.ring0 after 1000000 failures (max=1000000)
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: info: get_failcount_full: dlm-clone has failed INFINITY times on apache-up002.ring0
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: warning: common_apply_stickiness: Forcing dlm-clone away from apache-up002.ring0 after 1000000 failures (max=1000000)
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: info: get_failcount_full: dlm-clone has failed INFINITY times on apache-up002.ring0
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: warning: common_apply_stickiness: Forcing dlm-clone away from apache-up002.ring0 after 1000000 failures (max=1000000)
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: info: get_failcount_full: scsi has failed INFINITY times on apache-up003.ring0
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: warning: common_apply_stickiness: Forcing scsi away from apache-up003.ring0 after 1000000 failures (max=1000000)
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: info: native_color: Resource scsi cannot run anywhere
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: info: rsc_merge_weights: dlm-clone: Rolling back scores from clvmd-clone
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: info: native_color: Resource dlm:0 cannot run anywhere
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: info: native_color: Resource dlm:1 cannot run anywhere
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: info: native_color: Resource dlm:2 cannot run anywhere
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: notice: clone_rsc_colocation_rh: Cannot pair clvmd:0 with instance of dlm-clone
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: info: native_color: Resource clvmd:0 cannot run anywhere
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: notice: clone_rsc_colocation_rh: Cannot pair clvmd:1 with instance of dlm-clone
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: info: native_color: Resource clvmd:1 cannot run anywhere
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: notice: clone_rsc_colocation_rh: Cannot pair clvmd:2 with instance of dlm-clone
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: info: native_color: Resource clvmd:2 cannot run anywhere
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: info: clone_update_actions_interleave: Inhibiting clvmd:0 from being active
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: info: clone_update_actions_interleave: Inhibiting clvmd:1 from being active
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: info: clone_update_actions_interleave: Inhibiting clvmd:2 from being active
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: info: clone_update_actions_interleave: Inhibiting clvmd:0 from being active
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: info: clone_update_actions_interleave: Inhibiting clvmd:1 from being active
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: info: clone_update_actions_interleave: Inhibiting clvmd:2 from being active
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: info: clone_update_actions_interleave: Inhibiting clvmd:0 from being active
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: info: clone_update_actions_interleave: Inhibiting clvmd:1 from being active
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: info: clone_update_actions_interleave: Inhibiting clvmd:2 from being active
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: info: clone_update_actions_interleave: Inhibiting clvmd:0 from being active
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: info: clone_update_actions_interleave: Inhibiting clvmd:1 from being active
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: info: clone_update_actions_interleave: Inhibiting clvmd:2 from being active
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: notice: LogActions: Stop scsi (apache-up003.ring0)
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: info: LogActions: Leave dlm:0 (Stopped)
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: info: LogActions: Leave dlm:1 (Stopped)
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: info: LogActions: Leave dlm:2 (Stopped)
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: info: LogActions: Leave clvmd:0 (Stopped)
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: info: LogActions: Leave clvmd:1 (Stopped)
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: info: LogActions: Leave clvmd:2 (Stopped)
> May 18 23:17:40 [2529] apache-up001.itc4u.local pengine: notice: process_pe_message: Calculated Transition 5: /var/lib/pacemaker/pengine/pe-input-127.bz2
> May 18 23:17:40 [2530] apache-up001.itc4u.local crmd: info: do_state_transition: State transition S_POLICY_ENGINE -> S_TRANSITION_ENGINE [ input=I_PE_SUCCESS cause=C_IPC_MESSAGE origin=handle_response ]
> May 18 23:17:40 [2530] apache-up001.itc4u.local crmd: info: do_te_invoke: Processing graph 5 (ref=pe_calc-dc-1463606260-60) derived from /var/lib/pacemaker/pengine/pe-input-127.bz2
> May 18 23:17:40 [2530] apache-up001.itc4u.local crmd: notice: te_rsc_command: Initiating action 1: stop scsi_stop_0 on apache-up003.ring0
> May 18 23:17:40 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.72.34 2
> May 18 23:17:40 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.72.35 (null)
> May 18 23:17:40 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=35
> May 18 23:17:40 [2525] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib/status/node_state[@id='3']/lrm[@id='3']/lrm_resources/lrm_resource[@id='scsi']/lrm_rsc_op[@id='scsi_last_0']: @operation_key=scsi_stop_0, @operation=stop, @transition-key=1:5:0:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d, @transition-magic=0:0;1:5:0:cf3499c5-bb9c-4d70-b209-80cc23a4ff1d, @call-id=17, @rc-code=0, @op-status=0, @last-run=1463606260, @last-rc-change=1463606260, @exec-time=1
> May 18 23:17:40 [2530] apache-up001.itc4u.local crmd: info: match_graph_event: Action scsi_stop_0 (1) confirmed on apache-up003.ring0 (rc=0)
> May 18 23:17:40 [2526] apache-up001.itc4u.local stonith-ng: debug: xml_patch_version_check: Can apply patch 0.72.35 to 0.72.34
> May 18 23:17:40 [2530] apache-up001.itc4u.local crmd: notice: run_graph: Transition 5 (Complete=2, Pending=0, Fired=0, Skipped=0, Incomplete=0, Source=/var/lib/pacemaker/pengine/pe-input-127.bz2): Complete
> May 18 23:17:40 [2530] apache-up001.itc4u.local crmd: info: do_log: FSA: Input I_TE_SUCCESS from notify_crmd() received in state S_TRANSITION_ENGINE
> May 18 23:17:40 [2530] apache-up001.itc4u.local crmd: notice: do_state_transition: State transition S_TRANSITION_ENGINE -> S_IDLE [ input=I_TE_SUCCESS cause=C_FSA_INTERNAL origin=notify_crmd ]
> May 18 23:17:40 [2525] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up003.ring0/crmd/19, version=0.72.35)
> May 18 23:17:45 [2525] apache-up001.itc4u.local cib: info: cib_process_ping: Reporting our current digest to apache-up001.ring0: 3a91351a8fb90a12be5aee1c4c586f78 for 0.72.35 (0x140f3f0 0)
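For reference, the sequence above is the standard fail-count escalation: each failed start sets fail-count-scsi to INFINITY on that node, and once the fail-count reaches migration-threshold (1000000, i.e. INFINITY, by default) the policy engine bans the resource there ("Forcing scsi away from ..."). After the third node fails, no eligible node remains and you get "Resource scsi cannot run anywhere". A toy sketch of that eligibility check (a hypothetical helper for illustration, not Pacemaker code):

```python
# Toy sketch of the policy engine's ban-on-failure logic: a node is
# ineligible once the resource's fail-count there reaches
# migration-threshold. Hypothetical illustration, not Pacemaker code.
INF = 1000000  # Pacemaker treats scores >= 1000000 as INFINITY

def eligible_nodes(fail_counts, migration_threshold=INF):
    """Return the nodes the resource may still start on."""
    return [node for node, count in fail_counts.items()
            if count < migration_threshold]

# After the log above, all three nodes carry fail-count = INFINITY:
fail_counts = {
    "apache-up001.ring0": INF,
    "apache-up002.ring0": INF,
    "apache-up003.ring0": INF,
}
print(eligible_nodes(fail_counts))  # -> [] : "cannot run anywhere"
```

Clearing the fail-counts (e.g. with `pcs resource cleanup scsi`) makes the nodes eligible again, but the underlying start failure has to be fixed first or the counts just climb back to INFINITY.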
>
>
>>
>>
>>>> Thanks
>>>>
>>>> Marco
>>>>
>>>>
>>>> May 18 10:37:03 apache-up001 crmd[15918]: notice: State transition S_IDLE -> S_POLICY_ENGINE [ input=I_PE_CALC cause=C_FSA_INTERNAL origin=abort_transition_graph ]
>>>> May 18 10:37:03 apache-up001 pengine[15917]: notice: On loss of CCM Quorum: Ignore
>>>> May 18 10:37:03 apache-up001 pengine[15917]: notice: Unfencing apache-up001.ring0: node discovery
>>>> May 18 10:37:03 apache-up001 pengine[15917]: notice: Unfencing apache-up002.ring0: node discovery
>>>> May 18 10:37:03 apache-up001 pengine[15917]: notice: Unfencing apache-up003.ring0: node discovery
>>>> May 18 10:37:03 apache-up001 pengine[15917]: notice: Start scsia#011(apache-up001.ring0)
>>>> May 18 10:37:03 apache-up001 pengine[15917]: notice: Calculated Transition 11: /var/lib/pacemaker/pengine/pe-input-95.bz2
>>>> May 18 10:37:03 apache-up001 crmd[15918]: notice: Executing on fencing operation (11) on apache-up003.ring0 (timeout=60000)
>>>> May 18 10:37:03 apache-up001 crmd[15918]: notice: Initiating action 9: probe_complete probe_complete-apache-up003.ring0 on apache-up003.ring0 - no waiting
>>>> May 18 10:37:03 apache-up001 crmd[15918]: notice: Executing on fencing operation (8) on apache-up002.ring0 (timeout=60000)
>>>> May 18 10:37:03 apache-up001 crmd[15918]: notice: Initiating action 6: probe_complete probe_complete-apache-up002.ring0 on apache-up002.ring0 - no waiting
>>>> May 18 10:37:03 apache-up001 crmd[15918]: notice: Executing on fencing operation (5) on apache-up001.ring0 (timeout=60000)
>>>> May 18 10:37:03 apache-up001 stonith-ng[15914]: notice: Client crmd.15918.697c495e wants to fence (on) 'apache-up003.ring0' with device '(any)'
>>>> May 18 10:37:03 apache-up001 stonith-ng[15914]: notice: Initiating remote operation on for apache-up003.ring0: 0599387e-0a30-4e1b-b641-adea5ba2a4ad (0)
>>>> May 18 10:37:03 apache-up001 stonith-ng[15914]: notice: Client crmd.15918.697c495e wants to fence (on) 'apache-up002.ring0' with device '(any)'
>>>> May 18 10:37:03 apache-up001 stonith-ng[15914]: notice: Initiating remote operation on for apache-up002.ring0: 76aba815-280e-491a-bd17-40776c8169e9 (0)
>>>> May 18 10:37:03 apache-up001 crmd[15918]: notice: Initiating action 3: probe_complete probe_complete-apache-up001.ring0 on apache-up001.ring0 (local) - no waiting
>>>> May 18 10:37:03 apache-up001 stonith-ng[15914]: notice: Client crmd.15918.697c495e wants to fence (on) 'apache-up001.ring0' with device '(any)'
>>>> May 18 10:37:03 apache-up001 stonith-ng[15914]: notice: Initiating remote operation on for apache-up001.ring0: e50d7e16-9578-4964-96a3-7b36bdcfba46 (0)
>>>> May 18 10:37:03 apache-up001 stonith-ng[15914]: notice: Couldn't find anyone to fence (on) apache-up003.ring0 with any device
>>>> May 18 10:37:03 apache-up001 stonith-ng[15914]: notice: Couldn't find anyone to fence (on) apache-up002.ring0 with any device
>>>> May 18 10:37:03 apache-up001 stonith-ng[15914]: error: Operation on of apache-up003.ring0 by <no-one> for crmd.15918 at apache-up001.ring0.0599387e: No such device
>>>> May 18 10:37:03 apache-up001 stonith-ng[15914]: error: Operation on of apache-up002.ring0 by <no-one> for crmd.15918 at apache-up001.ring0.76aba815: No such device
>>>> May 18 10:37:03 apache-up001 stonith-ng[15914]: notice: Couldn't find anyone to fence (on) apache-up001.ring0 with any device
>>>> May 18 10:37:03 apache-up001 crmd[15918]: notice: Stonith operation 5/11:11:0:8248cebf-c198-4ff2-bd43-7415533ce50f: No such device (-19)
>>>> May 18 10:37:03 apache-up001 stonith-ng[15914]: error: Operation on of apache-up001.ring0 by <no-one> for crmd.15918 at apache-up001.ring0.e50d7e16: No such device
>>>> May 18 10:37:03 apache-up001 crmd[15918]: notice: Stonith operation 5 for apache-up003.ring0 failed (No such device): aborting transition.
>>>> May 18 10:37:03 apache-up001 crmd[15918]: notice: Transition aborted: Stonith failed (source=tengine_stonith_callback:733, 0)
>>>> May 18 10:37:03 apache-up001 crmd[15918]: error: Unfencing of apache-up003.ring0 by <anyone> failed: No such device (-19)
>>>> May 18 10:37:03 apache-up001 crmd[15918]: notice: Stonith operation 6/8:11:0:8248cebf-c198-4ff2-bd43-7415533ce50f: No such device (-19)
>>>> May 18 10:37:03 apache-up001 crmd[15918]: notice: Stonith operation 6 for apache-up002.ring0 failed (No such device): aborting transition.
>>>> May 18 10:37:03 apache-up001 crmd[15918]: error: Unfencing of apache-up002.ring0 by <anyone> failed: No such device (-19)
>>>> May 18 10:37:03 apache-up001 crmd[15918]: notice: Stonith operation 7/5:11:0:8248cebf-c198-4ff2-bd43-7415533ce50f: No such device (-19)
>>>> May 18 10:37:03 apache-up001 crmd[15918]: notice: Stonith operation 7 for apache-up001.ring0 failed (No such device): aborting transition.
>>>> May 18 10:37:03 apache-up001 crmd[15918]: error: Unfencing of apache-up001.ring0 by <anyone> failed: No such device (-19)
>>>> May 18 10:37:03 apache-up001 crmd[15918]: notice: Initiating action 10: monitor scsia_monitor_0 on apache-up003.ring0
>>>> May 18 10:37:03 apache-up001 crmd[15918]: notice: Initiating action 7: monitor scsia_monitor_0 on apache-up002.ring0
>>>> May 18 10:37:03 apache-up001 crmd[15918]: notice: Initiating action 4: monitor scsia_monitor_0 on apache-up001.ring0 (local)
>>>> May 18 10:37:03 apache-up001 crmd[15918]: notice: Operation scsia_monitor_0: not running (node=apache-up001.ring0, call=19, rc=7, cib-update=59, confirmed=true)
>>>> May 18 10:37:03 apache-up001 crmd[15918]: notice: Transition 11 (Complete=10, Pending=0, Fired=0, Skipped=1, Incomplete=2, Source=/var/lib/pacemaker/pengine/pe-input-95.bz2): Stopped
>>>> May 18 10:37:03 apache-up001 crmd[15918]: notice: No devices found in cluster to fence apache-up001.ring0, giving up
>>>> May 18 10:37:03 apache-up001 crmd[15918]: notice: State transition S_TRANSITION_ENGINE -> S_IDLE [ input=I_TE_SUCCESS cause=C_FSA_INTERNAL origin=notify_crmd ]
>>>>
>>>>> On 16 May 2016, at 16:22, Ken Gaillot <kgaillot at redhat.com> wrote:
>>>>>
>>>>> On 05/14/2016 08:54 AM, Marco A. Carcano wrote:
>>>>>> I hope to find here someone who can help me:
>>>>>>
>>>>>> I have a 3 node cluster and I’m struggling to create a GFSv2 shared storage. The weird thing is that despite cluster seems OK, I’m not able to have the fence_scsi stonith device managed, and this prevent CLVMD and GFSv2 to start.
>>>>>>
>>>>>> I’m using CentOS 7.1, selinux and firewall disabled
>>>>>>
>>>>>> I created the stonith device with the following command
>>>>>>
>>>>>> pcs stonith create scsi fence_scsi pcmk_host_list="apache-up001.ring0 apache-up002.ring0 apache-up003.ring0 apache-up001.ring1 apache-up002.ring1 apache-up003.ring1"
>>>>>> pcmk_reboot_action="off" devices="/dev/mapper/36001405973e201b3fdb4a999175b942f" meta provides="unfencing" --force
>>>>>>
>>>>>> Notice that this is a 3-node cluster with a redundant ring: hosts with the .ring1 suffix are the same as those with the .ring0 suffix, but with a different IP address
>>>>>
>>>>> pcmk_host_list only needs the names of the nodes as specified in the
>>>>> Pacemaker configuration. It allows the cluster to answer the question,
>>>>> "What device can I use to fence this particular node?"
>>>>>
>>>>> Sometimes the fence device itself needs to identify the node by a
>>>>> different name than the one used by Pacemaker. In that case, use
>>>>> pcmk_host_map, which maps each cluster node name to a fence device node
>>>>> name.
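>>>>>
>>>>> As a sketch of the pcmk_host_map syntax (the device-side names here
>>>>> are hypothetical; each pair is "clustername:devicename", pairs
>>>>> separated by semicolons), you could update the existing device with:
>>>>>
>>>>> pcs stonith update scsi \
>>>>>     pcmk_host_map="apache-up001.ring0:apache-up001;apache-up002.ring0:apache-up002;apache-up003.ring0:apache-up003"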
>>>>>
>>>>> The one thing your command is missing is an "op monitor". I'm guessing
>>>>> that's why it required "--force" (which shouldn't be necessary) and why
>>>>> the cluster is treating it as unmanaged.
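>>>>>
>>>>> Something like this should avoid the need for --force (same device
>>>>> and host list as your command, just with the monitor operation
>>>>> added; an untested sketch, adjust to your setup):
>>>>>
>>>>> pcs stonith create scsi fence_scsi \
>>>>>     pcmk_host_list="apache-up001.ring0 apache-up002.ring0 apache-up003.ring0" \
>>>>>     devices="/dev/mapper/36001405973e201b3fdb4a999175b942f" \
>>>>>     meta provides="unfencing" \
>>>>>     op monitor interval=60s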
>>>>>
>>>>>> /dev/mapper/36001405973e201b3fdb4a999175b942f is a multipath device for /dev/sda and /dev/sdb
>>>>>>
>>>>>> in log files everything seems right. However pcs status reports the following:
>>>>>>
>>>>>> Cluster name: apache-0
>>>>>> Last updated: Sat May 14 15:35:56 2016 Last change: Sat May 14 15:18:17 2016 by root via cibadmin on apache-up001.ring0
>>>>>> Stack: corosync
>>>>>> Current DC: apache-up003.ring0 (version 1.1.13-10.el7_2.2-44eb2dd) - partition with quorum
>>>>>> 3 nodes and 7 resources configured
>>>>>>
>>>>>> Online: [ apache-up001.ring0 apache-up002.ring0 apache-up003.ring0 ]
>>>>>>
>>>>>> Full list of resources:
>>>>>>
>>>>>> scsi (stonith:fence_scsi): Stopped (unmanaged)
>>>>>>
>>>>>> PCSD Status:
>>>>>> apache-up001.ring0: Online
>>>>>> apache-up002.ring0: Online
>>>>>> apache-up003.ring0: Online
>>>>>>
>>>>>> Daemon Status:
>>>>>> corosync: active/enabled
>>>>>> pacemaker: active/enabled
>>>>>> pcsd: active/enabled
>>>>>>
>>>>>> However, SCSI fencing and persistent ID reservation seem right:
>>>>>>
>>>>>> sg_persist -n -i -r -d /dev/mapper/36001405973e201b3fdb4a999175b942f
>>>>>> PR generation=0x37, Reservation follows:
>>>>>> Key=0x9b0e0000
>>>>>> scope: LU_SCOPE, type: Write Exclusive, registrants only
>>>>>>
>>>>>> sg_persist -n -i -k -d /dev/mapper/36001405973e201b3fdb4a999175b942f
>>>>>> PR generation=0x37, 6 registered reservation keys follow:
>>>>>> 0x9b0e0000
>>>>>> 0x9b0e0000
>>>>>> 0x9b0e0001
>>>>>> 0x9b0e0001
>>>>>> 0x9b0e0002
>>>>>> 0x9b0e0002
>>>>>>
>>>>>> if I manually fence the second node:
>>>>>>
>>>>>> pcs stonith fence apache-up002.ring0
>>>>>>
>>>>>> I got as expected
>>>>>>
>>>>>> sg_persist -n -i -k -d /dev/mapper/36001405973e201b3fdb4a999175b942f
>>>>>> PR generation=0x38, 4 registered reservation keys follow:
>>>>>> 0x9b0e0000
>>>>>> 0x9b0e0000
>>>>>> 0x9b0e0002
>>>>>> 0x9b0e0002
>>>>>>
>>>>>> Cluster configuration seems OK
>>>>>>
>>>>>> crm_verify -L -V reports neither errors nor warnings,
>>>>>>
>>>>>> corosync-cfgtool -s
>>>>>>
>>>>>> Printing ring status.
>>>>>> Local node ID 1
>>>>>> RING ID 0
>>>>>> id = 192.168.15.9
>>>>>> status = ring 0 active with no faults
>>>>>> RING ID 1
>>>>>> id = 192.168.16.9
>>>>>> status = ring 1 active with no faults
>>>>>>
>>>>>> corosync-quorumtool -s
>>>>>>
>>>>>> Quorum information
>>>>>> ------------------
>>>>>> Date: Sat May 14 15:42:38 2016
>>>>>> Quorum provider: corosync_votequorum
>>>>>> Nodes: 3
>>>>>> Node ID: 1
>>>>>> Ring ID: 820
>>>>>> Quorate: Yes
>>>>>>
>>>>>> Votequorum information
>>>>>> ----------------------
>>>>>> Expected votes: 3
>>>>>> Highest expected: 3
>>>>>> Total votes: 3
>>>>>> Quorum: 2
>>>>>> Flags: Quorate
>>>>>>
>>>>>> Membership information
>>>>>> ----------------------
>>>>>> Nodeid Votes Name
>>>>>> 3 1 apache-up003.ring0
>>>>>> 2 1 apache-up002.ring0
>>>>>> 1 1 apache-up001.ring0 (local)
>>>>>>
>>>>>>
>>>>>> corosync-cmapctl | grep members
>>>>>> runtime.totem.pg.mrp.srp.members.1.config_version (u64) = 0
>>>>>> runtime.totem.pg.mrp.srp.members.1.ip (str) = r(0) ip(192.168.15.9) r(1) ip(192.168.16.9)
>>>>>> runtime.totem.pg.mrp.srp.members.1.join_count (u32) = 1
>>>>>> runtime.totem.pg.mrp.srp.members.1.status (str) = joined
>>>>>> runtime.totem.pg.mrp.srp.members.2.config_version (u64) = 0
>>>>>> runtime.totem.pg.mrp.srp.members.2.ip (str) = r(0) ip(192.168.15.8) r(1) ip(192.168.16.8)
>>>>>> runtime.totem.pg.mrp.srp.members.2.join_count (u32) = 1
>>>>>> runtime.totem.pg.mrp.srp.members.2.status (str) = joined
>>>>>> runtime.totem.pg.mrp.srp.members.3.config_version (u64) = 0
>>>>>> runtime.totem.pg.mrp.srp.members.3.ip (str) = r(0) ip(192.168.15.7) r(1) ip(192.168.16.7)
>>>>>> runtime.totem.pg.mrp.srp.members.3.join_count (u32) = 1
>>>>>> runtime.totem.pg.mrp.srp.members.3.status (str) = joined
>>>>>>
>>>>>> here are the logs at cluster start:
>>>>>>
>>>>>> pcs cluster start --all
>>>>>> apache-up003.ring0: Starting Cluster...
>>>>>> apache-up001.ring0: Starting Cluster...
>>>>>> apache-up002.ring0: Starting Cluster...
>>>>>>
>>>>>>
>>>>>> cat /var/log/messages
>>>>>> May 14 15:46:59 apache-up001 systemd: Starting Corosync Cluster Engine...
>>>>>> May 14 15:46:59 apache-up001 corosync[18934]: [MAIN ] Corosync Cluster Engine ('2.3.4'): started and ready to provide service.
>>>>>> May 14 15:46:59 apache-up001 corosync[18934]: [MAIN ] Corosync built-in features: dbus systemd xmlconf snmp pie relro bindnow
>>>>>> May 14 15:46:59 apache-up001 corosync[18935]: [TOTEM ] Initializing transport (UDP/IP Unicast).
>>>>>> May 14 15:46:59 apache-up001 corosync[18935]: [TOTEM ] Initializing transmit/receive security (NSS) crypto: none hash: none
>>>>>> May 14 15:46:59 apache-up001 corosync[18935]: [TOTEM ] Initializing transport (UDP/IP Unicast).
>>>>>> May 14 15:46:59 apache-up001 corosync[18935]: [TOTEM ] Initializing transmit/receive security (NSS) crypto: none hash: none
>>>>>> May 14 15:46:59 apache-up001 corosync[18935]: [TOTEM ] The network interface [192.168.15.9] is now up.
>>>>>> May 14 15:46:59 apache-up001 corosync[18935]: [SERV ] Service engine loaded: corosync configuration map access [0]
>>>>>> May 14 15:46:59 apache-up001 corosync[18935]: [QB ] server name: cmap
>>>>>> May 14 15:46:59 apache-up001 corosync[18935]: [SERV ] Service engine loaded: corosync configuration service [1]
>>>>>> May 14 15:46:59 apache-up001 corosync[18935]: [QB ] server name: cfg
>>>>>> May 14 15:46:59 apache-up001 corosync[18935]: [SERV ] Service engine loaded: corosync cluster closed process group service v1.01 [2]
>>>>>> May 14 15:46:59 apache-up001 corosync[18935]: [QB ] server name: cpg
>>>>>> May 14 15:46:59 apache-up001 corosync[18935]: [SERV ] Service engine loaded: corosync profile loading service [4]
>>>>>> May 14 15:46:59 apache-up001 corosync[18935]: [QUORUM] Using quorum provider corosync_votequorum
>>>>>> May 14 15:46:59 apache-up001 corosync[18935]: [SERV ] Service engine loaded: corosync vote quorum service v1.0 [5]
>>>>>> May 14 15:46:59 apache-up001 corosync[18935]: [QB ] server name: votequorum
>>>>>> May 14 15:46:59 apache-up001 corosync[18935]: [SERV ] Service engine loaded: corosync cluster quorum service v0.1 [3]
>>>>>> May 14 15:46:59 apache-up001 corosync[18935]: [QB ] server name: quorum
>>>>>> May 14 15:46:59 apache-up001 corosync[18935]: [TOTEM ] adding new UDPU member {192.168.15.9}
>>>>>> May 14 15:46:59 apache-up001 corosync[18935]: [TOTEM ] adding new UDPU member {192.168.15.8}
>>>>>> May 14 15:46:59 apache-up001 corosync[18935]: [TOTEM ] adding new UDPU member {192.168.15.7}
>>>>>> May 14 15:46:59 apache-up001 corosync[18935]: [TOTEM ] The network interface [192.168.16.9] is now up.
>>>>>> May 14 15:46:59 apache-up001 corosync[18935]: [TOTEM ] adding new UDPU member {192.168.16.9}
>>>>>> May 14 15:46:59 apache-up001 corosync[18935]: [TOTEM ] adding new UDPU member {192.168.16.8}
>>>>>> May 14 15:46:59 apache-up001 corosync[18935]: [TOTEM ] adding new UDPU member {192.168.16.7}
>>>>>> May 14 15:46:59 apache-up001 corosync[18935]: [TOTEM ] A new membership (192.168.15.9:824) was formed. Members joined: 1
>>>>>> May 14 15:46:59 apache-up001 corosync[18935]: [QUORUM] Members[1]: 1
>>>>>> May 14 15:46:59 apache-up001 corosync[18935]: [MAIN ] Completed service synchronization, ready to provide service.
>>>>>> May 14 15:46:59 apache-up001 corosync[18935]: [TOTEM ] A new membership (192.168.15.7:836) was formed. Members joined: 3 2
>>>>>> May 14 15:46:59 apache-up001 corosync[18935]: [QUORUM] This node is within the primary component and will provide service.
>>>>>> May 14 15:46:59 apache-up001 corosync[18935]: [QUORUM] Members[3]: 3 2 1
>>>>>> May 14 15:46:59 apache-up001 corosync[18935]: [MAIN ] Completed service synchronization, ready to provide service.
>>>>>> May 14 15:46:59 apache-up001 corosync: Starting Corosync Cluster Engine (corosync): [ OK ]
>>>>>> May 14 15:46:59 apache-up001 systemd: Started Corosync Cluster Engine.
>>>>>> May 14 15:46:59 apache-up001 pacemakerd[18950]: notice: Additional logging available in /var/log/pacemaker.log
>>>>>> May 14 15:46:59 apache-up001 systemd: Started Pacemaker High Availability Cluster Manager.
>>>>>> May 14 15:46:59 apache-up001 systemd: Starting Pacemaker High Availability Cluster Manager...
>>>>>> May 14 15:46:59 apache-up001 pacemakerd[18950]: notice: Switching to /var/log/cluster/corosync.log
>>>>>> May 14 15:46:59 apache-up001 pacemakerd[18950]: notice: Additional logging available in /var/log/cluster/corosync.log
>>>>>> May 14 15:46:59 apache-up001 pacemakerd[18950]: notice: Configured corosync to accept connections from group 189: OK (1)
>>>>>> May 14 15:46:59 apache-up001 pacemakerd[18950]: notice: Starting Pacemaker 1.1.13-10.el7_2.2 (Build: 44eb2dd): generated-manpages agent-manpages ncurses libqb-logging libqb-ipc upstart systemd nagios corosync-native atomic-attrd acls
>>>>>> May 14 15:46:59 apache-up001 pacemakerd[18950]: notice: Quorum acquired
>>>>>> May 14 15:46:59 apache-up001 pacemakerd[18950]: notice: pcmk_quorum_notification: Node apache-up003.ring0[3] - state is now member (was (null))
>>>>>> May 14 15:46:59 apache-up001 pacemakerd[18950]: notice: pcmk_quorum_notification: Node apache-up002.ring0[2] - state is now member (was (null))
>>>>>> May 14 15:46:59 apache-up001 pacemakerd[18950]: notice: pcmk_quorum_notification: Node apache-up001.ring0[1] - state is now member (was (null))
>>>>>> May 14 15:46:59 apache-up001 attrd[18954]: notice: Additional logging available in /var/log/cluster/corosync.log
>>>>>> May 14 15:46:59 apache-up001 attrd[18954]: notice: Connecting to cluster infrastructure: corosync
>>>>>> May 14 15:46:59 apache-up001 crmd[18956]: notice: Additional logging available in /var/log/cluster/corosync.log
>>>>>> May 14 15:46:59 apache-up001 crmd[18956]: notice: CRM Git Version: 1.1.13-10.el7_2.2 (44eb2dd)
>>>>>> May 14 15:46:59 apache-up001 cib[18951]: notice: Additional logging available in /var/log/cluster/corosync.log
>>>>>> May 14 15:46:59 apache-up001 pengine[18955]: notice: Additional logging available in /var/log/cluster/corosync.log
>>>>>> May 14 15:46:59 apache-up001 lrmd[18953]: notice: Additional logging available in /var/log/cluster/corosync.log
>>>>>> May 14 15:46:59 apache-up001 stonith-ng[18952]: notice: Additional logging available in /var/log/cluster/corosync.log
>>>>>> May 14 15:46:59 apache-up001 stonith-ng[18952]: notice: Connecting to cluster infrastructure: corosync
>>>>>> May 14 15:46:59 apache-up001 cib[18951]: notice: Connecting to cluster infrastructure: corosync
>>>>>> May 14 15:46:59 apache-up001 attrd[18954]: notice: crm_update_peer_proc: Node apache-up001.ring0[1] - state is now member (was (null))
>>>>>> May 14 15:46:59 apache-up001 stonith-ng[18952]: notice: crm_update_peer_proc: Node apache-up001.ring0[1] - state is now member (was (null))
>>>>>> May 14 15:46:59 apache-up001 cib[18951]: notice: crm_update_peer_proc: Node apache-up001.ring0[1] - state is now member (was (null))
>>>>>> May 14 15:46:59 apache-up001 cib[18951]: notice: crm_update_peer_proc: Node apache-up003.ring0[3] - state is now member (was (null))
>>>>>> May 14 15:46:59 apache-up001 cib[18951]: notice: crm_update_peer_proc: Node apache-up002.ring0[2] - state is now member (was (null))
>>>>>> May 14 15:47:00 apache-up001 crmd[18956]: notice: Connecting to cluster infrastructure: corosync
>>>>>> May 14 15:47:00 apache-up001 crmd[18956]: notice: Quorum acquired
>>>>>> May 14 15:47:00 apache-up001 crmd[18956]: notice: pcmk_quorum_notification: Node apache-up003.ring0[3] - state is now member (was (null))
>>>>>> May 14 15:47:00 apache-up001 crmd[18956]: notice: pcmk_quorum_notification: Node apache-up002.ring0[2] - state is now member (was (null))
>>>>>> May 14 15:47:00 apache-up001 crmd[18956]: notice: pcmk_quorum_notification: Node apache-up001.ring0[1] - state is now member (was (null))
>>>>>> May 14 15:47:00 apache-up001 crmd[18956]: notice: Notifications disabled
>>>>>> May 14 15:47:00 apache-up001 crmd[18956]: notice: The local CRM is operational
>>>>>> May 14 15:47:00 apache-up001 crmd[18956]: notice: State transition S_STARTING -> S_PENDING [ input=I_PENDING cause=C_FSA_INTERNAL origin=do_started ]
>>>>>> May 14 15:47:00 apache-up001 attrd[18954]: notice: crm_update_peer_proc: Node apache-up003.ring0[3] - state is now member (was (null))
>>>>>> May 14 15:47:00 apache-up001 stonith-ng[18952]: notice: Watching for stonith topology changes
>>>>>> May 14 15:47:00 apache-up001 attrd[18954]: notice: crm_update_peer_proc: Node apache-up002.ring0[2] - state is now member (was (null))
>>>>>> May 14 15:47:00 apache-up001 stonith-ng[18952]: notice: crm_update_peer_proc: Node apache-up002.ring0[2] - state is now member (was (null))
>>>>>> May 14 15:47:00 apache-up001 stonith-ng[18952]: notice: crm_update_peer_proc: Node apache-up003.ring0[3] - state is now member (was (null))
>>>>>> May 14 15:47:01 apache-up001 stonith-ng[18952]: notice: Added 'scsi' to the device list (1 active devices)
>>>>>> May 14 15:47:21 apache-up001 crmd[18956]: notice: State transition S_PENDING -> S_NOT_DC [ input=I_NOT_DC cause=C_HA_MESSAGE origin=do_cl_join_finalize_respond ]
>>>>>> May 14 15:47:22 apache-up001 stonith-ng[18952]: notice: scsi can fence (on) apache-up001.ring0: static-list
>>>>>> May 14 15:47:22 apache-up001 stonith-ng[18952]: notice: scsi can fence (on) apache-up001.ring0: static-list
>>>>>> May 14 15:47:22 apache-up001 kernel: sda: unknown partition table
>>>>>> May 14 15:47:22 apache-up001 kernel: sdb: unknown partition table
>>>>>> May 14 15:47:22 apache-up001 stonith-ng[18952]: notice: Operation on of apache-up003.ring0 by apache-up003.ring0 for crmd.15120 at apache-up002.ring0.44c5a0b6: OK
>>>>>> May 14 15:47:22 apache-up001 crmd[18956]: notice: apache-up003.ring0 was successfully unfenced by apache-up003.ring0 (at the request of apache-up002.ring0)
>>>>>> May 14 15:47:22 apache-up001 stonith-ng[18952]: notice: Operation on of apache-up002.ring0 by apache-up002.ring0 for crmd.15120 at apache-up002.ring0.e4b17672: OK
>>>>>> May 14 15:47:22 apache-up001 crmd[18956]: notice: apache-up002.ring0 was successfully unfenced by apache-up002.ring0 (at the request of apache-up002.ring0)
>>>>>> May 14 15:47:23 apache-up001 stonith-ng[18952]: notice: Operation 'on' [19052] (call 4 from crmd.15120) for host 'apache-up001.ring0' with device 'scsi' returned: 0 (OK)
>>>>>> May 14 15:47:23 apache-up001 stonith-ng[18952]: notice: Operation on of apache-up001.ring0 by apache-up001.ring0 for crmd.15120 at apache-up002.ring0.a682d19f: OK
>>>>>> May 14 15:47:23 apache-up001 crmd[18956]: notice: apache-up001.ring0 was successfully unfenced by apache-up001.ring0 (at the request of apache-up002.ring0)
>>>>>> May 14 15:47:23 apache-up001 systemd: Device dev-disk-by\x2did-scsi\x2d36001405973e201b3fdb4a999175b942f.device appeared twice with different sysfs paths /sys/devices/platform/host3/session2/target3:0:0/3:0:0:1/block/sda and /sys/devices/platform/host2/session1/target2:0:0/2:0:0:1/block/sdb
>>>>>> May 14 15:47:23 apache-up001 systemd: Device dev-disk-by\x2did-wwn\x2d0x6001405973e201b3fdb4a999175b942f.device appeared twice with different sysfs paths /sys/devices/platform/host3/session2/target3:0:0/3:0:0:1/block/sda and /sys/devices/platform/host2/session1/target2:0:0/2:0:0:1/block/sdb
>>>>>> May 14 15:47:25 apache-up001 crmd[18956]: notice: Operation scsi_monitor_0: not running (node=apache-up001.ring0, call=5, rc=7, cib-update=12, confirmed=true)
>>>>>>
>>>>>> cat /var/log/cluster/corosync.log
>>>>>> [18934] apache-up001.itc4u.local corosyncnotice [MAIN ] Corosync Cluster Engine ('2.3.4'): started and ready to provide service.
>>>>>> [18934] apache-up001.itc4u.local corosyncinfo [MAIN ] Corosync built-in features: dbus systemd xmlconf snmp pie relro bindnow
>>>>>> [18934] apache-up001.itc4u.local corosyncnotice [TOTEM ] Initializing transport (UDP/IP Unicast).
>>>>>> [18934] apache-up001.itc4u.local corosyncnotice [TOTEM ] Initializing transmit/receive security (NSS) crypto: none hash: none
>>>>>> [18934] apache-up001.itc4u.local corosyncnotice [TOTEM ] Initializing transport (UDP/IP Unicast).
>>>>>> [18934] apache-up001.itc4u.local corosyncnotice [TOTEM ] Initializing transmit/receive security (NSS) crypto: none hash: none
>>>>>> [18934] apache-up001.itc4u.local corosyncnotice [TOTEM ] The network interface [192.168.15.9] is now up.
>>>>>> [18934] apache-up001.itc4u.local corosyncnotice [SERV ] Service engine loaded: corosync configuration map access [0]
>>>>>> [18934] apache-up001.itc4u.local corosyncinfo [QB ] server name: cmap
>>>>>> [18934] apache-up001.itc4u.local corosyncnotice [SERV ] Service engine loaded: corosync configuration service [1]
>>>>>> [18934] apache-up001.itc4u.local corosyncinfo [QB ] server name: cfg
>>>>>> [18934] apache-up001.itc4u.local corosyncnotice [SERV ] Service engine loaded: corosync cluster closed process group service v1.01 [2]
>>>>>> [18934] apache-up001.itc4u.local corosyncinfo [QB ] server name: cpg
>>>>>> [18934] apache-up001.itc4u.local corosyncnotice [SERV ] Service engine loaded: corosync profile loading service [4]
>>>>>> [18934] apache-up001.itc4u.local corosyncnotice [QUORUM] Using quorum provider corosync_votequorum
>>>>>> [18934] apache-up001.itc4u.local corosyncnotice [SERV ] Service engine loaded: corosync vote quorum service v1.0 [5]
>>>>>> [18934] apache-up001.itc4u.local corosyncinfo [QB ] server name: votequorum
>>>>>> [18934] apache-up001.itc4u.local corosyncnotice [SERV ] Service engine loaded: corosync cluster quorum service v0.1 [3]
>>>>>> [18934] apache-up001.itc4u.local corosyncinfo [QB ] server name: quorum
>>>>>> [18934] apache-up001.itc4u.local corosyncnotice [TOTEM ] adding new UDPU member {192.168.15.9}
>>>>>> [18934] apache-up001.itc4u.local corosyncnotice [TOTEM ] adding new UDPU member {192.168.15.8}
>>>>>> [18934] apache-up001.itc4u.local corosyncnotice [TOTEM ] adding new UDPU member {192.168.15.7}
>>>>>> [18934] apache-up001.itc4u.local corosyncnotice [TOTEM ] The network interface [192.168.16.9] is now up.
>>>>>> [18934] apache-up001.itc4u.local corosyncnotice [TOTEM ] adding new UDPU member {192.168.16.9}
>>>>>> [18934] apache-up001.itc4u.local corosyncnotice [TOTEM ] adding new UDPU member {192.168.16.8}
>>>>>> [18934] apache-up001.itc4u.local corosyncnotice [TOTEM ] adding new UDPU member {192.168.16.7}
>>>>>> [18934] apache-up001.itc4u.local corosyncnotice [TOTEM ] A new membership (192.168.15.9:824) was formed. Members joined: 1
>>>>>> [18934] apache-up001.itc4u.local corosyncnotice [QUORUM] Members[1]: 1
>>>>>> [18934] apache-up001.itc4u.local corosyncnotice [MAIN ] Completed service synchronization, ready to provide service.
>>>>>> [18934] apache-up001.itc4u.local corosyncnotice [TOTEM ] A new membership (192.168.15.7:836) was formed. Members joined: 3 2
>>>>>> [18934] apache-up001.itc4u.local corosyncnotice [QUORUM] This node is within the primary component and will provide service.
>>>>>> [18934] apache-up001.itc4u.local corosyncnotice [QUORUM] Members[3]: 3 2 1
>>>>>> [18934] apache-up001.itc4u.local corosyncnotice [MAIN ] Completed service synchronization, ready to provide service.
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: notice: mcp_read_config: Configured corosync to accept connections from group 189: OK (1)
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: notice: main: Starting Pacemaker 1.1.13-10.el7_2.2 (Build: 44eb2dd): generated-manpages agent-manpages ncurses libqb-logging libqb-ipc upstart systemd nagios corosync-native atomic-attrd acls
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: info: main: Maximum core file size is: 18446744073709551615
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: info: qb_ipcs_us_publish: server name: pacemakerd
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: info: crm_get_peer: Created entry 2ce0a451-fca7-407d-82d6-cf16b2d9059e/0x1213720 for node apache-up001.ring0/1 (1 total)
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: info: crm_get_peer: Node 1 is now known as apache-up001.ring0
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: info: crm_get_peer: Node 1 has uuid 1
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: info: crm_update_peer_proc: cluster_connect_cpg: Node apache-up001.ring0[1] - corosync-cpg is now online
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: notice: cluster_connect_quorum: Quorum acquired
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: info: crm_get_peer: Created entry 9fc4b33e-ee75-4ebb-ab2e-e7ead18e083d/0x1214b80 for node apache-up002.ring0/2 (2 total)
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: info: crm_get_peer: Node 2 is now known as apache-up002.ring0
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: info: crm_get_peer: Node 2 has uuid 2
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: info: crm_get_peer: Created entry 54d08100-982e-42c3-b364-57017d8c2f14/0x1215070 for node apache-up003.ring0/3 (3 total)
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: info: crm_get_peer: Node 3 is now known as apache-up003.ring0
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: info: crm_get_peer: Node 3 has uuid 3
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: info: start_child: Using uid=189 and group=189 for process cib
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: info: start_child: Forked child 18951 for process cib
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: info: start_child: Forked child 18952 for process stonith-ng
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: info: start_child: Forked child 18953 for process lrmd
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: info: start_child: Using uid=189 and group=189 for process attrd
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: info: start_child: Forked child 18954 for process attrd
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: info: start_child: Using uid=189 and group=189 for process pengine
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: info: start_child: Forked child 18955 for process pengine
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: info: start_child: Using uid=189 and group=189 for process crmd
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: info: start_child: Forked child 18956 for process crmd
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: info: main: Starting mainloop
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: info: pcmk_quorum_notification: Membership 836: quorum retained (3)
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: notice: crm_update_peer_state_iter: pcmk_quorum_notification: Node apache-up003.ring0[3] - state is now member (was (null))
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: notice: crm_update_peer_state_iter: pcmk_quorum_notification: Node apache-up002.ring0[2] - state is now member (was (null))
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: notice: crm_update_peer_state_iter: pcmk_quorum_notification: Node apache-up001.ring0[1] - state is now member (was (null))
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: info: pcmk_cpg_membership: Node 1 joined group pacemakerd (counter=0.0)
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: info: pcmk_cpg_membership: Node 1 still member of group pacemakerd (peer=apache-up001.ring0, counter=0.0)
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: info: pcmk_cpg_membership: Node 3 still member of group pacemakerd (peer=apache-up003.ring0, counter=0.1)
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: info: crm_update_peer_proc: pcmk_cpg_membership: Node apache-up003.ring0[3] - corosync-cpg is now online
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: info: pcmk_cpg_membership: Node 2 joined group pacemakerd (counter=1.0)
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: info: pcmk_cpg_membership: Node 1 still member of group pacemakerd (peer=apache-up001.ring0, counter=1.0)
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: info: pcmk_cpg_membership: Node 2 still member of group pacemakerd (peer=apache-up002.ring0, counter=1.1)
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: info: crm_update_peer_proc: pcmk_cpg_membership: Node apache-up002.ring0[2] - corosync-cpg is now online
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: info: pcmk_cpg_membership: Node 3 still member of group pacemakerd (peer=apache-up003.ring0, counter=1.2)
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: info: mcp_cpg_deliver: Ignoring process list sent by peer for local node
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: info: mcp_cpg_deliver: Ignoring process list sent by peer for local node
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: info: mcp_cpg_deliver: Ignoring process list sent by peer for local node
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: info: mcp_cpg_deliver: Ignoring process list sent by peer for local node
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: info: mcp_cpg_deliver: Ignoring process list sent by peer for local node
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: info: mcp_cpg_deliver: Ignoring process list sent by peer for local node
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: info: mcp_cpg_deliver: Ignoring process list sent by peer for local node
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: info: mcp_cpg_deliver: Ignoring process list sent by peer for local node
>>>>>> May 14 15:46:59 [18954] apache-up001.itc4u.local attrd: info: crm_log_init: Changed active directory to /var/lib/pacemaker/cores/hacluster
>>>>>> May 14 15:46:59 [18954] apache-up001.itc4u.local attrd: info: main: Starting up
>>>>>> May 14 15:46:59 [18954] apache-up001.itc4u.local attrd: info: get_cluster_type: Verifying cluster type: 'corosync'
>>>>>> May 14 15:46:59 [18954] apache-up001.itc4u.local attrd: info: get_cluster_type: Assuming an active 'corosync' cluster
>>>>>> May 14 15:46:59 [18954] apache-up001.itc4u.local attrd: notice: crm_cluster_connect: Connecting to cluster infrastructure: corosync
>>>>>> May 14 15:46:59 [18956] apache-up001.itc4u.local crmd: info: crm_log_init: Changed active directory to /var/lib/pacemaker/cores/hacluster
>>>>>> May 14 15:46:59 [18956] apache-up001.itc4u.local crmd: notice: main: CRM Git Version: 1.1.13-10.el7_2.2 (44eb2dd)
>>>>>> May 14 15:46:59 [18956] apache-up001.itc4u.local crmd: info: do_log: FSA: Input I_STARTUP from crmd_init() received in state S_STARTING
>>>>>> May 14 15:46:59 [18956] apache-up001.itc4u.local crmd: info: get_cluster_type: Verifying cluster type: 'corosync'
>>>>>> May 14 15:46:59 [18956] apache-up001.itc4u.local crmd: info: get_cluster_type: Assuming an active 'corosync' cluster
>>>>>> May 14 15:46:59 [18951] apache-up001.itc4u.local cib: info: crm_log_init: Changed active directory to /var/lib/pacemaker/cores/hacluster
>>>>>> May 14 15:46:59 [18951] apache-up001.itc4u.local cib: info: get_cluster_type: Verifying cluster type: 'corosync'
>>>>>> May 14 15:46:59 [18951] apache-up001.itc4u.local cib: info: get_cluster_type: Assuming an active 'corosync' cluster
>>>>>> May 14 15:46:59 [18951] apache-up001.itc4u.local cib: info: retrieveCib: Reading cluster configuration file /var/lib/pacemaker/cib/cib.xml (digest: /var/lib/pacemaker/cib/cib.xml.sig)
>>>>>> May 14 15:46:59 [18951] apache-up001.itc4u.local cib: info: validate_with_relaxng: Creating RNG parser context
>>>>>> May 14 15:46:59 [18955] apache-up001.itc4u.local pengine: info: crm_log_init: Changed active directory to /var/lib/pacemaker/cores/hacluster
>>>>>> May 14 15:46:59 [18955] apache-up001.itc4u.local pengine: info: qb_ipcs_us_publish: server name: pengine
>>>>>> May 14 15:46:59 [18955] apache-up001.itc4u.local pengine: info: main: Starting pengine
>>>>>> May 14 15:46:59 [18953] apache-up001.itc4u.local lrmd: info: crm_log_init: Changed active directory to /var/lib/pacemaker/cores/root
>>>>>> May 14 15:46:59 [18953] apache-up001.itc4u.local lrmd: info: qb_ipcs_us_publish: server name: lrmd
>>>>>> May 14 15:46:59 [18953] apache-up001.itc4u.local lrmd: info: main: Starting
>>>>>> May 14 15:46:59 [18952] apache-up001.itc4u.local stonith-ng: info: crm_log_init: Changed active directory to /var/lib/pacemaker/cores/root
>>>>>> May 14 15:46:59 [18952] apache-up001.itc4u.local stonith-ng: info: get_cluster_type: Verifying cluster type: 'corosync'
>>>>>> May 14 15:46:59 [18952] apache-up001.itc4u.local stonith-ng: info: get_cluster_type: Assuming an active 'corosync' cluster
>>>>>> May 14 15:46:59 [18952] apache-up001.itc4u.local stonith-ng: notice: crm_cluster_connect: Connecting to cluster infrastructure: corosync
>>>>>> May 14 15:46:59 [18954] apache-up001.itc4u.local attrd: info: crm_get_peer: Created entry 2e6a7f6f-877e-4eba-93dc-7e2f13a48c31/0x8b1cd0 for node apache-up001.ring0/1 (1 total)
>>>>>> May 14 15:46:59 [18954] apache-up001.itc4u.local attrd: info: crm_get_peer: Node 1 is now known as apache-up001.ring0
>>>>>> May 14 15:46:59 [18951] apache-up001.itc4u.local cib: info: startCib: CIB Initialization completed successfully
>>>>>> May 14 15:46:59 [18951] apache-up001.itc4u.local cib: notice: crm_cluster_connect: Connecting to cluster infrastructure: corosync
>>>>>> May 14 15:46:59 [18954] apache-up001.itc4u.local attrd: info: crm_get_peer: Node 1 has uuid 1
>>>>>> May 14 15:46:59 [18954] apache-up001.itc4u.local attrd: info: crm_update_peer_proc: cluster_connect_cpg: Node apache-up001.ring0[1] - corosync-cpg is now online
>>>>>> May 14 15:46:59 [18954] apache-up001.itc4u.local attrd: notice: crm_update_peer_state_iter: crm_update_peer_proc: Node apache-up001.ring0[1] - state is now member (was (null))
>>>>>> May 14 15:46:59 [18954] apache-up001.itc4u.local attrd: info: init_cs_connection_once: Connection to 'corosync': established
>>>>>> May 14 15:46:59 [18952] apache-up001.itc4u.local stonith-ng: info: crm_get_peer: Created entry ed676779-f16b-4ebe-8bf2-a80c08001e4b/0x22e71d0 for node apache-up001.ring0/1 (1 total)
>>>>>> May 14 15:46:59 [18952] apache-up001.itc4u.local stonith-ng: info: crm_get_peer: Node 1 is now known as apache-up001.ring0
>>>>>> May 14 15:46:59 [18954] apache-up001.itc4u.local attrd: info: main: Cluster connection active
>>>>>> May 14 15:46:59 [18954] apache-up001.itc4u.local attrd: info: qb_ipcs_us_publish: server name: attrd
>>>>>> May 14 15:46:59 [18954] apache-up001.itc4u.local attrd: info: main: Accepting attribute updates
>>>>>> May 14 15:46:59 [18951] apache-up001.itc4u.local cib: info: crm_get_peer: Created entry 5ca16226-9bac-40aa-910f-b1825e1f505b/0x1828af0 for node apache-up001.ring0/1 (1 total)
>>>>>> May 14 15:46:59 [18951] apache-up001.itc4u.local cib: info: crm_get_peer: Node 1 is now known as apache-up001.ring0
>>>>>> May 14 15:46:59 [18952] apache-up001.itc4u.local stonith-ng: info: crm_get_peer: Node 1 has uuid 1
>>>>>> May 14 15:46:59 [18952] apache-up001.itc4u.local stonith-ng: info: crm_update_peer_proc: cluster_connect_cpg: Node apache-up001.ring0[1] - corosync-cpg is now online
>>>>>> May 14 15:46:59 [18952] apache-up001.itc4u.local stonith-ng: notice: crm_update_peer_state_iter: crm_update_peer_proc: Node apache-up001.ring0[1] - state is now member (was (null))
>>>>>> May 14 15:46:59 [18952] apache-up001.itc4u.local stonith-ng: info: init_cs_connection_once: Connection to 'corosync': established
>>>>>> May 14 15:46:59 [18951] apache-up001.itc4u.local cib: info: crm_get_peer: Node 1 has uuid 1
>>>>>> May 14 15:46:59 [18951] apache-up001.itc4u.local cib: info: crm_update_peer_proc: cluster_connect_cpg: Node apache-up001.ring0[1] - corosync-cpg is now online
>>>>>> May 14 15:46:59 [18951] apache-up001.itc4u.local cib: notice: crm_update_peer_state_iter: crm_update_peer_proc: Node apache-up001.ring0[1] - state is now member (was (null))
>>>>>> May 14 15:46:59 [18951] apache-up001.itc4u.local cib: info: init_cs_connection_once: Connection to 'corosync': established
>>>>>> May 14 15:46:59 [18951] apache-up001.itc4u.local cib: info: qb_ipcs_us_publish: server name: cib_ro
>>>>>> May 14 15:46:59 [18951] apache-up001.itc4u.local cib: info: qb_ipcs_us_publish: server name: cib_rw
>>>>>> May 14 15:46:59 [18951] apache-up001.itc4u.local cib: info: qb_ipcs_us_publish: server name: cib_shm
>>>>>> May 14 15:46:59 [18951] apache-up001.itc4u.local cib: info: cib_init: Starting cib mainloop
>>>>>> May 14 15:46:59 [18951] apache-up001.itc4u.local cib: info: pcmk_cpg_membership: Node 1 joined group cib (counter=0.0)
>>>>>> May 14 15:46:59 [18951] apache-up001.itc4u.local cib: info: pcmk_cpg_membership: Node 1 still member of group cib (peer=apache-up001.ring0, counter=0.0)
>>>>>> May 14 15:46:59 [18951] apache-up001.itc4u.local cib: info: crm_get_peer: Created entry 32607e56-5e7f-42d6-91c9-3d9ee2fa152f/0x182b820 for node apache-up003.ring0/3 (2 total)
>>>>>> May 14 15:46:59 [18951] apache-up001.itc4u.local cib: info: crm_get_peer: Node 3 is now known as apache-up003.ring0
>>>>>> May 14 15:46:59 [18951] apache-up001.itc4u.local cib: info: crm_get_peer: Node 3 has uuid 3
>>>>>> May 14 15:46:59 [18951] apache-up001.itc4u.local cib: info: pcmk_cpg_membership: Node 3 still member of group cib (peer=apache-up003.ring0, counter=0.1)
>>>>>> May 14 15:46:59 [18951] apache-up001.itc4u.local cib: info: crm_update_peer_proc: pcmk_cpg_membership: Node apache-up003.ring0[3] - corosync-cpg is now online
>>>>>> May 14 15:46:59 [18951] apache-up001.itc4u.local cib: notice: crm_update_peer_state_iter: crm_update_peer_proc: Node apache-up003.ring0[3] - state is now member (was (null))
>>>>>> May 14 15:46:59 [18951] apache-up001.itc4u.local cib: info: pcmk_cpg_membership: Node 2 joined group cib (counter=1.0)
>>>>>> May 14 15:46:59 [18951] apache-up001.itc4u.local cib: info: pcmk_cpg_membership: Node 1 still member of group cib (peer=apache-up001.ring0, counter=1.0)
>>>>>> May 14 15:46:59 [18951] apache-up001.itc4u.local cib: info: crm_get_peer: Created entry 9d8bed61-2324-42b4-8010-5b2736c21534/0x182b910 for node apache-up002.ring0/2 (3 total)
>>>>>> May 14 15:46:59 [18951] apache-up001.itc4u.local cib: info: crm_get_peer: Node 2 is now known as apache-up002.ring0
>>>>>> May 14 15:46:59 [18951] apache-up001.itc4u.local cib: info: crm_get_peer: Node 2 has uuid 2
>>>>>> May 14 15:46:59 [18951] apache-up001.itc4u.local cib: info: pcmk_cpg_membership: Node 2 still member of group cib (peer=apache-up002.ring0, counter=1.1)
>>>>>> May 14 15:46:59 [18951] apache-up001.itc4u.local cib: info: crm_update_peer_proc: pcmk_cpg_membership: Node apache-up002.ring0[2] - corosync-cpg is now online
>>>>>> May 14 15:46:59 [18951] apache-up001.itc4u.local cib: notice: crm_update_peer_state_iter: crm_update_peer_proc: Node apache-up002.ring0[2] - state is now member (was (null))
>>>>>> May 14 15:46:59 [18951] apache-up001.itc4u.local cib: info: pcmk_cpg_membership: Node 3 still member of group cib (peer=apache-up003.ring0, counter=1.2)
>>>>>> May 14 15:46:59 [18951] apache-up001.itc4u.local cib: info: cib_file_backup: Archived previous version as /var/lib/pacemaker/cib/cib-69.raw
>>>>>> May 14 15:46:59 [18951] apache-up001.itc4u.local cib: info: cib_file_write_with_digest: Wrote version 0.98.0 of the CIB to disk (digest: 262eb42d23bff917f27a0914467d7218)
>>>>>> May 14 15:46:59 [18951] apache-up001.itc4u.local cib: info: cib_file_write_with_digest: Reading cluster configuration file /var/lib/pacemaker/cib/cib.8fxTts (digest: /var/lib/pacemaker/cib/cib.Ls9hSP)
>>>>>> May 14 15:47:00 [18956] apache-up001.itc4u.local crmd: info: do_cib_control: CIB connection established
>>>>>> May 14 15:47:00 [18956] apache-up001.itc4u.local crmd: notice: crm_cluster_connect: Connecting to cluster infrastructure: corosync
>>>>>> May 14 15:47:00 [18956] apache-up001.itc4u.local crmd: info: crm_get_peer: Created entry 0e0a8dc1-17df-42f7-83f7-55fbee944173/0x24e2c20 for node apache-up001.ring0/1 (1 total)
>>>>>> May 14 15:47:00 [18956] apache-up001.itc4u.local crmd: info: crm_get_peer: Node 1 is now known as apache-up001.ring0
>>>>>> May 14 15:47:00 [18956] apache-up001.itc4u.local crmd: info: peer_update_callback: apache-up001.ring0 is now in unknown state
>>>>>> May 14 15:47:00 [18956] apache-up001.itc4u.local crmd: info: crm_get_peer: Node 1 has uuid 1
>>>>>> May 14 15:47:00 [18956] apache-up001.itc4u.local crmd: info: crm_update_peer_proc: cluster_connect_cpg: Node apache-up001.ring0[1] - corosync-cpg is now online
>>>>>> May 14 15:47:00 [18956] apache-up001.itc4u.local crmd: info: peer_update_callback: Client apache-up001.ring0/peer now has status [online] (DC=<null>, changed=4000000)
>>>>>> May 14 15:47:00 [18956] apache-up001.itc4u.local crmd: info: init_cs_connection_once: Connection to 'corosync': established
>>>>>> May 14 15:47:00 [18956] apache-up001.itc4u.local crmd: notice: cluster_connect_quorum: Quorum acquired
>>>>>> May 14 15:47:00 [18956] apache-up001.itc4u.local crmd: info: crm_get_peer: Created entry 5476fb11-1e94-4908-92e5-d27a3e5a29b2/0x24e5130 for node apache-up002.ring0/2 (2 total)
>>>>>> May 14 15:47:00 [18956] apache-up001.itc4u.local crmd: info: crm_get_peer: Node 2 is now known as apache-up002.ring0
>>>>>> May 14 15:47:00 [18956] apache-up001.itc4u.local crmd: info: peer_update_callback: apache-up002.ring0 is now in unknown state
>>>>>> May 14 15:47:00 [18956] apache-up001.itc4u.local crmd: info: crm_get_peer: Node 2 has uuid 2
>>>>>> May 14 15:47:00 [18956] apache-up001.itc4u.local crmd: info: crm_get_peer: Created entry c50be947-f189-4c64-a7a2-523593eafac8/0x24e5390 for node apache-up003.ring0/3 (3 total)
>>>>>> May 14 15:47:00 [18956] apache-up001.itc4u.local crmd: info: crm_get_peer: Node 3 is now known as apache-up003.ring0
>>>>>> May 14 15:47:00 [18956] apache-up001.itc4u.local crmd: info: peer_update_callback: apache-up003.ring0 is now in unknown state
>>>>>> May 14 15:47:00 [18956] apache-up001.itc4u.local crmd: info: crm_get_peer: Node 3 has uuid 3
>>>>>> May 14 15:47:00 [18951] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section nodes: OK (rc=0, origin=apache-up003.ring0/crmd/6, version=0.98.0)
>>>>>> May 14 15:47:00 [18956] apache-up001.itc4u.local crmd: info: do_ha_control: Connected to the cluster
>>>>>> May 14 15:47:00 [18956] apache-up001.itc4u.local crmd: info: lrmd_ipc_connect: Connecting to lrmd
>>>>>> May 14 15:47:00 [18956] apache-up001.itc4u.local crmd: info: do_lrm_control: LRM connection established
>>>>>> May 14 15:47:00 [18956] apache-up001.itc4u.local crmd: info: do_started: Delaying start, no membership data (0000000000100000)
>>>>>> May 14 15:47:00 [18956] apache-up001.itc4u.local crmd: info: do_started: Delaying start, no membership data (0000000000100000)
>>>>>> May 14 15:47:00 [18956] apache-up001.itc4u.local crmd: info: pcmk_quorum_notification: Membership 836: quorum retained (3)
>>>>>> May 14 15:47:00 [18956] apache-up001.itc4u.local crmd: notice: crm_update_peer_state_iter: pcmk_quorum_notification: Node apache-up003.ring0[3] - state is now member (was (null))
>>>>>> May 14 15:47:00 [18956] apache-up001.itc4u.local crmd: info: peer_update_callback: apache-up003.ring0 is now member (was in unknown state)
>>>>>> May 14 15:47:00 [18956] apache-up001.itc4u.local crmd: notice: crm_update_peer_state_iter: pcmk_quorum_notification: Node apache-up002.ring0[2] - state is now member (was (null))
>>>>>> May 14 15:47:00 [18956] apache-up001.itc4u.local crmd: info: peer_update_callback: apache-up002.ring0 is now member (was in unknown state)
>>>>>> May 14 15:47:00 [18956] apache-up001.itc4u.local crmd: notice: crm_update_peer_state_iter: pcmk_quorum_notification: Node apache-up001.ring0[1] - state is now member (was (null))
>>>>>> May 14 15:47:00 [18956] apache-up001.itc4u.local crmd: info: peer_update_callback: apache-up001.ring0 is now member (was in unknown state)
>>>>>> May 14 15:47:00 [18951] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section nodes to master (origin=local/crmd/6)
>>>>>> May 14 15:47:00 [18951] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section nodes: OK (rc=0, origin=apache-up001.ring0/crmd/6, version=0.98.0)
>>>>>> May 14 15:47:00 [18954] apache-up001.itc4u.local attrd: info: attrd_cib_connect: Connected to the CIB after 2 attempts
>>>>>> May 14 15:47:00 [18954] apache-up001.itc4u.local attrd: info: main: CIB connection active
>>>>>> May 14 15:47:00 [18954] apache-up001.itc4u.local attrd: info: pcmk_cpg_membership: Node 1 joined group attrd (counter=0.0)
>>>>>> May 14 15:47:00 [18954] apache-up001.itc4u.local attrd: info: pcmk_cpg_membership: Node 1 still member of group attrd (peer=apache-up001.ring0, counter=0.0)
>>>>>> May 14 15:47:00 [18956] apache-up001.itc4u.local crmd: info: do_started: Delaying start, Config not read (0000000000000040)
>>>>>> May 14 15:47:00 [18956] apache-up001.itc4u.local crmd: notice: crmd_enable_notifications: Notifications disabled
>>>>>> May 14 15:47:00 [18956] apache-up001.itc4u.local crmd: info: qb_ipcs_us_publish: server name: crmd
>>>>>> May 14 15:47:00 [18956] apache-up001.itc4u.local crmd: notice: do_started: The local CRM is operational
>>>>>> May 14 15:47:00 [18956] apache-up001.itc4u.local crmd: info: do_log: FSA: Input I_PENDING from do_started() received in state S_STARTING
>>>>>> May 14 15:47:00 [18956] apache-up001.itc4u.local crmd: notice: do_state_transition: State transition S_STARTING -> S_PENDING [ input=I_PENDING cause=C_FSA_INTERNAL origin=do_started ]
>>>>>> May 14 15:47:00 [18954] apache-up001.itc4u.local attrd: info: crm_get_peer: Created entry eb16028a-9e18-4b07-b9bf-d29dc04177bd/0x8b4450 for node apache-up003.ring0/3 (2 total)
>>>>>> May 14 15:47:00 [18954] apache-up001.itc4u.local attrd: info: crm_get_peer: Node 3 is now known as apache-up003.ring0
>>>>>> May 14 15:47:00 [18954] apache-up001.itc4u.local attrd: info: crm_get_peer: Node 3 has uuid 3
>>>>>> May 14 15:47:00 [18954] apache-up001.itc4u.local attrd: info: pcmk_cpg_membership: Node 3 still member of group attrd (peer=apache-up003.ring0, counter=0.1)
>>>>>> May 14 15:47:00 [18954] apache-up001.itc4u.local attrd: info: crm_update_peer_proc: pcmk_cpg_membership: Node apache-up003.ring0[3] - corosync-cpg is now online
>>>>>> May 14 15:47:00 [18954] apache-up001.itc4u.local attrd: notice: crm_update_peer_state_iter: crm_update_peer_proc: Node apache-up003.ring0[3] - state is now member (was (null))
>>>>>> May 14 15:47:00 [18954] apache-up001.itc4u.local attrd: info: pcmk_cpg_membership: Node 2 joined group attrd (counter=1.0)
>>>>>> May 14 15:47:00 [18954] apache-up001.itc4u.local attrd: info: pcmk_cpg_membership: Node 1 still member of group attrd (peer=apache-up001.ring0, counter=1.0)
>>>>>> May 14 15:47:00 [18952] apache-up001.itc4u.local stonith-ng: notice: setup_cib: Watching for stonith topology changes
>>>>>> May 14 15:47:00 [18952] apache-up001.itc4u.local stonith-ng: info: qb_ipcs_us_publish: server name: stonith-ng
>>>>>> May 14 15:47:00 [18952] apache-up001.itc4u.local stonith-ng: info: main: Starting stonith-ng mainloop
>>>>>> May 14 15:47:00 [18952] apache-up001.itc4u.local stonith-ng: info: pcmk_cpg_membership: Node 1 joined group stonith-ng (counter=0.0)
>>>>>> May 14 15:47:00 [18952] apache-up001.itc4u.local stonith-ng: info: pcmk_cpg_membership: Node 1 still member of group stonith-ng (peer=apache-up001.ring0, counter=0.0)
>>>>>> May 14 15:47:00 [18954] apache-up001.itc4u.local attrd: info: crm_get_peer: Created entry 8e6e690d-b0f3-4894-8ae2-663543a34c55/0x8b4ec0 for node apache-up002.ring0/2 (3 total)
>>>>>> May 14 15:47:00 [18954] apache-up001.itc4u.local attrd: info: crm_get_peer: Node 2 is now known as apache-up002.ring0
>>>>>> May 14 15:47:00 [18954] apache-up001.itc4u.local attrd: info: crm_get_peer: Node 2 has uuid 2
>>>>>> May 14 15:47:00 [18954] apache-up001.itc4u.local attrd: info: pcmk_cpg_membership: Node 2 still member of group attrd (peer=apache-up002.ring0, counter=1.1)
>>>>>> May 14 15:47:00 [18954] apache-up001.itc4u.local attrd: info: crm_update_peer_proc: pcmk_cpg_membership: Node apache-up002.ring0[2] - corosync-cpg is now online
>>>>>> May 14 15:47:00 [18954] apache-up001.itc4u.local attrd: notice: crm_update_peer_state_iter: crm_update_peer_proc: Node apache-up002.ring0[2] - state is now member (was (null))
>>>>>> May 14 15:47:00 [18954] apache-up001.itc4u.local attrd: info: pcmk_cpg_membership: Node 3 still member of group attrd (peer=apache-up003.ring0, counter=1.2)
>>>>>> May 14 15:47:00 [18952] apache-up001.itc4u.local stonith-ng: info: crm_get_peer: Created entry 51a26c10-f43f-4b9b-a7a5-71049ebacdf0/0x22e88c0 for node apache-up002.ring0/2 (2 total)
>>>>>> May 14 15:47:00 [18952] apache-up001.itc4u.local stonith-ng: info: crm_get_peer: Node 2 is now known as apache-up002.ring0
>>>>>> May 14 15:47:00 [18952] apache-up001.itc4u.local stonith-ng: info: crm_get_peer: Node 2 has uuid 2
>>>>>> May 14 15:47:00 [18952] apache-up001.itc4u.local stonith-ng: info: pcmk_cpg_membership: Node 2 still member of group stonith-ng (peer=apache-up002.ring0, counter=0.1)
>>>>>> May 14 15:47:00 [18952] apache-up001.itc4u.local stonith-ng: info: crm_update_peer_proc: pcmk_cpg_membership: Node apache-up002.ring0[2] - corosync-cpg is now online
>>>>>> May 14 15:47:00 [18952] apache-up001.itc4u.local stonith-ng: notice: crm_update_peer_state_iter: crm_update_peer_proc: Node apache-up002.ring0[2] - state is now member (was (null))
>>>>>> May 14 15:47:00 [18952] apache-up001.itc4u.local stonith-ng: info: crm_get_peer: Created entry 9fd39304-24fa-48bd-899e-8d33c3994ecf/0x22e8a10 for node apache-up003.ring0/3 (3 total)
>>>>>> May 14 15:47:00 [18952] apache-up001.itc4u.local stonith-ng: info: crm_get_peer: Node 3 is now known as apache-up003.ring0
>>>>>> May 14 15:47:00 [18952] apache-up001.itc4u.local stonith-ng: info: crm_get_peer: Node 3 has uuid 3
>>>>>> May 14 15:47:00 [18952] apache-up001.itc4u.local stonith-ng: info: pcmk_cpg_membership: Node 3 still member of group stonith-ng (peer=apache-up003.ring0, counter=0.2)
>>>>>> May 14 15:47:00 [18952] apache-up001.itc4u.local stonith-ng: info: crm_update_peer_proc: pcmk_cpg_membership: Node apache-up003.ring0[3] - corosync-cpg is now online
>>>>>> May 14 15:47:00 [18952] apache-up001.itc4u.local stonith-ng: notice: crm_update_peer_state_iter: crm_update_peer_proc: Node apache-up003.ring0[3] - state is now member (was (null))
>>>>>> May 14 15:47:00 [18952] apache-up001.itc4u.local stonith-ng: info: init_cib_cache_cb: Updating device list from the cib: init
>>>>>> May 14 15:47:00 [18952] apache-up001.itc4u.local stonith-ng: info: cib_devices_update: Updating devices to version 0.98.0
>>>>>> May 14 15:47:00 [18951] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section nodes: OK (rc=0, origin=apache-up002.ring0/crmd/6, version=0.98.0)
>>>>>> May 14 15:47:01 [18956] apache-up001.itc4u.local crmd: info: pcmk_cpg_membership: Node 1 joined group crmd (counter=0.0)
>>>>>> May 14 15:47:01 [18956] apache-up001.itc4u.local crmd: info: pcmk_cpg_membership: Node 1 still member of group crmd (peer=apache-up001.ring0, counter=0.0)
>>>>>> May 14 15:47:01 [18956] apache-up001.itc4u.local crmd: info: pcmk_cpg_membership: Node 3 still member of group crmd (peer=apache-up003.ring0, counter=0.1)
>>>>>> May 14 15:47:01 [18956] apache-up001.itc4u.local crmd: info: crm_update_peer_proc: pcmk_cpg_membership: Node apache-up003.ring0[3] - corosync-cpg is now online
>>>>>> May 14 15:47:01 [18956] apache-up001.itc4u.local crmd: info: peer_update_callback: Client apache-up003.ring0/peer now has status [online] (DC=<null>, changed=4000000)
>>>>>> May 14 15:47:01 [18956] apache-up001.itc4u.local crmd: info: pcmk_cpg_membership: Node 2 joined group crmd (counter=1.0)
>>>>>> May 14 15:47:01 [18956] apache-up001.itc4u.local crmd: info: pcmk_cpg_membership: Node 1 still member of group crmd (peer=apache-up001.ring0, counter=1.0)
>>>>>> May 14 15:47:01 [18956] apache-up001.itc4u.local crmd: info: pcmk_cpg_membership: Node 2 still member of group crmd (peer=apache-up002.ring0, counter=1.1)
>>>>>> May 14 15:47:01 [18956] apache-up001.itc4u.local crmd: info: crm_update_peer_proc: pcmk_cpg_membership: Node apache-up002.ring0[2] - corosync-cpg is now online
>>>>>> May 14 15:47:01 [18956] apache-up001.itc4u.local crmd: info: peer_update_callback: Client apache-up002.ring0/peer now has status [online] (DC=<null>, changed=4000000)
>>>>>> May 14 15:47:01 [18956] apache-up001.itc4u.local crmd: info: pcmk_cpg_membership: Node 3 still member of group crmd (peer=apache-up003.ring0, counter=1.2)
>>>>>> May 14 15:47:01 [18952] apache-up001.itc4u.local stonith-ng: info: build_device_from_xml: The fencing device 'scsi' requires unfencing
>>>>>> May 14 15:47:01 [18952] apache-up001.itc4u.local stonith-ng: info: build_device_from_xml: The fencing device 'scsi' requires actions (on) to be executed on the target node
>>>>>> May 14 15:47:01 [18952] apache-up001.itc4u.local stonith-ng: notice: stonith_device_register: Added 'scsi' to the device list (1 active devices)
>>>>>> May 14 15:47:21 [18956] apache-up001.itc4u.local crmd: info: election_count_vote: Election 1 (owner: 3) lost: vote from apache-up003.ring0 (Uptime)
>>>>>> May 14 15:47:21 [18956] apache-up001.itc4u.local crmd: info: do_log: FSA: Input I_PENDING from do_election_count_vote() received in state S_PENDING
>>>>>> May 14 15:47:21 [18956] apache-up001.itc4u.local crmd: info: election_count_vote: Election 1 (owner: 2) lost: vote from apache-up002.ring0 (Uptime)
>>>>>> May 14 15:47:21 [18956] apache-up001.itc4u.local crmd: info: do_log: FSA: Input I_PENDING from do_election_count_vote() received in state S_PENDING
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section cib: OK (rc=0, origin=apache-up002.ring0/crmd/10, version=0.98.0)
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section crm_config: OK (rc=0, origin=apache-up002.ring0/crmd/12, version=0.98.0)
>>>>>> May 14 15:47:21 [18956] apache-up001.itc4u.local crmd: info: update_dc: Set DC to apache-up002.ring0 (3.0.10)
>>>>>> May 14 15:47:21 [18956] apache-up001.itc4u.local crmd: info: crm_update_peer_expected: update_dc: Node apache-up002.ring0[2] - expected state is now member (was (null))
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section crm_config: OK (rc=0, origin=apache-up002.ring0/crmd/14, version=0.98.0)
>>>>>> May 14 15:47:21 [18956] apache-up001.itc4u.local crmd: info: election_count_vote: Election 2 (owner: 2) lost: vote from apache-up002.ring0 (Uptime)
>>>>>> May 14 15:47:21 [18956] apache-up001.itc4u.local crmd: info: update_dc: Unset DC. Was apache-up002.ring0
>>>>>> May 14 15:47:21 [18956] apache-up001.itc4u.local crmd: info: do_log: FSA: Input I_PENDING from do_election_count_vote() received in state S_PENDING
>>>>>> May 14 15:47:21 [18956] apache-up001.itc4u.local crmd: info: update_dc: Set DC to apache-up002.ring0 (3.0.10)
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section crm_config: OK (rc=0, origin=apache-up002.ring0/crmd/16, version=0.98.0)
>>>>>> May 14 15:47:21 [18956] apache-up001.itc4u.local crmd: info: erase_status_tag: Deleting xpath: //node_state[@uname='apache-up001.ring0']/transient_attributes
>>>>>> May 14 15:47:21 [18956] apache-up001.itc4u.local crmd: info: update_attrd_helper: Connecting to attribute manager ... 5 retries remaining
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_process_replace: Digest matched on replace from apache-up002.ring0: 10bfa46e2d338e958e6864a0b202f034
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_process_replace: Replaced 0.98.0 with 0.98.0 from apache-up002.ring0
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_replace operation for section 'all': OK (rc=0, origin=apache-up002.ring0/crmd/20, version=0.98.0)
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_delete operation for section //node_state[@uname='apache-up001.ring0']/transient_attributes to master (origin=local/crmd/11)
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_delete operation for section //node_state[@uname='apache-up001.ring0']/transient_attributes: OK (rc=0, origin=apache-up001.ring0/crmd/11, version=0.98.0)
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_delete operation for section //node_state[@uname='apache-up003.ring0']/transient_attributes: OK (rc=0, origin=apache-up003.ring0/crmd/12, version=0.98.0)
>>>>>> May 14 15:47:21 [18954] apache-up001.itc4u.local attrd: info: attrd_client_update: Starting an election to determine the writer
>>>>>> May 14 15:47:21 [18956] apache-up001.itc4u.local crmd: info: do_log: FSA: Input I_NOT_DC from do_cl_join_finalize_respond() received in state S_PENDING
>>>>>> May 14 15:47:21 [18956] apache-up001.itc4u.local crmd: notice: do_state_transition: State transition S_PENDING -> S_NOT_DC [ input=I_NOT_DC cause=C_HA_MESSAGE origin=do_cl_join_finalize_respond ]
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_file_backup: Archived previous version as /var/lib/pacemaker/cib/cib-70.raw
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section nodes: OK (rc=0, origin=apache-up002.ring0/crmd/21, version=0.98.0)
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section nodes: OK (rc=0, origin=apache-up002.ring0/crmd/22, version=0.98.0)
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section nodes: OK (rc=0, origin=apache-up002.ring0/crmd/23, version=0.98.0)
>>>>>> May 14 15:47:21 [18954] apache-up001.itc4u.local attrd: info: election_count_vote: Election 1 (owner: 2) pass: vote from apache-up002.ring0 (Uptime)
>>>>>> May 14 15:47:21 [18954] apache-up001.itc4u.local attrd: info: attrd_peer_update: Setting shutdown[apache-up002.ring0]: (null) -> 0 from apache-up002.ring0
>>>>>> May 14 15:47:21 [18954] apache-up001.itc4u.local attrd: info: election_count_vote: Election 1 (owner: 3) pass: vote from apache-up003.ring0 (Uptime)
>>>>>> May 14 15:47:21 [18954] apache-up001.itc4u.local attrd: info: attrd_peer_update: Setting shutdown[apache-up003.ring0]: (null) -> 0 from apache-up003.ring0
>>>>>> May 14 15:47:21 [18954] apache-up001.itc4u.local attrd: info: election_count_vote: Election 2 (owner: 3) pass: vote from apache-up003.ring0 (Uptime)
>>>>>> May 14 15:47:21 [18954] apache-up001.itc4u.local attrd: info: attrd_client_refresh: Updating all attributes
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_delete operation for section //node_state[@uname='apache-up002.ring0']/transient_attributes: OK (rc=0, origin=apache-up002.ring0/crmd/24, version=0.98.0)
>>>>>> May 14 15:47:21 [18954] apache-up001.itc4u.local attrd: info: write_attribute: Sent update 2 with 2 changes for shutdown, id=<n/a>, set=(null)
>>>>>> May 14 15:47:21 [18954] apache-up001.itc4u.local attrd: info: write_attribute: Sent update 3 with 2 changes for terminate, id=<n/a>, set=(null)
>>>>>> May 14 15:47:21 [18954] apache-up001.itc4u.local attrd: info: attrd_peer_update: Setting shutdown[apache-up001.ring0]: (null) -> 0 from apache-up001.ring0
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_file_write_with_digest: Wrote version 0.98.0 of the CIB to disk (digest: 088b40b257e579e23dcbd0047454c8a9)
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_delete operation for section //node_state[@uname='apache-up002.ring0']/lrm: OK (rc=0, origin=apache-up002.ring0/crmd/25, version=0.98.0)
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.98.0 2
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.98.1 (null)
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=1
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/status: <node_state id="2" uname="apache-up002.ring0" in_ccm="true" crmd="online" crm-debug-origin="do_lrm_query_internal" join="member" expected="member"/>
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <lrm id="2">
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <lrm_resources/>
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </lrm>
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </node_state>
>>>>>> May 14 15:47:21 [18954] apache-up001.itc4u.local attrd: info: election_complete: Election election-attrd complete
>>>>>> May 14 15:47:21 [18954] apache-up001.itc4u.local attrd: info: write_attribute: Write out of 'shutdown' delayed: update 2 in progress
>>>>>> May 14 15:47:21 [18954] apache-up001.itc4u.local attrd: info: write_attribute: Write out of 'terminate' delayed: update 3 in progress
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_file_write_with_digest: Reading cluster configuration file /var/lib/pacemaker/cib/cib.JoSONl (digest: /var/lib/pacemaker/cib/cib.tSlLmC)
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up002.ring0/crmd/26, version=0.98.1)
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_delete operation for section //node_state[@uname='apache-up001.ring0']/lrm: OK (rc=0, origin=apache-up002.ring0/crmd/27, version=0.98.1)
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.98.1 2
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.98.2 (null)
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=2
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/status: <node_state id="1" uname="apache-up001.ring0" in_ccm="true" crmd="online" crm-debug-origin="do_lrm_query_internal" join="member" expected="member"/>
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <lrm id="1">
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <lrm_resources/>
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </lrm>
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </node_state>
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up002.ring0/crmd/28, version=0.98.2)
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_delete operation for section //node_state[@uname='apache-up003.ring0']/lrm: OK (rc=0, origin=apache-up002.ring0/crmd/29, version=0.98.2)
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.98.2 2
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.98.3 (null)
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=3
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/status: <node_state id="3" uname="apache-up003.ring0" in_ccm="true" crmd="online" crm-debug-origin="do_lrm_query_internal" join="member" expected="member"/>
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <lrm id="3">
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <lrm_resources/>
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </lrm>
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </node_state>
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up002.ring0/crmd/30, version=0.98.3)
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section status to master (origin=local/attrd/2)
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section status to master (origin=local/attrd/3)
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.98.3 2
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.98.4 (null)
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=4
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/status/node_state[@id='2']: <transient_attributes id="2"/>
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <instance_attributes id="status-2">
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <nvpair id="status-2-shutdown" name="shutdown" value="0"/>
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </instance_attributes>
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </transient_attributes>
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/status/node_state[@id='1']: <transient_attributes id="1"/>
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <instance_attributes id="status-1">
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <nvpair id="status-1-shutdown" name="shutdown" value="0"/>
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </instance_attributes>
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </transient_attributes>
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/status/node_state[@id='3']: <transient_attributes id="3"/>
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <instance_attributes id="status-3">
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <nvpair id="status-3-shutdown" name="shutdown" value="0"/>
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </instance_attributes>
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </transient_attributes>
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up002.ring0/attrd/2, version=0.98.4)
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.98.4 2
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.98.5 (null)
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=5
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up002.ring0/attrd/3, version=0.98.5)
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up001.ring0/attrd/2, version=0.98.5)
>>>>>> May 14 15:47:21 [18954] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 2 for shutdown: OK (0)
>>>>>> May 14 15:47:21 [18954] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 2 for shutdown[apache-up001.ring0]=(null): OK (0)
>>>>>> May 14 15:47:21 [18954] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 2 for shutdown[apache-up002.ring0]=0: OK (0)
>>>>>> May 14 15:47:21 [18954] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 2 for shutdown[apache-up003.ring0]=0: OK (0)
>>>>>> May 14 15:47:21 [18954] apache-up001.itc4u.local attrd: info: write_attribute: Sent update 4 with 3 changes for shutdown, id=<n/a>, set=(null)
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.98.5 2
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.98.6 (null)
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=6
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up001.ring0/attrd/3, version=0.98.6)
>>>>>> May 14 15:47:21 [18954] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 3 for terminate: OK (0)
>>>>>> May 14 15:47:21 [18954] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 3 for terminate[apache-up001.ring0]=(null): OK (0)
>>>>>> May 14 15:47:21 [18954] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 3 for terminate[apache-up002.ring0]=(null): OK (0)
>>>>>> May 14 15:47:21 [18954] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 3 for terminate[apache-up003.ring0]=(null): OK (0)
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section nodes: OK (rc=0, origin=apache-up002.ring0/crmd/34, version=0.98.6)
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.98.6 2
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.98.7 (null)
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=7
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib/status/node_state[@id='2']: @crm-debug-origin=do_state_transition
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib/status/node_state[@id='1']: @crm-debug-origin=do_state_transition
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib/status/node_state[@id='3']: @crm-debug-origin=do_state_transition
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up002.ring0/crmd/35, version=0.98.7)
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section status to master (origin=local/attrd/4)
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up001.ring0/attrd/4, version=0.98.7)
>>>>>> May 14 15:47:21 [18954] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 4 for shutdown: OK (0)
>>>>>> May 14 15:47:21 [18954] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 4 for shutdown[apache-up001.ring0]=0: OK (0)
>>>>>> May 14 15:47:21 [18954] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 4 for shutdown[apache-up002.ring0]=0: OK (0)
>>>>>> May 14 15:47:21 [18954] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 4 for shutdown[apache-up003.ring0]=0: OK (0)
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.98.7 2
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.98.8 (null)
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=8, @dc-uuid=2
>>>>>> May 14 15:47:21 [18951] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section cib: OK (rc=0, origin=apache-up002.ring0/crmd/36, version=0.98.8)
>>>>>> May 14 15:47:22 [18954] apache-up001.itc4u.local attrd: info: attrd_peer_update: Setting probe_complete[apache-up003.ring0]: (null) -> true from apache-up003.ring0
>>>>>> May 14 15:47:22 [18954] apache-up001.itc4u.local attrd: info: write_attribute: Sent update 5 with 1 changes for probe_complete, id=<n/a>, set=(null)
>>>>>> May 14 15:47:22 [18954] apache-up001.itc4u.local attrd: info: attrd_peer_update: Setting probe_complete[apache-up002.ring0]: (null) -> true from apache-up002.ring0
>>>>>> May 14 15:47:22 [18954] apache-up001.itc4u.local attrd: info: write_attribute: Write out of 'probe_complete' delayed: update 5 in progress
>>>>>> May 14 15:47:22 [18951] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section status to master (origin=local/attrd/5)
>>>>>> May 14 15:47:22 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.98.8 2
>>>>>> May 14 15:47:22 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.98.9 (null)
>>>>>> May 14 15:47:22 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=9
>>>>>> May 14 15:47:22 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/status/node_state[@id='3']/transient_attributes[@id='3']/instance_attributes[@id='status-3']: <nvpair id="status-3-probe_complete" name="probe_complete" value="true"/>
>>>>>> May 14 15:47:22 [18951] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up001.ring0/attrd/5, version=0.98.9)
>>>>>> May 14 15:47:22 [18954] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 5 for probe_complete: OK (0)
>>>>>> May 14 15:47:22 [18954] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 5 for probe_complete[apache-up002.ring0]=(null): OK (0)
>>>>>> May 14 15:47:22 [18954] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 5 for probe_complete[apache-up003.ring0]=true: OK (0)
>>>>>> May 14 15:47:22 [18954] apache-up001.itc4u.local attrd: info: write_attribute: Sent update 6 with 2 changes for probe_complete, id=<n/a>, set=(null)
>>>>>> May 14 15:47:22 [18951] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section status to master (origin=local/attrd/6)
>>>>>> May 14 15:47:22 [18954] apache-up001.itc4u.local attrd: info: attrd_peer_update: Setting probe_complete[apache-up001.ring0]: (null) -> true from apache-up001.ring0
>>>>>> May 14 15:47:22 [18954] apache-up001.itc4u.local attrd: info: write_attribute: Write out of 'probe_complete' delayed: update 6 in progress
>>>>>> May 14 15:47:22 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.98.9 2
>>>>>> May 14 15:47:22 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.98.10 (null)
>>>>>> May 14 15:47:22 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=10
>>>>>> May 14 15:47:22 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/status/node_state[@id='2']/transient_attributes[@id='2']/instance_attributes[@id='status-2']: <nvpair id="status-2-probe_complete" name="probe_complete" value="true"/>
>>>>>> May 14 15:47:22 [18951] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up001.ring0/attrd/6, version=0.98.10)
>>>>>> May 14 15:47:22 [18954] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 6 for probe_complete: OK (0)
>>>>>> May 14 15:47:22 [18954] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 6 for probe_complete[apache-up001.ring0]=(null): OK (0)
>>>>>> May 14 15:47:22 [18954] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 6 for probe_complete[apache-up002.ring0]=true: OK (0)
>>>>>> May 14 15:47:22 [18954] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 6 for probe_complete[apache-up003.ring0]=true: OK (0)
>>>>>> May 14 15:47:22 [18954] apache-up001.itc4u.local attrd: info: write_attribute: Sent update 7 with 3 changes for probe_complete, id=<n/a>, set=(null)
>>>>>> May 14 15:47:22 [18951] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section status to master (origin=local/attrd/7)
>>>>>> May 14 15:47:22 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.98.10 2
>>>>>> May 14 15:47:22 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.98.11 (null)
>>>>>> May 14 15:47:22 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=11
>>>>>> May 14 15:47:22 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/status/node_state[@id='1']/transient_attributes[@id='1']/instance_attributes[@id='status-1']: <nvpair id="status-1-probe_complete" name="probe_complete" value="true"/>
>>>>>> May 14 15:47:22 [18952] apache-up001.itc4u.local stonith-ng: notice: can_fence_host_with_device: scsi can fence (on) apache-up001.ring0: static-list
>>>>>> May 14 15:47:22 [18951] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up001.ring0/attrd/7, version=0.98.11)
>>>>>> May 14 15:47:22 [18954] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 7 for probe_complete: OK (0)
>>>>>> May 14 15:47:22 [18954] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 7 for probe_complete[apache-up001.ring0]=true: OK (0)
>>>>>> May 14 15:47:22 [18954] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 7 for probe_complete[apache-up002.ring0]=true: OK (0)
>>>>>> May 14 15:47:22 [18954] apache-up001.itc4u.local attrd: info: attrd_cib_callback: Update 7 for probe_complete[apache-up003.ring0]=true: OK (0)
>>>>>> May 14 15:47:22 [18952] apache-up001.itc4u.local stonith-ng: notice: can_fence_host_with_device: scsi can fence (on) apache-up001.ring0: static-list
>>>>>> May 14 15:47:22 [18952] apache-up001.itc4u.local stonith-ng: info: stonith_fence_get_devices_cb: Found 1 matching devices for 'apache-up001.ring0'
>>>>>> May 14 15:47:22 [18952] apache-up001.itc4u.local stonith-ng: notice: remote_op_done: Operation on of apache-up003.ring0 by apache-up003.ring0 for crmd.15120 at apache-up002.ring0.44c5a0b6: OK
>>>>>> May 14 15:47:22 [18956] apache-up001.itc4u.local crmd: notice: tengine_stonith_notify: apache-up003.ring0 was successfully unfenced by apache-up003.ring0 (at the request of apache-up002.ring0)
>>>>>> May 14 15:47:22 [18952] apache-up001.itc4u.local stonith-ng: notice: remote_op_done: Operation on of apache-up002.ring0 by apache-up002.ring0 for crmd.15120 at apache-up002.ring0.e4b17672: OK
>>>>>> May 14 15:47:22 [18956] apache-up001.itc4u.local crmd: notice: tengine_stonith_notify: apache-up002.ring0 was successfully unfenced by apache-up002.ring0 (at the request of apache-up002.ring0)
>>>>>> May 14 15:47:22 [18952] apache-up001.itc4u.local stonith-ng: notice: log_operation: Operation 'on' [19052] (call 4 from crmd.15120) for host 'apache-up001.ring0' with device 'scsi' returned: 0 (OK)
>>>>>> May 14 15:47:23 [18952] apache-up001.itc4u.local stonith-ng: notice: remote_op_done: Operation on of apache-up001.ring0 by apache-up001.ring0 for crmd.15120 at apache-up002.ring0.a682d19f: OK
>>>>>> May 14 15:47:23 [18956] apache-up001.itc4u.local crmd: notice: tengine_stonith_notify: apache-up001.ring0 was successfully unfenced by apache-up001.ring0 (at the request of apache-up002.ring0)
>>>>>> May 14 15:47:24 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.98.11 2
>>>>>> May 14 15:47:24 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.98.12 (null)
>>>>>> May 14 15:47:24 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=12
>>>>>> May 14 15:47:24 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib/status/node_state[@id='2']: @crm-debug-origin=do_update_resource
>>>>>> May 14 15:47:24 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/status/node_state[@id='2']/lrm[@id='2']/lrm_resources: <lrm_resource id="scsi" type="fence_scsi" class="stonith"/>
>>>>>> May 14 15:47:24 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <lrm_rsc_op id="scsi_last_0" operation_key="scsi_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.10" transition-key="6:0:7:6b7d5189-b033-453b-b1a3-a851c1bd46c2" transition-magic="0:7;6:0:7:6b7d5189-b033-453b-b1a3-a851c1bd46c2" on_node="apache-up002.ring0" call-id="5" rc-code="7" op-status="0" interval="0" last-run=
>>>>>> May 14 15:47:24 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </lrm_resource>
>>>>>> May 14 15:47:24 [18953] apache-up001.itc4u.local lrmd: info: process_lrmd_get_rsc_info: Resource 'scsi' not found (0 active resources)
>>>>>> May 14 15:47:24 [18953] apache-up001.itc4u.local lrmd: info: process_lrmd_rsc_register: Added 'scsi' to the rsc list (1 active resources)
>>>>>> May 14 15:47:24 [18951] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up002.ring0/crmd/39, version=0.98.12)
>>>>>> May 14 15:47:24 [18956] apache-up001.itc4u.local crmd: info: do_lrm_rsc_op: Performing key=3:0:7:6b7d5189-b033-453b-b1a3-a851c1bd46c2 op=scsi_monitor_0
>>>>>> May 14 15:47:24 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.98.12 2
>>>>>> May 14 15:47:24 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.98.13 (null)
>>>>>> May 14 15:47:24 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=13
>>>>>> May 14 15:47:24 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib/status/node_state[@id='3']: @crm-debug-origin=do_update_resource
>>>>>> May 14 15:47:24 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/status/node_state[@id='3']/lrm[@id='3']/lrm_resources: <lrm_resource id="scsi" type="fence_scsi" class="stonith"/>
>>>>>> May 14 15:47:24 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <lrm_rsc_op id="scsi_last_0" operation_key="scsi_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.10" transition-key="9:0:7:6b7d5189-b033-453b-b1a3-a851c1bd46c2" transition-magic="0:7;9:0:7:6b7d5189-b033-453b-b1a3-a851c1bd46c2" on_node="apache-up003.ring0" call-id="5" rc-code="7" op-status="0" interval="0" last-run=
>>>>>> May 14 15:47:24 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </lrm_resource>
>>>>>> May 14 15:47:24 [18951] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up003.ring0/crmd/13, version=0.98.13)
>>>>>> May 14 15:47:25 [18956] apache-up001.itc4u.local crmd: notice: process_lrm_event: Operation scsi_monitor_0: not running (node=apache-up001.ring0, call=5, rc=7, cib-update=12, confirmed=true)
>>>>>> May 14 15:47:25 [18951] apache-up001.itc4u.local cib: info: cib_process_request: Forwarding cib_modify operation for section status to master (origin=local/crmd/12)
>>>>>> May 14 15:47:25 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: --- 0.98.13 2
>>>>>> May 14 15:47:25 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: Diff: +++ 0.98.14 (null)
>>>>>> May 14 15:47:25 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib: @num_updates=14
>>>>>> May 14 15:47:25 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: + /cib/status/node_state[@id='1']: @crm-debug-origin=do_update_resource
>>>>>> May 14 15:47:25 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: ++ /cib/status/node_state[@id='1']/lrm[@id='1']/lrm_resources: <lrm_resource id="scsi" type="fence_scsi" class="stonith"/>
>>>>>> May 14 15:47:25 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: ++ <lrm_rsc_op id="scsi_last_0" operation_key="scsi_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.10" transition-key="3:0:7:6b7d5189-b033-453b-b1a3-a851c1bd46c2" transition-magic="0:7;3:0:7:6b7d5189-b033-453b-b1a3-a851c1bd46c2" on_node="apache-up001.ring0" call-id="5" rc-code="7" op-status="0" interval="0" last-run=
>>>>>> May 14 15:47:25 [18951] apache-up001.itc4u.local cib: info: cib_perform_op: ++ </lrm_resource>
>>>>>> May 14 15:47:25 [18951] apache-up001.itc4u.local cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=apache-up001.ring0/crmd/12, version=0.98.14)
>>>>>> May 14 15:47:30 [18951] apache-up001.itc4u.local cib: info: cib_process_ping: Reporting our current digest to apache-up002.ring0: e1c4fabedccaa4621f5d737327d9a8d5 for 0.98.14 (0x18c3300 0)
>>>>>> May 14 15:47:30 [18956] apache-up001.itc4u.local crmd: info: throttle_send_command: New throttle mode: 0000 (was ffffffff)
>>>>>>
>>>>>> cat /var/log/pacemaker.log
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: info: crm_log_init: Changed active directory to /var/lib/pacemaker/cores/root
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: info: get_cluster_type: Detected an active 'corosync' cluster
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: info: mcp_read_config: Reading configure for stack: corosync
>>>>>> May 14 15:46:59 [18950] apache-up001.itc4u.local pacemakerd: notice: crm_add_logfile: Switching to /var/log/cluster/corosync.log
>>>>>>
>>>>>> Can anyone help me please? This is really driving me crazy
>>>>>>
>>>>>> Kind regards
>>>>>>
>>>>>> Marco
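[Editor's note: the crmd notices in the excerpt above show each node being successfully unfenced. A quick way to pull those events out of a long pacemaker.log is a short script like the following; this is an illustrative sketch only, with a regex based on the exact log lines quoted above (the log format may differ between Pacemaker versions):

```python
import re

# Matches the crmd unfencing notices as they appear in the excerpts above;
# the regex is an assumption derived from those sample lines.
UNFENCE_RE = re.compile(
    r"tengine_stonith_notify:\s+(\S+) was successfully unfenced by (\S+)"
)

def unfenced_nodes(lines):
    """Return (target, executor) pairs for each successful unfencing notice."""
    return [m.groups() for line in lines if (m := UNFENCE_RE.search(line))]

# Two lines copied from the log excerpt quoted in this thread:
sample = [
    "May 14 15:47:22 [18956] apache-up001.itc4u.local crmd: notice: "
    "tengine_stonith_notify: apache-up003.ring0 was successfully unfenced "
    "by apache-up003.ring0 (at the request of apache-up002.ring0)",
    "May 14 15:47:23 [18956] apache-up001.itc4u.local crmd: notice: "
    "tengine_stonith_notify: apache-up001.ring0 was successfully unfenced "
    "by apache-up001.ring0 (at the request of apache-up002.ring0)",
]

print(unfenced_nodes(sample))
```

Run against the quoted log, this confirms all three nodes unfenced themselves at the DC's request, which matches the `meta provides="unfencing"` setting on the scsi stonith resource.]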
More information about the Users mailing list