Dec 16 15:08:07 [660] xstorage1 corosync notice [TOTEM ] A processor failed, forming new configuration.
Dec 16 15:08:07 [660] xstorage1 corosync notice [TOTEM ] The network interface is down.
Dec 16 15:08:08 [660] xstorage1 corosync notice [TOTEM ] A new membership (127.0.0.1:408) was formed. Members left: 2
Dec 16 15:08:08 [660] xstorage1 corosync notice [TOTEM ] Failed to receive the leave message. failed: 2
Dec 16 15:08:08 [710] attrd: info: pcmk_cpg_membership: Group attrd event 2: xstha2 (node 2 pid 666) left via cluster exit
Dec 16 15:08:08 [707] cib: info: pcmk_cpg_membership: Group cib event 2: xstha2 (node 2 pid 663) left via cluster exit
Dec 16 15:08:08 [710] attrd: info: crm_update_peer_proc: pcmk_cpg_membership: Node xstha2[2] - corosync-cpg is now offline
Dec 16 15:08:08 [687] pacemakerd: info: pcmk_cpg_membership: Group pacemakerd event 2: xstha2 (node 2 pid 662) left via cluster exit
Dec 16 15:08:08 [708] stonith-ng: info: pcmk_cpg_membership: Group stonith-ng event 2: xstha2 (node 2 pid 664) left via cluster exit
Dec 16 15:08:08 [687] pacemakerd: info: crm_update_peer_proc: pcmk_cpg_membership: Node xstha2[2] - corosync-cpg is now offline
Dec 16 15:08:08 [708] stonith-ng: info: crm_update_peer_proc: pcmk_cpg_membership: Node xstha2[2] - corosync-cpg is now offline
Dec 16 15:08:08 [710] attrd: notice: crm_update_peer_state_iter: Node xstha2 state is now lost | nodeid=2 previous=member source=crm_update_peer_proc
Dec 16 15:08:08 [660] xstorage1 corosync notice [QUORUM] Members[1]: 1
Dec 16 15:08:08 [710] attrd: notice: attrd_peer_remove: Removing all xstha2 attributes for peer loss
Dec 16 15:08:08 [708] stonith-ng: notice: crm_update_peer_state_iter: Node xstha2 state is now lost | nodeid=2 previous=member source=crm_update_peer_proc
Dec 16 15:08:08 [712] crmd: info: pcmk_cpg_membership: Group crmd event 2: xstha2 (node 2 pid 668) left via cluster exit
Dec 16 15:08:08 [660] xstorage1 corosync notice [MAIN ] Completed service synchronization, ready to provide service.
Dec 16 15:08:08 [687] pacemakerd: info: pcmk_cpg_membership: Group pacemakerd event 2: xstha1 (node 1 pid 687) is member
Dec 16 15:08:08 [710] attrd: info: crm_reap_dead_member: Removing node with name xstha2 and id 2 from membership cache
Dec 16 15:08:08 [712] crmd: info: crm_update_peer_proc: pcmk_cpg_membership: Node xstha2[2] - corosync-cpg is now offline
Dec 16 15:08:08 [710] attrd: notice: reap_crm_member: Purged 1 peer with id=2 and/or uname=xstha2 from the membership cache
Dec 16 15:08:08 [708] stonith-ng: info: crm_reap_dead_member: Removing node with name xstha2 and id 2 from membership cache
Dec 16 15:08:08 [710] attrd: info: pcmk_cpg_membership: Group attrd event 2: xstha1 (node 1 pid 710) is member
Dec 16 15:08:08 [707] cib: info: crm_update_peer_proc: pcmk_cpg_membership: Node xstha2[2] - corosync-cpg is now offline
Dec 16 15:08:08 [687] pacemakerd: info: pcmk_quorum_notification: Quorum retained | membership=408 members=1
Dec 16 15:08:08 [707] cib: notice: crm_update_peer_state_iter: Node xstha2 state is now lost | nodeid=2 previous=member source=crm_update_peer_proc
Dec 16 15:08:08 [708] stonith-ng: notice: reap_crm_member: Purged 1 peer with id=2 and/or uname=xstha2 from the membership cache
Dec 16 15:08:08 [712] crmd: info: peer_update_callback: Client xstha2/peer now has status [offline] (DC=true, changed=4000000)
Dec 16 15:08:08 [687] pacemakerd: notice: crm_update_peer_state_iter: Node xstha2 state is now lost | nodeid=2 previous=member source=crm_reap_unseen_nodes
Dec 16 15:08:08 [707] cib: info: crm_reap_dead_member: Removing node with name xstha2 and id 2 from membership cache
Dec 16 15:08:08 [708] stonith-ng: info: pcmk_cpg_membership: Group stonith-ng event 2: xstha1 (node 1 pid 708) is member
Dec 16 15:08:08 [707] cib: notice: reap_crm_member: Purged 1 peer with id=2 and/or uname=xstha2 from the membership cache
Dec 16 15:08:08 [707] cib: info: pcmk_cpg_membership: Group cib event 2: xstha1 (node 1 pid 707) is member
Dec 16 15:08:08 [687] pacemakerd: info: mcp_cpg_deliver: Ignoring process list sent by peer for local node
Dec 16 15:08:08 [712] crmd: info: controld_delete_node_state: Deleting transient attributes for node xstha2 (via CIB call 65) | xpath=//node_state[@uname='xstha2']/transient_attributes
Dec 16 15:08:08 [707] cib: info: cib_process_request: Forwarding cib_delete operation for section //node_state[@uname='xstha2']/transient_attributes to all (origin=local/crmd/65)
Dec 16 15:08:08 [712] crmd: warning: match_down_event: No reason to expect node 2 to be down
Dec 16 15:08:08 [712] crmd: notice: peer_update_callback: Stonith/shutdown of xstha2 not matched
Dec 16 15:08:08 [712] crmd: info: abort_transition_graph: Transition aborted: Node failure | source=peer_update_callback:300 complete=true
Dec 16 15:08:08 [712] crmd: info: pcmk_cpg_membership: Group crmd event 2: xstha1 (node 1 pid 712) is member
Dec 16 15:08:08 [712] crmd: notice: do_state_transition: State transition S_IDLE -> S_POLICY_ENGINE | input=I_PE_CALC cause=C_FSA_INTERNAL origin=abort_transition_graph
Dec 16 15:08:08 [712] crmd: info: pcmk_quorum_notification: Quorum retained | membership=408 members=1
Dec 16 15:08:08 [712] crmd: notice: crm_update_peer_state_iter: Node xstha2 state is now lost | nodeid=2 previous=member source=crm_reap_unseen_nodes
Dec 16 15:08:08 [712] crmd: info: peer_update_callback: Cluster node xstha2 is now lost (was member)
Dec 16 15:08:08 [712] crmd: warning: match_down_event: No reason to expect node 2 to be down
Dec 16 15:08:08 [712] crmd: notice: peer_update_callback: Stonith/shutdown of xstha2 not matched
Dec 16 15:08:08 [712] crmd: info: abort_transition_graph: Transition aborted: Node failure | source=peer_update_callback:300 complete=true
Dec 16 15:08:08 [707] cib: info: cib_process_request: Completed cib_delete operation for section //node_state[@uname='xstha2']/transient_attributes: OK (rc=0, origin=xstha1/crmd/65, version=0.46.19)
Dec 16 15:08:08 [707] cib: info: cib_process_request: Forwarding cib_modify operation for section status to all (origin=local/crmd/66)
Dec 16 15:08:08 [707] cib: info: cib_process_request: Forwarding cib_modify operation for section status to all (origin=local/crmd/68)
Dec 16 15:08:08 [707] cib: info: cib_perform_op: Diff: --- 0.46.19 2
Dec 16 15:08:08 [707] cib: info: cib_perform_op: Diff: +++ 0.46.20 (null)
Dec 16 15:08:08 [707] cib: info: cib_perform_op: + /cib: @num_updates=20
Dec 16 15:08:08 [707] cib: info: cib_perform_op: + /cib/status/node_state[@id='2']: @crmd=offline, @crm-debug-origin=peer_update_callback
Dec 16 15:08:08 [707] cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=xstha1/crmd/66, version=0.46.20)
Dec 16 15:08:08 [707] cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=xstha1/crmd/68, version=0.46.20)
Dec 16 15:08:08 [707] cib: info: cib_process_request: Forwarding cib_modify operation for section nodes to all (origin=local/crmd/71)
Dec 16 15:08:08 [707] cib: info: cib_process_request: Forwarding cib_modify operation for section status to all (origin=local/crmd/72)
Dec 16 15:08:08 [707] cib: info: cib_process_request: Completed cib_modify operation for section nodes: OK (rc=0, origin=xstha1/crmd/71, version=0.46.20)
Dec 16 15:08:08 [707] cib: info: cib_perform_op: Diff: --- 0.46.20 2
Dec 16 15:08:08 [707] cib: info: cib_perform_op: Diff: +++ 0.46.21 (null)
Dec 16 15:08:08 [707] cib: info: cib_perform_op: + /cib: @num_updates=21
Dec 16 15:08:08 [707] cib: info: cib_perform_op: + /cib/status/node_state[@id='1']: @crm-debug-origin=post_cache_update
Dec 16 15:08:08 [707] cib: info: cib_perform_op: + /cib/status/node_state[@id='2']: @crm-debug-origin=post_cache_update, @in_ccm=false
Dec 16 15:08:08 [707] cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=xstha1/crmd/72, version=0.46.21)
Dec 16 15:08:09 [711] pengine: info: determine_online_status_fencing: Node xstha1 is active
Dec 16 15:08:09 [711] pengine: info: determine_online_status: Node xstha1 is online
Dec 16 15:08:09 [711] pengine: warning: pe_fence_node: Cluster node xstha2 will be fenced: peer is no longer part of the cluster
Dec 16 15:08:09 [711] pengine: warning: determine_online_status: Node xstha2 is unclean
Dec 16 15:08:09 [711] pengine: info: unpack_node_loop: Node 1 is already processed
Dec 16 15:08:09 [711] pengine: info: unpack_node_loop: Node 2 is already processed
Dec 16 15:08:09 [711] pengine: info: unpack_node_loop: Node 1 is already processed
Dec 16 15:08:09 [711] pengine: info: unpack_node_loop: Node 2 is already processed
Dec 16 15:08:09 [711] pengine: info: common_print: xstha1_san0_IP (ocf::heartbeat:IPaddr): Started xstha1
Dec 16 15:08:09 [711] pengine: info: common_print: xstha2_san0_IP (ocf::heartbeat:IPaddr): Started xstha2 (UNCLEAN)
Dec 16 15:08:09 [711] pengine: info: common_print: zpool_data (ocf::heartbeat:ZFS): Started xstha1
Dec 16 15:08:09 [711] pengine: info: common_print: xstha1-stonith (stonith:external/ipmi): Started xstha2 (UNCLEAN)
Dec 16 15:08:09 [711] pengine: info: common_print: xstha2-stonith (stonith:external/ipmi): Started xstha1
Dec 16 15:08:09 [711] pengine: info: pcmk__native_allocate: Resource xstha1-stonith cannot run anywhere
Dec 16 15:08:09 [711] pengine: warning: custom_action: Action xstha2_san0_IP_stop_0 on xstha2 is unrunnable (offline)
Dec 16 15:08:09 [711] pengine: warning: custom_action: Action xstha1-stonith_stop_0 on xstha2 is unrunnable (offline)
Dec 16 15:08:09 [711] pengine: warning: custom_action: Action xstha1-stonith_stop_0 on xstha2 is unrunnable (offline)
Dec 16 15:08:09 [711] pengine: warning: stage6: Scheduling Node xstha2 for STONITH
Dec 16 15:08:09 [711] pengine: info: native_stop_constraints: xstha2_san0_IP_stop_0 is implicit after xstha2 is fenced
Dec 16 15:08:09 [711] pengine: info: native_stop_constraints: xstha1-stonith_stop_0 is implicit after xstha2 is fenced
Dec 16 15:08:09 [711] pengine: notice: LogNodeActions: * Fence (off) xstha2 'peer is no longer part of the cluster'
Dec 16 15:08:09 [711] pengine: info: LogActions: Leave xstha1_san0_IP (Started xstha1)
Dec 16 15:08:09 [711] pengine: notice: LogAction: * Move xstha2_san0_IP ( xstha2 -> xstha1 )
Dec 16 15:08:09 [711] pengine: info: LogActions: Leave zpool_data (Started xstha1)
Dec 16 15:08:09 [711] pengine: notice: LogAction: * Stop xstha1-stonith ( xstha2 ) due to node availability
Dec 16 15:08:09 [711] pengine: info: LogActions: Leave xstha2-stonith (Started xstha1)
Dec 16 15:08:09 [711] pengine: warning: process_pe_message: Calculated transition 4 (with warnings), saving inputs in /sonicle/var/cluster/lib/pacemaker/pengine/pe-warn-51.bz2
Dec 16 15:08:09 [712] crmd: info: do_state_transition: State transition S_POLICY_ENGINE -> S_TRANSITION_ENGINE | input=I_PE_SUCCESS cause=C_IPC_MESSAGE origin=handle_response
Dec 16 15:08:09 [712] crmd: info: do_te_invoke: Processing graph 4 (ref=pe_calc-dc-1608127689-39) derived from /sonicle/var/cluster/lib/pacemaker/pengine/pe-warn-51.bz2
Dec 16 15:08:09 [712] crmd: notice: te_fence_node: Requesting fencing (off) of node xstha2 | action=1 timeout=60000
Dec 16 15:08:09 [708] stonith-ng: notice: handle_request: Client crmd.712.e9eb875f wants to fence (off) 'xstha2' with device '(any)'
Dec 16 15:08:09 [708] stonith-ng: notice: initiate_remote_stonith_op: Requesting peer fencing (off) targeting xstha2 | id=e487e7cc-f333-edd6-94d2-f5ff1bfd9b3d state=0
Dec 16 15:08:09 [708] stonith-ng: info: dynamic_list_search_cb: Refreshing port list for xstha2-stonith
Dec 16 15:08:09 [708] stonith-ng: info: process_remote_stonith_query: Query result 1 of 1 from xstha1 for xstha2/off (1 devices) e487e7cc-f333-edd6-94d2-f5ff1bfd9b3d
Dec 16 15:08:09 [708] stonith-ng: info: call_remote_stonith: Total timeout set to 60 for peer's fencing targeting xstha2 for crmd.712|id=e487e7cc-f333-edd6-94d2-f5ff1bfd9b3d
Dec 16 15:08:09 [708] stonith-ng: notice: call_remote_stonith: Requesting that xstha1 perform 'off' action targeting xstha2 | for client crmd.712 (72s, 0s)
Dec 16 15:08:09 [708] stonith-ng: notice: can_fence_host_with_device: xstha2-stonith can fence (off) xstha2: dynamic-list
Dec 16 15:08:09 [708] stonith-ng: info: stonith_fence_get_devices_cb: Found 1 matching devices for 'xstha2'
Dec 16 15:08:09 [708] stonith-ng: notice: schedule_stonith_command: Delaying 'off' action targeting xstha2 on xstha2-stonith for 1s (timeout=60s, requested_delay=0s, base=1s, max=1s)
Dec 16 15:08:12 [708] stonith-ng: notice: log_operation: Operation 'off' [1273] (call 4 from crmd.712) for host 'xstha2' with device 'xstha2-stonith' returned: 0 (OK)
Dec 16 15:08:12 [708] stonith-ng: notice: remote_op_done: Operation 'off' targeting xstha2 on xstha1 for crmd.712@xstha1.e487e7cc: OK
Dec 16 15:08:12 [712] crmd: notice: tengine_stonith_callback: Stonith operation 4/1:4:0:cc8faf12-ac24-cc9c-c212-effe6840ca76: OK (0)
Dec 16 15:08:12 [712] crmd: info: tengine_stonith_callback: Stonith operation 4 for xstha2 passed
Dec 16 15:08:12 [712] crmd: info: crm_update_peer_expected: crmd_peer_down: Node xstha2[2] - expected state is now down (was member)
Dec 16 15:08:12 [712] crmd: info: controld_delete_node_state: Deleting all state for node xstha2 (via CIB call 76) | xpath=//node_state[@uname='xstha2']/*
Dec 16 15:08:12 [707] cib: info: cib_process_request: Forwarding cib_modify operation for section status to all (origin=local/crmd/75)
Dec 16 15:08:12 [712] crmd: notice: tengine_stonith_notify: Peer xstha2 was terminated (off) by xstha1 on behalf of crmd.712: OK | initiator=xstha1 ref=e487e7cc-f333-edd6-94d2-f5ff1bfd9b3d
Dec 16 15:08:12 [707] cib: info: cib_process_request: Forwarding cib_delete operation for section //node_state[@uname='xstha2']/* to all (origin=local/crmd/76)
Dec 16 15:08:12 [712] crmd: info: controld_delete_node_state: Deleting all state for node xstha2 (via CIB call 78) | xpath=//node_state[@uname='xstha2']/*
Dec 16 15:08:12 [712] crmd: notice: te_rsc_command: Initiating start operation xstha2_san0_IP_start_0 locally on xstha1 | action 6
Dec 16 15:08:12 [712] crmd: info: do_lrm_rsc_op: Performing key=6:4:0:cc8faf12-ac24-cc9c-c212-effe6840ca76 op=xstha2_san0_IP_start_0
Dec 16 15:08:12 [707] cib: info: cib_perform_op: Diff: --- 0.46.21 2
Dec 16 15:08:12 [707] cib: info: cib_perform_op: Diff: +++ 0.46.22 (null)
Dec 16 15:08:12 [707] cib: info: cib_perform_op: + /cib: @num_updates=22
Dec 16 15:08:12 [707] cib: info: cib_perform_op: + /cib/status/node_state[@id='2']: @crm-debug-origin=send_stonith_update, @join=down, @expected=down
Dec 16 15:08:12 [709] lrmd: info: log_execute: executing - rsc:xstha2_san0_IP action:start call_id:26
Dec 16 15:08:12 [707] cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=xstha1/crmd/75, version=0.46.22)
Dec 16 15:08:12 [712] crmd: info: cib_fencing_updated: Fencing update 75 for xstha2: complete
Dec 16 15:08:12 [707] cib: info: cib_perform_op: Diff: --- 0.46.22 2
Dec 16 15:08:12 [707] cib: info: cib_perform_op: Diff: +++ 0.46.23 (null)
Dec 16 15:08:12 [707] cib: info: cib_perform_op: -- /cib/status/node_state[@id='2']/lrm[@id='2']
Dec 16 15:08:12 [707] cib: info: cib_perform_op: + /cib: @num_updates=23
Dec 16 15:08:12 [707] cib: info: cib_process_request: Completed cib_delete operation for section //node_state[@uname='xstha2']/*: OK (rc=0, origin=xstha1/crmd/76, version=0.46.23)
Dec 16 15:08:12 [707] cib: info: cib_process_request: Forwarding cib_modify operation for section status to all (origin=local/crmd/77)
Dec 16 15:08:12 [707] cib: info: cib_process_request: Forwarding cib_delete operation for section //node_state[@uname='xstha2']/* to all (origin=local/crmd/78)
Dec 16 15:08:12 [707] cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=xstha1/crmd/77, version=0.46.23)
Dec 16 15:08:12 [712] crmd: info: cib_fencing_updated: Fencing update 77 for xstha2: complete
Dec 16 15:08:12 [707] cib: info: cib_process_request: Completed cib_delete operation for section //node_state[@uname='xstha2']/*: OK (rc=0, origin=xstha1/crmd/78, version=0.46.23)
Dec 16 15:08:12 [709] lrmd: notice: operation_finished: xstha2_san0_IP_start_0:1286:stderr [ Converted dotted-quad netmask to CIDR as: 24 ]
Dec 16 15:08:12 [709] lrmd: info: log_finished: finished - rsc:xstha2_san0_IP action:start call_id:26 pid:1286 exit-code:0 exec-time:424ms queue-time:0ms
Dec 16 15:08:12 [712] crmd: notice: process_lrm_event: Result of start operation for xstha2_san0_IP on xstha1: 0 (ok) | call=26 key=xstha2_san0_IP_start_0 confirmed=true cib-update=79
Dec 16 15:08:12 [707] cib: info: cib_process_request: Forwarding cib_modify operation for section status to all (origin=local/crmd/79)
Dec 16 15:08:12 [707] cib: info: cib_perform_op: Diff: --- 0.46.23 2
Dec 16 15:08:12 [707] cib: info: cib_perform_op: Diff: +++ 0.46.24 (null)
Dec 16 15:08:12 [707] cib: info: cib_perform_op: + /cib: @num_updates=24
Dec 16 15:08:12 [707] cib: info: cib_perform_op: + /cib/status/node_state[@id='1']: @crm-debug-origin=do_update_resource
Dec 16 15:08:12 [707] cib: info: cib_perform_op: + /cib/status/node_state[@id='1']/lrm[@id='1']/lrm_resources/lrm_resource[@id='xstha2_san0_IP']/lrm_rsc_op[@id='xstha2_san0_IP_last_0']: @operation_key=xstha2_san0_IP_start_0, @operation=start, @crm-debug-origin=do_update_resource, @transition-key=6:4:0:cc8faf12-ac24-cc9c-c212-effe6840ca76, @transition-magic=0:0;6:4:0:cc8faf12-ac24-cc9c-c212-effe6840ca76, @call-id=26, @rc-code=0, @last-run=1608127692, @last-rc-change=1608127692, @exec-time=424
Dec 16 15:08:12 [707] cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=xstha1/crmd/79, version=0.46.24)
Dec 16 15:08:12 [712] crmd: info: match_graph_event: Action xstha2_san0_IP_start_0 (6) confirmed on xstha1 (rc=0)
Dec 16 15:08:12 [712] crmd: notice: run_graph: Transition 4 (Complete=4, Pending=0, Fired=0, Skipped=0, Incomplete=0, Source=/sonicle/var/cluster/lib/pacemaker/pengine/pe-warn-51.bz2): Complete
Dec 16 15:08:12 [712] crmd: info: do_log: Input I_TE_SUCCESS received in state S_TRANSITION_ENGINE from notify_crmd
Dec 16 15:08:12 [712] crmd: notice: do_state_transition: State transition S_TRANSITION_ENGINE -> S_IDLE | input=I_TE_SUCCESS cause=C_FSA_INTERNAL origin=notify_crmd
Dec 16 15:08:17 [707] cib: info: cib_process_ping: Reporting our current digest to xstha1: 12b5d0c73b7cc062864dd80352e00b6c for 0.46.24 (82626a0 0)
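For post-mortem analysis, the scheduler input that produced this transition is named in the log itself (pe-warn-51.bz2 under the non-default /sonicle path). A minimal sketch of how one might replay that decision offline with the standard Pacemaker tools, assuming `crm_simulate` and `stonith_admin` are installed on a cluster node; the guards are only there so the snippet degrades gracefully elsewhere:

```shell
#!/bin/sh
# Path taken verbatim from the process_pe_message log entry above;
# default installations use /var/lib/pacemaker/pengine/ instead.
PE_INPUT=/sonicle/var/cluster/lib/pacemaker/pengine/pe-warn-51.bz2

if command -v crm_simulate >/dev/null 2>&1; then
    # Replay the saved transition to see why xstha2 was scheduled for STONITH
    # and xstha2_san0_IP was moved to xstha1.
    crm_simulate --simulate --xml-file "$PE_INPUT"
else
    echo "crm_simulate not available; copy $PE_INPUT to a node that has it"
fi

if command -v stonith_admin >/dev/null 2>&1; then
    # Show the fencing history for the target node, including the 'off'
    # operation logged at 15:08:12.
    stonith_admin --history xstha2
fi
```

This only reads the saved policy-engine input; it does not touch the live CIB, so it is safe to run after the fact.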