[ClusterLabs] Issue in fence_ilo4 with IPv6 ILO IPs
Rohit Saini
rohitsaini111.forum at gmail.com
Wed Apr 3 05:10:47 EDT 2019
Hi Ondrej,
Please find my reply below:
1. *Stonith configuration:*
[root@orana ~]# pcs config
Resource: fence-uc-orana (class=stonith type=fence_ilo4)
 Attributes: delay=0 ipaddr=fd00:1061:37:9002:: lanplus=1 login=xyz passwd=xyz pcmk_host_list=orana pcmk_reboot_action=off
 Meta Attrs: failure-timeout=3s
 Operations: monitor interval=5s on-fail=ignore (fence-uc-orana-monitor-interval-5s)
             start interval=0s on-fail=restart (fence-uc-orana-start-interval-0s)
Resource: fence-uc-tigana (class=stonith type=fence_ilo4)
 Attributes: delay=10 ipaddr=fd00:1061:37:9001:: lanplus=1 login=xyz passwd=xyz pcmk_host_list=tigana pcmk_reboot_action=off
 Meta Attrs: failure-timeout=3s
 Operations: monitor interval=5s on-fail=ignore (fence-uc-tigana-monitor-interval-5s)
             start interval=0s on-fail=restart (fence-uc-tigana-start-interval-0s)
Fencing Levels:
Location Constraints:
Ordering Constraints:
 start fence-uc-orana then promote unicloud-master (kind:Mandatory)
 start fence-uc-tigana then promote unicloud-master (kind:Mandatory)
Colocation Constraints:
 fence-uc-orana with unicloud-master (score:INFINITY) (rsc-role:Started) (with-rsc-role:Master)
 fence-uc-tigana with unicloud-master (score:INFINITY) (rsc-role:Started) (with-rsc-role:Master)
2. This is seen randomly. Since I am using colocation, the stonith
resources are stopped and started on the new master, and that start takes
a variable amount of time.
No other IPv6 issues are seen on the cluster nodes.
3. fence-agents version:
[root@orana ~]# rpm -qa|grep fence-agents-ipmilan
fence-agents-ipmilan-4.0.11-66.el7.x86_64
*NOTE:*
Both IPv4 and IPv6 are configured on my iLO, with "iLO Client Applications
use IPv6 first" turned on.
I am also attaching the corosync logs.
Thanks, increasing the timeout to 60 seconds worked, but that is not
exactly what I am looking for. I need to know the exact reason behind the
delay in starting these IPv6 stonith resources.
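For reference, a direct timing check of the fence agent against both
addresses of the same iLO should show where the extra seconds are spent.
This is only a rough sketch (IPv6 address and credentials copied from the
configuration above, IPv4 address left as a placeholder):

# time fence_ilo4 --ip=fd00:1061:37:9002:: --username=xyz --password=xyz --lanplus --action=monitor
# time fence_ilo4 --ip=<ipv4-address-of-the-same-ilo> --username=xyz --password=xyz --lanplus --action=monitor

If only the IPv6 run takes close to 20 seconds, the delay is in the
agent/iLO communication itself rather than in pacemaker.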
Regards,
Rohit
On Tue, Apr 2, 2019 at 7:22 AM Ondrej <ondrej-clusterlabs at famera.cz> wrote:
> On 3/31/19 5:40 AM, Rohit Saini wrote:
> > Looking for some help on this.
> >
> > Thanks,
> > Rohit
>
> Hi Rohit,
>
> As a good start to figuring out what is happening here, can you please
> provide more detailed information, such as:
>
> 1. What is the configuration of the stonith device when using IPv4 and
> when using IPv6? ('pcs stonith show --full' - you can obfuscate the
> username and password from that output; the main idea is to see whether
> you are using a hostname or an IPv4/IPv6 address here.)
>
> 2. What do you mean by it happens 'sometimes' with IPv6? Is there any
> pattern to when this happens (like every night around 3/4 am, when there
> is more traffic on the network, when you test XXX service, etc.), or
> does it look to be happening randomly? Are there any other IPv6 issues
> present on the system, not related to the cluster, at the time when the
> timeout is observed?
>
> 3. Are there any messages from fence_ilo4 in the logs
> (/var/log/pacemaker.log, /var/log/cluster/corosync/corosync.log,
> /var/log/messages, ...) around the time when the timeout is reported
> that would suggest what could be happening?
>
> 4. Which version of fence_ilo4 are you using?
> # rpm -qa|grep fence-agents-ipmilan
> # fence-uc-orana
>
> ===
> To give you some answers to your questions with the information provided so far:
> > 1. Why is it happening only for IPv6 ILO devices? Is this some known
> > issue?
> Based on the data provided it is not clear where the issue is. It could
> be DNS resolution, it could be a network issue, ...
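> As a quick first check outside of the cluster you could compare basic
> reachability and the raw IPMI response time over IPv6 directly. This is
> just a rough sketch with placeholders - adjust the address and
> credentials to your setup, and note that it assumes your ipmitool build
> accepts IPv6 addresses:
>
> # ping6 -c 3 <ilo-ipv6-address>
> # time ipmitool -I lanplus -H <ilo-ipv6-address> -U <user> -P <password> chassis power status
>
> If the raw IPMI call itself is slow only over IPv6, the problem is most
> likely on the network/iLO side rather than in the fence agent.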
>
> > 2. Can we increase the timeout period "exec=20006ms" to something else?
> Yes, you can do that and it may hide/"resolve" the issue if fence_ilo4
> can finish monitoring within the newly set timeout. You can give it a
> try and increase this to 40 seconds to see if that yields better results
> in your environment. While the default 20 seconds should be enough for
> the majority of environments, there might be something in your case that
> demands more time. Note that this approach might just effectively hide
> the underlying issue.
> To increase the timeout you should increase it for both the 'start' and
> 'monitor' operations, for example like this:
>
> # pcs stonith update fence-uc-orana op start timeout=40s op monitor
> timeout=40s
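>
> Afterwards you can verify that both operations picked up the new
> timeout, for example (resource name taken from your earlier output):
>
> # pcs stonith show fence-uc-orana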
>
> --
> Ondrej
>
> >
> > On Thu, Mar 28, 2019 at 11:24 AM Rohit Saini
> > <rohitsaini111.forum at gmail.com> wrote:
> >
> > Hi All,
> > I am trying fence_ilo4 with the same iLO device having both IPv4 and
> > IPv6 addresses. I see some discrepancy between the two behaviours:
> >
> > *1. When ILO has IPv4 address*
> > This is working fine and stonith resources are started immediately.
> >
> > *2. When ILO has IPv6 address*
> > Starting of stonith resources sometimes takes more than 20 seconds.
> >
> > *[root@tigana ~]# pcs status*
> > Cluster name: ucc
> > Stack: corosync
> > Current DC: tigana (version 1.1.16-12.el7-94ff4df) - partition with
> > quorum
> > Last updated: Wed Mar 27 00:01:37 2019
> > Last change: Wed Mar 27 00:01:19 2019 by root via cibadmin on orana
> >
> > 2 nodes configured
> > 4 resources configured
> >
> > Online: [ orana tigana ]
> >
> > Full list of resources:
> >
> > Master/Slave Set: unicloud-master [unicloud]
> > Masters: [ orana ]
> > Slaves: [ tigana ]
> > fence-uc-orana (stonith:fence_ilo4): FAILED orana
> > fence-uc-tigana (stonith:fence_ilo4): Started orana
> >
> > Failed Actions:
> > * fence-uc-orana_start_0 on orana 'unknown error' (1): call=32,
> > status=Timed Out, exitreason='none',
> > last-rc-change='Wed Mar 27 00:01:17 2019', queued=0ms,
> > exec=20006ms *<<<<<<<*
> >
> >
> > *Queries:*
> > 1. Why is it happening only for IPv6 ILO devices? Is this some known
> > issue?
> > 2. Can we increase the timeout period "exec=20006ms" to something
> > else?
> >
> >
> > Thanks,
> > Rohit
>
>
-------------- next part --------------
Apr 02 14:36:54 [31771] orana cib: debug: crm_client_new: Connecting 0x562de6fc78e0 for uid=0 gid=0 pid=3053 id=71fdc22a-497c-43a2-af23-b1972808642f
Apr 02 14:36:54 [31771] orana cib: debug: handle_new_connection: IPC credentials authenticated (31771-3053-14)
Apr 02 14:36:54 [31771] orana cib: debug: qb_ipcs_shm_connect: connecting to client [3053]
Apr 02 14:36:54 [31771] orana cib: debug: qb_rb_open_2: shm size:524301; real_size:528384; rb->word_size:132096
Apr 02 14:36:54 [31771] orana cib: debug: qb_rb_open_2: shm size:524301; real_size:528384; rb->word_size:132096
Apr 02 14:36:54 [31771] orana cib: debug: qb_rb_open_2: shm size:524301; real_size:528384; rb->word_size:132096
Apr 02 14:36:54 [31771] orana cib: info: cib_process_request: Forwarding cib_replace operation for section configuration to all (origin=local/cibadmin/2)
Apr 02 14:36:54 [31771] orana cib: info: __xml_diff_object: Moved rsc_location at rsc (1 -> 3)
Apr 02 14:36:54 [31771] orana cib: info: __xml_diff_object: Moved rsc_location at node (3 -> 1)
Apr 02 14:36:54 [31771] orana cib: info: cib_perform_op: Diff: --- 0.21.14 2
Apr 02 14:36:54 [31771] orana cib: info: cib_perform_op: Diff: +++ 0.22.0 (null)
Apr 02 14:36:54 [31771] orana cib: info: cib_perform_op: + /cib: @epoch=22, @num_updates=0
Apr 02 14:36:54 [31771] orana cib: info: cib_perform_op: + /cib/configuration/resources/primitive[@id='fence-uc-orana']/instance_attributes[@id='fence-uc-orana-instance_attributes']/nvpair[@id='fence-uc-orana-instance_attributes-delay']: @value=10
Apr 02 14:36:54 [31771] orana cib: info: cib_perform_op: + /cib/configuration/constraints/rsc_location[@id='cli-prefer-unicloud-master']: @node=orana, @rsc=unicloud-master
Apr 02 14:36:54 [31771] orana cib: debug: activateCibXml: Triggering CIB write for cib_replace op
Apr 02 14:36:54 [31771] orana cib: info: cib_process_request: Completed cib_replace operation for section configuration: OK (rc=0, origin=orana/cibadmin/2, version=0.22.0)
Apr 02 14:36:54 [31772] orana stonith-ng: debug: xml_patch_version_check: Can apply patch 0.22.0 to 0.21.14
Apr 02 14:36:54 [31776] orana crmd: debug: te_update_diff: Processing (cib_replace) diff: 0.21.14 -> 0.22.0 (S_IDLE)
Apr 02 14:36:54 [31776] orana crmd: info: abort_transition_graph: Transition aborted by fence-uc-orana-instance_attributes-delay doing modify delay=10: Configuration change | cib=0.22.0 source=te_update_diff:456 path=/cib/configuration/resources/primitive[@id='fence-uc-orana']/instance_attributes[@id='fence-uc-orana-instance_attributes']/nvpair[@id='fence-uc-orana-instance_attributes-delay'] complete=true
Apr 02 14:36:54 [31776] orana crmd: debug: s_crmd_fsa: Processing I_PE_CALC: [ state=S_IDLE cause=C_FSA_INTERNAL origin=abort_transition_graph ]
Apr 02 14:36:54 [31776] orana crmd: notice: do_state_transition: State transition S_IDLE -> S_POLICY_ENGINE | input=I_PE_CALC cause=C_FSA_INTERNAL origin=abort_transition_graph
Apr 02 14:36:54 [31772] orana stonith-ng: info: update_cib_stonith_devices_v2: Updating device list from the cib: modify nvpair[@id='fence-uc-orana-instance_attributes-delay']
Apr 02 14:36:54 [31776] orana crmd: debug: do_state_transition: All 2 cluster nodes are eligible to run resources.
Apr 02 14:36:54 [31772] orana stonith-ng: info: cib_devices_update: Updating devices to version 0.22.0
Apr 02 14:36:54 [31776] orana crmd: debug: do_pe_invoke: Query 89: Requesting the current CIB: S_POLICY_ENGINE
Apr 02 14:36:54 [31771] orana cib: debug: qb_ipcs_dispatch_connection_request: HUP conn (31771-3053-14)
Apr 02 14:36:54 [31771] orana cib: debug: qb_ipcs_disconnect: qb_ipcs_disconnect(31771-3053-14) state:2
Apr 02 14:36:54 [31771] orana cib: debug: crm_client_destroy: Destroying 0 events
Apr 02 14:36:54 [31771] orana cib: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-31771-3053-14-header
Apr 02 14:36:54 [31772] orana stonith-ng: debug: unpack_config: STONITH timeout: 60000
Apr 02 14:36:54 [31772] orana stonith-ng: debug: unpack_config: STONITH of failed nodes is enabled
Apr 02 14:36:54 [31772] orana stonith-ng: debug: unpack_config: Concurrent fencing is disabled
Apr 02 14:36:54 [31772] orana stonith-ng: debug: unpack_config: Stop all active resources: false
Apr 02 14:36:54 [31771] orana cib: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-31771-3053-14-header
Apr 02 14:36:54 [31772] orana stonith-ng: debug: unpack_config: Cluster is symmetric - resources can run anywhere by default
Apr 02 14:36:54 [31772] orana stonith-ng: debug: unpack_config: Default stickiness: 0
Apr 02 14:36:54 [31772] orana stonith-ng: notice: unpack_config: On loss of CCM Quorum: Ignore
Apr 02 14:36:54 [31771] orana cib: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-31771-3053-14-header
Apr 02 14:36:54 [31772] orana stonith-ng: debug: unpack_config: Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Apr 02 14:36:54 [31772] orana stonith-ng: debug: native_rsc_location: Constraint (cli-prefer-unicloud-master) is not active (role : Master vs. Unknown)
Apr 02 14:36:54 [31772] orana stonith-ng: debug: native_rsc_location: Constraint (cli-prefer-unicloud-master) is not active (role : Master vs. Unknown)
Apr 02 14:36:54 [31772] orana stonith-ng: debug: native_rsc_location: Constraint (cli-prefer-unicloud-master) is not active (role : Master vs. Unknown)
Apr 02 14:36:54 [31772] orana stonith-ng: debug: cib_device_update: Device fence-uc-orana is allowed on orana: score=0
Apr 02 14:36:54 [31772] orana stonith-ng: info: stonith_device_register: Overwriting an existing entry for fence-uc-orana from the cib
Apr 02 14:36:54 [31772] orana stonith-ng: notice: stonith_device_register: Added 'fence-uc-orana' to the device list (2 active devices)
Apr 02 14:36:54 [31772] orana stonith-ng: debug: cib_device_update: Device fence-uc-tigana is allowed on orana: score=0
Apr 02 14:36:54 [31772] orana stonith-ng: debug: stonith_device_register: Device 'fence-uc-tigana' already existed in device list (2 active devices)
Apr 02 14:36:54 [31776] orana crmd: debug: do_pe_invoke_callback: Invoking the PE: query=89, ref=pe_calc-dc-1554196014-106, seq=780, quorate=1
Apr 02 14:36:54 [31775] orana pengine: debug: unpack_config: STONITH timeout: 60000
Apr 02 14:36:54 [31775] orana pengine: debug: unpack_config: STONITH of failed nodes is enabled
Apr 02 14:36:54 [31775] orana pengine: debug: unpack_config: Concurrent fencing is disabled
Apr 02 14:36:54 [31775] orana pengine: debug: unpack_config: Stop all active resources: false
Apr 02 14:36:54 [31775] orana pengine: debug: unpack_config: Cluster is symmetric - resources can run anywhere by default
Apr 02 14:36:54 [31775] orana pengine: debug: unpack_config: Default stickiness: 0
Apr 02 14:36:54 [31775] orana pengine: notice: unpack_config: On loss of CCM Quorum: Ignore
Apr 02 14:36:54 [31775] orana pengine: debug: unpack_config: Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Apr 02 14:36:54 [31775] orana pengine: info: determine_online_status_fencing: Node tigana is active
Apr 02 14:36:54 [31775] orana pengine: info: determine_online_status: Node tigana is online
Apr 02 14:36:54 [31775] orana pengine: info: determine_online_status_fencing: Node orana is active
Apr 02 14:36:54 [31775] orana pengine: info: determine_online_status: Node orana is online
Apr 02 14:36:54 [31775] orana pengine: debug: find_anonymous_clone: Internally renamed unicloud on tigana to unicloud:0
Apr 02 14:36:54 [31775] orana pengine: debug: find_anonymous_clone: Internally renamed unicloud on orana to unicloud:1
Apr 02 14:36:54 [31775] orana pengine: info: unpack_node_loop: Node 2 is already processed
Apr 02 14:36:54 [31775] orana pengine: info: unpack_node_loop: Node 1 is already processed
Apr 02 14:36:54 [31775] orana pengine: info: unpack_node_loop: Node 2 is already processed
Apr 02 14:36:54 [31775] orana pengine: info: unpack_node_loop: Node 1 is already processed
Apr 02 14:36:54 [31775] orana pengine: info: clone_print: Master/Slave Set: unicloud-master [unicloud]
Apr 02 14:36:54 [31775] orana pengine: debug: native_active: Resource unicloud:0 active on tigana
Apr 02 14:36:54 [31775] orana pengine: debug: native_active: Resource unicloud:0 active on tigana
Apr 02 14:36:54 [31775] orana pengine: debug: native_active: Resource unicloud:1 active on orana
Apr 02 14:36:54 [31775] orana pengine: debug: native_active: Resource unicloud:1 active on orana
Apr 02 14:36:54 [31775] orana pengine: info: short_print: Masters: [ orana ]
Apr 02 14:36:54 [31775] orana pengine: info: short_print: Slaves: [ tigana ]
Apr 02 14:36:54 [31775] orana pengine: info: common_print: fence-uc-orana (stonith:fence_ilo4): Started orana
Apr 02 14:36:54 [31775] orana pengine: info: common_print: fence-uc-tigana (stonith:fence_ilo4): Started orana
Apr 02 14:36:54 [31775] orana pengine: debug: native_rsc_location: Constraint (cli-prefer-unicloud-master) is not active (role : Master vs. Unknown)
Apr 02 14:36:54 [31775] orana pengine: debug: native_rsc_location: Constraint (cli-prefer-unicloud-master) is not active (role : Master vs. Unknown)
Apr 02 14:36:54 [31775] orana pengine: debug: native_rsc_location: Constraint (cli-prefer-unicloud-master) is not active (role : Master vs. Unknown)
Apr 02 14:36:54 [31775] orana pengine: debug: common_apply_stickiness: Resource unicloud:1: preferring current location (node=orana, weight=1)
Apr 02 14:36:54 [31775] orana pengine: debug: common_apply_stickiness: Resource unicloud:0: preferring current location (node=tigana, weight=1)
Apr 02 14:36:54 [31775] orana pengine: info: check_action_definition: params:reload <parameters login="parallel" passwd="wireless" pcmk_host_list="orana" delay="10" lanplus="1" pcmk_reboot_action="off" ipaddr="fd00:1061:37:9002::"/>
Apr 02 14:36:54 [31775] orana pengine: info: check_action_definition: Parameters to fence-uc-orana_start_0 on orana changed: was 7c0f1995201e41fc077992f7a33993d0 vs. now 6003bb66f1f096d6575040016bcf6afe (reload:3.0.12) 0:0;36:27:0:80fd8561-42be-4d70-865d-ae837d35e118
Apr 02 14:36:54 [31775] orana pengine: info: check_action_definition: params:reload <parameters login="parallel" passwd="wireless" pcmk_host_list="orana" delay="10" lanplus="1" pcmk_reboot_action="off" ipaddr="fd00:1061:37:9002::" CRM_meta_timeout="20000"/>
Apr 02 14:36:54 [31775] orana pengine: info: check_action_definition: Parameters to fence-uc-orana_monitor_5000 on orana changed: was e38788f05a76b689a248ba0f8c4790b0 vs. now d25c05ce4a56b387fe2d44e5104775af (reload:3.0.12) 0:0;37:27:0:80fd8561-42be-4d70-865d-ae837d35e118
Apr 02 14:36:54 [31775] orana pengine: debug: distribute_children: Allocating 2 unicloud-master instances to a possible 2 nodes (1 per host, 1 optimal)
Apr 02 14:36:54 [31775] orana pengine: debug: native_assign_node: Assigning orana to unicloud:1
Apr 02 14:36:54 [31775] orana pengine: debug: native_assign_node: Assigning tigana to unicloud:0
Apr 02 14:36:54 [31775] orana pengine: debug: distribute_children: Allocated 2 unicloud-master instances of a possible 2
Apr 02 14:36:54 [31775] orana pengine: debug: master_color: unicloud:1 master score: 1000000
Apr 02 14:36:54 [31775] orana pengine: info: master_color: Promoting unicloud:1 (Master orana)
Apr 02 14:36:54 [31775] orana pengine: debug: master_color: unicloud:0 master score: 5
Apr 02 14:36:54 [31775] orana pengine: info: master_color: unicloud-master: Promoted 1 instances of a possible 1 to master
Apr 02 14:36:54 [31775] orana pengine: debug: native_assign_node: Assigning orana to fence-uc-orana
Apr 02 14:36:54 [31775] orana pengine: debug: native_assign_node: Assigning orana to fence-uc-tigana
Apr 02 14:36:54 [31775] orana pengine: debug: master_create_actions: Creating actions for unicloud-master
Apr 02 14:36:54 [31775] orana pengine: info: RecurringOp: Start recurring monitor (5s) for fence-uc-orana on orana
Apr 02 14:36:54 [31775] orana pengine: info: LogActions: Leave unicloud:0 (Slave tigana)
Apr 02 14:36:54 [31775] orana pengine: info: LogActions: Leave unicloud:1 (Master orana)
Apr 02 14:36:54 [31775] orana pengine: notice: LogActions: Restart fence-uc-orana (Started orana)
Apr 02 14:36:54 [31775] orana pengine: info: LogActions: Leave fence-uc-tigana (Started orana)
Apr 02 14:36:54 [31775] orana pengine: debug: action2xml: Using anonymous clone name unicloud for unicloud:0 (aka. unicloud)
Apr 02 14:36:54 [31775] orana pengine: debug: action2xml: Using anonymous clone name unicloud for unicloud:0 (aka. unicloud)
Apr 02 14:36:54 [31771] orana cib: info: cib_file_backup: Archived previous version as /var/lib/pacemaker/cib/cib-22.raw
Apr 02 14:36:54 [31771] orana cib: debug: cib_file_write_with_digest: Writing CIB to disk
Apr 02 14:36:54 [31775] orana pengine: debug: action2xml: Using anonymous clone name unicloud for unicloud:1 (aka. unicloud)
Apr 02 14:36:54 [31775] orana pengine: debug: action2xml: Using anonymous clone name unicloud for unicloud:1 (aka. unicloud)
Apr 02 14:36:54 [31775] orana pengine: debug: action2xml: Using anonymous clone name unicloud for unicloud:1 (aka. unicloud)
Apr 02 14:36:54 [31775] orana pengine: notice: process_pe_message: Calculated transition 29, saving inputs in /var/lib/pacemaker/pengine/pe-input-2753.bz2
Apr 02 14:36:54 [31776] orana crmd: debug: s_crmd_fsa: Processing I_PE_SUCCESS: [ state=S_POLICY_ENGINE cause=C_IPC_MESSAGE origin=handle_response ]
Apr 02 14:36:54 [31776] orana crmd: info: do_state_transition: State transition S_POLICY_ENGINE -> S_TRANSITION_ENGINE | input=I_PE_SUCCESS cause=C_IPC_MESSAGE origin=handle_response
Apr 02 14:36:54 [31776] orana crmd: debug: unpack_graph: Unpacked transition 29: 15 actions in 15 synapses
Apr 02 14:36:54 [31776] orana crmd: info: do_te_invoke: Processing graph 29 (ref=pe_calc-dc-1554196014-106) derived from /var/lib/pacemaker/pengine/pe-input-2753.bz2
Apr 02 14:36:54 [31776] orana crmd: debug: te_pseudo_action: Pseudo-action 27 (unicloud-master_pre_notify_promote_0) fired and confirmed
Apr 02 14:36:54 [31776] orana crmd: notice: te_rsc_command: Initiating stop operation fence-uc-orana_stop_0 locally on orana | action 37
Apr 02 14:36:54 [31776] orana crmd: debug: stop_recurring_action_by_rsc: Cancelling op 34 for fence-uc-orana (fence-uc-orana:34)
Apr 02 14:36:54 [31776] orana crmd: debug: cancel_op: Cancelling op 34 for fence-uc-orana (fence-uc-orana:34)
Apr 02 14:36:54 [31773] orana lrmd: debug: log_finished: finished - rsc:fence-uc-orana action:monitor call_id:34 exit-code:0 exec-time:0ms queue-time:0ms
Apr 02 14:36:54 [31773] orana lrmd: debug: process_lrmd_message: Processed lrmd_rsc_cancel operation from dc289d09-c257-44cc-90bd-7d2ba98d5b4b: rc=0, reply=1, notify=0
Apr 02 14:36:54 [31776] orana crmd: debug: cancel_op: Op 34 for fence-uc-orana (fence-uc-orana:34): cancelled
Apr 02 14:36:54 [31776] orana crmd: info: do_lrm_rsc_op: Performing key=37:29:0:80fd8561-42be-4d70-865d-ae837d35e118 op=fence-uc-orana_stop_0
Apr 02 14:36:54 [31773] orana lrmd: debug: process_lrmd_message: Processed lrmd_rsc_exec operation from dc289d09-c257-44cc-90bd-7d2ba98d5b4b: rc=40, reply=1, notify=0
Apr 02 14:36:54 [31773] orana lrmd: info: log_execute: executing - rsc:fence-uc-orana action:stop call_id:40
Apr 02 14:36:54 [31776] orana crmd: debug: run_graph: Transition 29 (Complete=0, Pending=1, Fired=2, Skipped=0, Incomplete=13, Source=/var/lib/pacemaker/pengine/pe-input-2753.bz2): In-progress
Apr 02 14:36:54 [31776] orana crmd: info: process_lrm_event: Result of monitor operation for fence-uc-orana on orana: Cancelled | call=34 key=fence-uc-orana_monitor_5000 confirmed=true
Apr 02 14:36:54 [31776] orana crmd: debug: send_notifications: Sending 'resource' alert to '/var/lib/pacemaker/pw_alert.sh' via '/var/lib/pacemaker/pw_alert.sh'
Apr 02 14:36:54 [31772] orana stonith-ng: debug: stonith_command: Processing st_device_remove 15 from lrmd.31773 ( 1000)
Apr 02 14:36:54 [31772] orana stonith-ng: debug: stonith_command: Processed st_device_remove from lrmd.31773: OK (0)
Apr 02 14:36:54 [31773] orana lrmd: info: log_finished: finished - rsc:fence-uc-orana action:stop call_id:40 exit-code:0 exec-time:1ms queue-time:0ms
Apr 02 14:36:54 [31776] orana crmd: debug: update_history_cache: Updating history for 'fence-uc-orana' with monitor op
Apr 02 14:36:54 [31776] orana crmd: debug: create_operation_update: do_update_resource: Updating resource fence-uc-orana after stop op complete (interval=0)
Apr 02 14:36:54 [31776] orana crmd: notice: process_lrm_event: Result of stop operation for fence-uc-orana on orana: 0 (ok) | call=40 key=fence-uc-orana_stop_0 confirmed=true cib-update=90
Apr 02 14:36:54 [31776] orana crmd: debug: send_notifications: Sending 'resource' alert to '/var/lib/pacemaker/pw_alert.sh' via '/var/lib/pacemaker/pw_alert.sh'
Apr 02 14:36:54 [31771] orana cib: info: cib_process_request: Forwarding cib_modify operation for section status to all (origin=local/crmd/90)
Apr 02 14:36:54 [31776] orana crmd: debug: update_history_cache: Updating history for 'fence-uc-orana' with stop op
Apr 02 14:36:54 [31776] orana crmd: notice: te_rsc_command: Initiating notify operation unicloud_pre_notify_promote_0 on tigana | action 50
Apr 02 14:36:54 [31776] orana crmd: notice: te_rsc_command: Initiating notify operation unicloud_pre_notify_promote_0 locally on orana | action 52
Apr 02 14:36:54 [31771] orana cib: info: cib_perform_op: Diff: --- 0.22.0 2
Apr 02 14:36:54 [31771] orana cib: info: cib_perform_op: Diff: +++ 0.22.1 (null)
Apr 02 14:36:54 [31771] orana cib: info: cib_perform_op: + /cib: @num_updates=1
Apr 02 14:36:54 [31771] orana cib: info: cib_perform_op: + /cib/status/node_state[@id='1']/lrm[@id='1']/lrm_resources/lrm_resource[@id='fence-uc-orana']/lrm_rsc_op[@id='fence-uc-orana_last_0']: @operation_key=fence-uc-orana_stop_0, @operation=stop, @transition-key=37:29:0:80fd8561-42be-4d70-865d-ae837d35e118, @transition-magic=0:0;37:29:0:80fd8561-42be-4d70-865d-ae837d35e118, @call-id=40, @last-run=1554196014, @last-rc-change=1554196014, @exec-time=1
Apr 02 14:36:54 [31776] orana crmd: info: do_lrm_rsc_op: Performing key=52:29:0:80fd8561-42be-4d70-865d-ae837d35e118 op=unicloud_notify_0
Apr 02 14:36:54 [31771] orana cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=orana/crmd/90, version=0.22.1)
Apr 02 14:36:54 [31773] orana lrmd: debug: process_lrmd_message: Processed lrmd_rsc_exec operation from dc289d09-c257-44cc-90bd-7d2ba98d5b4b: rc=41, reply=1, notify=0
Apr 02 14:36:54 [31772] orana stonith-ng: debug: xml_patch_version_check: Can apply patch 0.22.1 to 0.22.0
Apr 02 14:36:54 [31773] orana lrmd: info: log_execute: executing - rsc:unicloud action:notify call_id:41
Apr 02 14:36:54 [31776] orana crmd: debug: run_graph: Transition 29 (Complete=1, Pending=3, Fired=2, Skipped=0, Incomplete=11, Source=/var/lib/pacemaker/pengine/pe-input-2753.bz2): In-progress
Apr 02 14:36:54 [31776] orana crmd: debug: te_update_diff: Processing (cib_modify) diff: 0.22.0 -> 0.22.1 (S_TRANSITION_ENGINE)
Apr 02 14:36:54 [31776] orana crmd: info: match_graph_event: Action fence-uc-orana_stop_0 (37) confirmed on orana (rc=0)
Apr 02 14:36:54 [31771] orana cib: info: cib_file_write_with_digest: Wrote version 0.22.0 of the CIB to disk (digest: 4fa5c04719a2e78200d53eda83a775d7)
Apr 02 14:36:54 [31776] orana crmd: notice: te_rsc_command: Initiating start operation fence-uc-orana_start_0 locally on orana | action 6 <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
Apr 02 14:36:54 [31776] orana crmd: info: do_lrm_rsc_op: Performing key=6:29:0:80fd8561-42be-4d70-865d-ae837d35e118 op=fence-uc-orana_start_0
Apr 02 14:36:54 [31773] orana lrmd: debug: process_lrmd_message: Processed lrmd_rsc_exec operation from dc289d09-c257-44cc-90bd-7d2ba98d5b4b: rc=42, reply=1, notify=0
Apr 02 14:36:54 [31773] orana lrmd: info: log_execute: executing - rsc:fence-uc-orana action:start call_id:42
Apr 02 14:36:54 [31776] orana crmd: debug: te_pseudo_action: Pseudo-action 5 (all_stopped) fired and confirmed
Apr 02 14:36:54 [31772] orana stonith-ng: debug: stonith_command: Processing st_device_register 16 from lrmd.31773 ( 1000)
Apr 02 14:36:54 [31776] orana crmd: debug: run_graph: Transition 29 (Complete=2, Pending=3, Fired=2, Skipped=0, Incomplete=9, Source=/var/lib/pacemaker/pengine/pe-input-2753.bz2): In-progress
Apr 02 14:36:54 [31776] orana crmd: debug: run_graph: Transition 29 (Complete=3, Pending=3, Fired=0, Skipped=0, Incomplete=9, Source=/var/lib/pacemaker/pengine/pe-input-2753.bz2): In-progress
Apr 02 14:36:54 [31772] orana stonith-ng: debug: stonith_device_register: Device 'fence-uc-orana' already existed in device list (2 active devices)
Apr 02 14:36:54 [31772] orana stonith-ng: debug: stonith_command: Processed st_device_register from lrmd.31773: OK (0)
Apr 02 14:36:54 [31772] orana stonith-ng: debug: stonith_command: Processing st_execute 17 from lrmd.31773 ( 0)
Apr 02 14:36:54 [31772] orana stonith-ng: debug: schedule_stonith_command: Scheduling monitor on fence-uc-orana for 5549509e-0dfb-4bf8-9066-8e64c2e7ae34 (timeout=20s)
Apr 02 14:36:54 [31772] orana stonith-ng: debug: stonith_command: Processed st_execute from lrmd.31773: Operation now in progress (-115)
Apr 02 14:36:54 [31772] orana stonith-ng: debug: stonith_action_create: Initiating action monitor for agent fence_ilo4 (target=(null))
Apr 02 14:36:54 [31772] orana stonith-ng: debug: internal_stonith_action_execute: forking
Apr 02 14:36:54 [31772] orana stonith-ng: debug: internal_stonith_action_execute: sending args
Apr 02 14:36:54 [31772] orana stonith-ng: debug: stonith_device_execute: Operation monitor on fence-uc-orana now running with pid=3063, timeout=20s
Apr 02 14:36:54 [31771] orana cib: debug: cib_file_write_with_digest: Wrote digest 4fa5c04719a2e78200d53eda83a775d7 to disk
Apr 02 14:36:54 [31771] orana cib: info: cib_file_write_with_digest: Reading cluster configuration file /var/lib/pacemaker/cib/cib.JQwnqX (digest: /var/lib/pacemaker/cib/cib.6jhxaH)
Apr 02 14:36:54 [31771] orana cib: debug: cib_file_write_with_digest: Activating /var/lib/pacemaker/cib/cib.JQwnqX
Apr 02 14:36:54 [3072] orana crm_node: info: crm_xml_cleanup: Cleaning up memory from libxml2
Apr 02 14:36:54 [31776] orana crmd: debug: operation_finished: pw_pcsd_alert_id:3056 - exited with rc=0
Apr 02 14:36:54 [31776] orana crmd: debug: operation_finished: pw_pcsd_alert_id:3056:stderr [ -- empty -- ]
Apr 02 14:36:54 [31776] orana crmd: debug: operation_finished: pw_pcsd_alert_id:3056:stdout [ -- empty -- ]
Apr 02 14:36:54 [31776] orana crmd: info: crmd_notify_complete: Alert 25 (/var/lib/pacemaker/pw_alert.sh) complete
Apr 02 14:36:54 [31776] orana crmd: debug: child_waitpid: wait(3057) = 0: Success (0)
Apr 02 14:36:54 [31776] orana crmd: debug: operation_finished: pw_pcsd_alert_id:3057 - exited with rc=0
Apr 02 14:36:54 [31776] orana crmd: debug: operation_finished: pw_pcsd_alert_id:3057:stderr [ -- empty -- ]
Apr 02 14:36:54 [31776] orana crmd: debug: operation_finished: pw_pcsd_alert_id:3057:stdout [ -- empty -- ]
Apr 02 14:36:54 [31776] orana crmd: info: crmd_notify_complete: Alert 26 (/var/lib/pacemaker/pw_alert.sh) complete
Apr 02 14:36:54 [31776] orana crmd: debug: process_te_message: Processing (N)ACK lrm_invoke-lrmd-1554196014-19 from tigana
Apr 02 14:36:54 [31776] orana crmd: info: match_graph_event: Action unicloud_notify_0 (50) confirmed on tigana (rc=0)
Apr 02 14:36:54 [31776] orana crmd: debug: run_graph: Transition 29 (Complete=4, Pending=2, Fired=0, Skipped=0, Incomplete=9, Source=/var/lib/pacemaker/pengine/pe-input-2753.bz2): In-progress
uc(unicloud)[3058]: Apr 02 14:36:54 DEBUG: Received pre-promote notification.
Apr 02 14:36:54 uc(unicloud)[3058]: INFO: [UC] Tue Apr 2 14:36:54 IST 2019 NOTIFY pre-promote master=orana promote=orana
Apr 02 14:36:54 [31773] orana lrmd: debug: operation_finished: unicloud_notify_0:3058 - exited with rc=0
Apr 02 14:36:54 [31773] orana lrmd: debug: operation_finished: unicloud_notify_0:3058:stderr [ -- empty -- ]
Apr 02 14:36:54 [31773] orana lrmd: debug: operation_finished: unicloud_notify_0:3058:stdout [ -- empty -- ]
Apr 02 14:36:54 [31773] orana lrmd: info: log_finished: finished - rsc:unicloud action:notify call_id:41 pid:3058 exit-code:0 exec-time:70ms queue-time:0ms
Apr 02 14:36:54 [31776] orana crmd: debug: create_operation_update: send_direct_ack: Updating resource unicloud after notify op complete (interval=0)
Apr 02 14:36:54 [31776] orana crmd: debug: send_direct_ack: ACK'ing resource op unicloud_notify_0 from 52:29:0:80fd8561-42be-4d70-865d-ae837d35e118: lrm_invoke-lrmd-1554196014-111
Apr 02 14:36:54 [31776] orana crmd: debug: process_te_message: Processing (N)ACK lrm_invoke-lrmd-1554196014-111 from orana
Apr 02 14:36:54 [31776] orana crmd: info: match_graph_event: Action unicloud_notify_0 (52) confirmed on orana (rc=0)
Apr 02 14:36:54 [31776] orana crmd: notice: process_lrm_event: Result of notify operation for unicloud on orana: 0 (ok) | call=41 key=unicloud_notify_0 confirmed=true cib-update=0
Apr 02 14:36:54 [31776] orana crmd: debug: send_notifications: Sending 'resource' alert to '/var/lib/pacemaker/pw_alert.sh' via '/var/lib/pacemaker/pw_alert.sh'
Apr 02 14:36:54 [31776] orana crmd: debug: te_pseudo_action: Pseudo-action 28 (unicloud-master_confirmed-pre_notify_promote_0) fired and confirmed
Apr 02 14:36:54 [31776] orana crmd: debug: run_graph: Transition 29 (Complete=5, Pending=1, Fired=1, Skipped=0, Incomplete=8, Source=/var/lib/pacemaker/pengine/pe-input-2753.bz2): In-progress
Apr 02 14:36:54 [31776] orana crmd: debug: run_graph: Transition 29 (Complete=6, Pending=1, Fired=0, Skipped=0, Incomplete=8, Source=/var/lib/pacemaker/pengine/pe-input-2753.bz2): In-progress
Apr 02 14:36:54 [31776] orana crmd: debug: operation_finished: pw_pcsd_alert_id:3104 - exited with rc=0
Apr 02 14:36:54 [31776] orana crmd: debug: operation_finished: pw_pcsd_alert_id:3104:stderr [ -- empty -- ]
Apr 02 14:36:54 [31776] orana crmd: debug: operation_finished: pw_pcsd_alert_id:3104:stdout [ -- empty -- ]
Apr 02 14:36:54 [31776] orana crmd: info: crmd_notify_complete: Alert 27 (/var/lib/pacemaker/pw_alert.sh) complete
Apr 02 14:36:54 [31771] orana cib: debug: crm_client_new: Connecting 0x562de70456b0 for uid=0 gid=0 pid=3118 id=14dba87a-215a-46aa-99d4-51c8d2c7c118
Apr 02 14:36:54 [31771] orana cib: debug: handle_new_connection: IPC credentials authenticated (31771-3118-14)
Apr 02 14:36:54 [31771] orana cib: debug: qb_ipcs_shm_connect: connecting to client [3118]
Apr 02 14:36:54 [31771] orana cib: debug: qb_rb_open_2: shm size:524301; real_size:528384; rb->word_size:132096
Apr 02 14:36:54 [31771] orana cib: debug: qb_rb_open_2: shm size:524301; real_size:528384; rb->word_size:132096
Apr 02 14:36:54 [31771] orana cib: debug: qb_rb_open_2: shm size:524301; real_size:528384; rb->word_size:132096
Apr 02 14:36:54 [31771] orana cib: debug: qb_ipcs_dispatch_connection_request: HUP conn (31771-3118-14)
Apr 02 14:36:54 [31771] orana cib: debug: qb_ipcs_disconnect: qb_ipcs_disconnect(31771-3118-14) state:2
Apr 02 14:36:54 [31771] orana cib: debug: crm_client_destroy: Destroying 0 events
Apr 02 14:36:54 [31771] orana cib: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-31771-3118-14-header
Apr 02 14:36:54 [31771] orana cib: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-31771-3118-14-header
Apr 02 14:36:54 [31771] orana cib: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-31771-3118-14-header
Apr 02 14:36:54 [31773] orana lrmd: debug: log_execute: executing - rsc:fence-uc-tigana action:monitor call_id:35
Apr 02 14:36:54 [31772] orana stonith-ng: debug: stonith_command: Processing st_execute 18 from lrmd.31773 ( 0)
Apr 02 14:36:54 [31772] orana stonith-ng: debug: schedule_stonith_command: Scheduling monitor on fence-uc-tigana for 5549509e-0dfb-4bf8-9066-8e64c2e7ae34 (timeout=20s)
Apr 02 14:36:54 [31772] orana stonith-ng: debug: stonith_command: Processed st_execute from lrmd.31773: Operation now in progress (-115)
Apr 02 14:36:54 [31772] orana stonith-ng: debug: stonith_action_create: Initiating action monitor for agent fence_ilo4 (target=(null))
Apr 02 14:36:54 [31772] orana stonith-ng: debug: internal_stonith_action_execute: forking
Apr 02 14:36:54 [31772] orana stonith-ng: debug: internal_stonith_action_execute: sending args
Apr 02 14:36:54 [31772] orana stonith-ng: debug: stonith_device_execute: Operation monitor on fence-uc-tigana now running with pid=3125, timeout=20s
Apr 02 14:36:55 [31771] orana cib: debug: crm_client_new: Connecting 0x562de70456b0 for uid=0 gid=0 pid=3133 id=ca298340-579c-45f6-9ecd-2e41a9a76986
Apr 02 14:36:55 [31771] orana cib: debug: handle_new_connection: IPC credentials authenticated (31771-3133-14)
Apr 02 14:36:55 [31771] orana cib: debug: qb_ipcs_shm_connect: connecting to client [3133]
Apr 02 14:36:55 [31771] orana cib: debug: qb_rb_open_2: shm size:524301; real_size:528384; rb->word_size:132096
Apr 02 14:36:55 [31771] orana cib: debug: qb_rb_open_2: shm size:524301; real_size:528384; rb->word_size:132096
Apr 02 14:36:55 [31771] orana cib: debug: qb_rb_open_2: shm size:524301; real_size:528384; rb->word_size:132096
Apr 02 14:36:55 [31771] orana cib: info: cib_process_request: Forwarding cib_replace operation for section configuration to all (origin=local/cibadmin/2)
Apr 02 14:36:55 [31771] orana cib: info: cib_perform_op: Diff: --- 0.22.1 2
Apr 02 14:36:55 [31771] orana cib: info: cib_perform_op: Diff: +++ 0.23.0 (null)
Apr 02 14:36:55 [31771] orana cib: info: cib_perform_op: + /cib: @epoch=23, @num_updates=0
Apr 02 14:36:55 [31771] orana cib: info: cib_perform_op: + /cib/configuration/resources/primitive[@id='fence-uc-tigana']/instance_attributes[@id='fence-uc-tigana-instance_attributes']/nvpair[@id='fence-uc-tigana-instance_attributes-delay']: @value=0
Apr 02 14:36:55 [31771] orana cib: debug: activateCibXml: Triggering CIB write for cib_replace op
Apr 02 14:36:55 [31771] orana cib: info: cib_process_request: Completed cib_replace operation for section configuration: OK (rc=0, origin=orana/cibadmin/2, version=0.23.0)
Apr 02 14:36:55 [31776] orana crmd: debug: te_update_diff: Processing (cib_replace) diff: 0.22.1 -> 0.23.0 (S_TRANSITION_ENGINE)
Apr 02 14:36:55 [31772] orana stonith-ng: debug: xml_patch_version_check: Can apply patch 0.23.0 to 0.22.1
Apr 02 14:36:55 [31776] orana crmd: debug: update_abort_priority: Abort priority upgraded from 0 to 1000000
Apr 02 14:36:55 [31776] orana crmd: debug: update_abort_priority: Abort action done superseded by restart: Configuration change
Apr 02 14:36:55 [31776] orana crmd: notice: abort_transition_graph: Transition aborted by fence-uc-tigana-instance_attributes-delay doing modify delay=0: Configuration change | cib=0.23.0 source=te_update_diff:456 path=/cib/configuration/resources/primitive[@id='fence-uc-tigana']/instance_attributes[@id='fence-uc-tigana-instance_attributes']/nvpair[@id='fence-uc-tigana-instance_attributes-delay'] complete=false
Apr 02 14:36:55 [31776] orana crmd: debug: run_graph: Transition 29 (Complete=6, Pending=1, Fired=0, Skipped=0, Incomplete=8, Source=/var/lib/pacemaker/pengine/pe-input-2753.bz2): In-progress
Apr 02 14:36:55 [31772] orana stonith-ng: info: update_cib_stonith_devices_v2: Updating device list from the cib: modify nvpair[@id='fence-uc-tigana-instance_attributes-delay']
Apr 02 14:36:55 [31772] orana stonith-ng: info: cib_devices_update: Updating devices to version 0.23.0
Apr 02 14:36:55 [31772] orana stonith-ng: debug: unpack_config: STONITH timeout: 60000
Apr 02 14:36:55 [31771] orana cib: debug: qb_ipcs_dispatch_connection_request: HUP conn (31771-3133-14)
Apr 02 14:36:55 [31772] orana stonith-ng: debug: unpack_config: STONITH of failed nodes is enabled
Apr 02 14:36:55 [31771] orana cib: debug: qb_ipcs_disconnect: qb_ipcs_disconnect(31771-3133-14) state:2
Apr 02 14:36:55 [31772] orana stonith-ng: debug: unpack_config: Concurrent fencing is disabled
Apr 02 14:36:55 [31771] orana cib: debug: crm_client_destroy: Destroying 0 events
Apr 02 14:36:55 [31772] orana stonith-ng: debug: unpack_config: Stop all active resources: false
Apr 02 14:36:55 [31772] orana stonith-ng: debug: unpack_config: Cluster is symmetric - resources can run anywhere by default
Apr 02 14:36:55 [31771] orana cib: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-31771-3133-14-header
Apr 02 14:36:55 [31772] orana stonith-ng: debug: unpack_config: Default stickiness: 0
Apr 02 14:36:55 [31772] orana stonith-ng: notice: unpack_config: On loss of CCM Quorum: Ignore
Apr 02 14:36:55 [31772] orana stonith-ng: debug: unpack_config: Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Apr 02 14:36:55 [31771] orana cib: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-31771-3133-14-header
Apr 02 14:36:55 [31771] orana cib: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-31771-3133-14-header
Apr 02 14:36:55 [31772] orana stonith-ng: debug: native_rsc_location: Constraint (cli-prefer-unicloud-master) is not active (role : Master vs. Unknown)
Apr 02 14:36:55 [31772] orana stonith-ng: debug: native_rsc_location: Constraint (cli-prefer-unicloud-master) is not active (role : Master vs. Unknown)
Apr 02 14:36:55 [31772] orana stonith-ng: debug: native_rsc_location: Constraint (cli-prefer-unicloud-master) is not active (role : Master vs. Unknown)
Apr 02 14:36:55 [31772] orana stonith-ng: debug: cib_device_update: Device fence-uc-orana is allowed on orana: score=0
Apr 02 14:36:55 [31772] orana stonith-ng: debug: stonith_device_register: Device 'fence-uc-orana' already existed in device list (2 active devices)
Apr 02 14:36:55 [31772] orana stonith-ng: debug: cib_device_update: Device fence-uc-tigana is allowed on orana: score=0
Apr 02 14:36:55 [31772] orana stonith-ng: info: stonith_device_register: Overwriting an existing entry for fence-uc-tigana from the cib
Apr 02 14:36:55 [31772] orana stonith-ng: notice: stonith_device_register: Added 'fence-uc-tigana' to the device list (2 active devices)
Apr 02 14:36:55 [31771] orana cib: info: cib_file_backup: Archived previous version as /var/lib/pacemaker/cib/cib-23.raw
Apr 02 14:36:55 [31771] orana cib: debug: cib_file_write_with_digest: Writing CIB to disk
Apr 02 14:36:55 [31771] orana cib: info: cib_file_write_with_digest: Wrote version 0.23.0 of the CIB to disk (digest: 8ab1c5f93c56862cce3f55eea68180e3)
Apr 02 14:36:55 [31771] orana cib: debug: cib_file_write_with_digest: Wrote digest 8ab1c5f93c56862cce3f55eea68180e3 to disk
Apr 02 14:36:55 [31771] orana cib: info: cib_file_write_with_digest: Reading cluster configuration file /var/lib/pacemaker/cib/cib.n1dmAm (digest: /var/lib/pacemaker/cib/cib.SdBYkb)
Apr 02 14:36:55 [31771] orana cib: debug: cib_file_write_with_digest: Activating /var/lib/pacemaker/cib/cib.n1dmAm
Apr 02 14:36:55 [31771] orana cib: debug: crm_client_new: Connecting 0x562de6f3b9c0 for uid=0 gid=0 pid=3154 id=bc38848c-8ddd-4906-973d-df8e6f9c9184
Apr 02 14:36:55 [31771] orana cib: debug: handle_new_connection: IPC credentials authenticated (31771-3154-14)
Apr 02 14:36:55 [31771] orana cib: debug: qb_ipcs_shm_connect: connecting to client [3154]
Apr 02 14:36:55 [31771] orana cib: debug: qb_rb_open_2: shm size:524301; real_size:528384; rb->word_size:132096
Apr 02 14:36:55 [31771] orana cib: debug: qb_rb_open_2: shm size:524301; real_size:528384; rb->word_size:132096
Apr 02 14:36:55 [31771] orana cib: debug: qb_rb_open_2: shm size:524301; real_size:528384; rb->word_size:132096
Apr 02 14:36:55 [31771] orana cib: debug: cib_process_xpath: Processing cib_query op for //constraints with /cib/configuration/constraints
Apr 02 14:36:55 [31771] orana cib: debug: qb_ipcs_dispatch_connection_request: HUP conn (31771-3154-14)
Apr 02 14:36:55 [31771] orana cib: debug: qb_ipcs_disconnect: qb_ipcs_disconnect(31771-3154-14) state:2
Apr 02 14:36:55 [31771] orana cib: debug: crm_client_destroy: Destroying 0 events
Apr 02 14:36:55 [31771] orana cib: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-31771-3154-14-header
Apr 02 14:36:55 [31771] orana cib: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-31771-3154-14-header
Apr 02 14:36:55 [31771] orana cib: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-31771-3154-14-header
Apr 02 14:36:55 [31771] orana cib: debug: crm_client_new: Connecting 0x562de6f3b9c0 for uid=0 gid=0 pid=3155 id=45b75fd5-1e01-44ee-8314-84352568f911
Apr 02 14:36:55 [31771] orana cib: debug: handle_new_connection: IPC credentials authenticated (31771-3155-14)
Apr 02 14:36:55 [31771] orana cib: debug: qb_ipcs_shm_connect: connecting to client [3155]
Apr 02 14:36:55 [31771] orana cib: debug: qb_rb_open_2: shm size:524301; real_size:528384; rb->word_size:132096
Apr 02 14:36:55 [31771] orana cib: debug: qb_rb_open_2: shm size:524301; real_size:528384; rb->word_size:132096
Apr 02 14:36:55 [31771] orana cib: debug: qb_rb_open_2: shm size:524301; real_size:528384; rb->word_size:132096
Apr 02 14:36:55 [31771] orana cib: info: cib_process_request: Forwarding cib_replace operation for section configuration to all (origin=local/cibadmin/2)
Apr 02 14:36:55 [31771] orana cib: info: cib_perform_op: Diff: --- 0.23.0 2
Apr 02 14:36:55 [31771] orana cib: info: cib_perform_op: Diff: +++ 0.24.0 (null)
Apr 02 14:36:55 [31771] orana cib: info: cib_perform_op: -- /cib/configuration/constraints/rsc_location[@id='cli-prefer-unicloud-master']
Apr 02 14:36:55 [31771] orana cib: info: cib_perform_op: + /cib: @epoch=24
Apr 02 14:36:55 [31771] orana cib: debug: activateCibXml: Triggering CIB write for cib_replace op
Apr 02 14:36:55 [31771] orana cib: info: cib_process_request: Completed cib_replace operation for section configuration: OK (rc=0, origin=orana/cibadmin/2, version=0.24.0)
Apr 02 14:36:55 [31776] orana crmd: debug: te_update_diff: Processing (cib_replace) diff: 0.23.0 -> 0.24.0 (S_TRANSITION_ENGINE)
Apr 02 14:36:55 [31776] orana crmd: info: abort_transition_graph: Transition aborted by deletion of rsc_location[@id='cli-prefer-unicloud-master']: Configuration change | cib=0.24.0 source=te_update_diff:456 path=/cib/configuration/constraints/rsc_location[@id='cli-prefer-unicloud-master'] complete=false
Apr 02 14:36:55 [31776] orana crmd: debug: run_graph: Transition 29 (Complete=6, Pending=1, Fired=0, Skipped=0, Incomplete=8, Source=/var/lib/pacemaker/pengine/pe-input-2753.bz2): In-progress
Apr 02 14:36:55 [31772] orana stonith-ng: debug: xml_patch_version_check: Can apply patch 0.24.0 to 0.23.0
Apr 02 14:36:55 [31772] orana stonith-ng: info: update_cib_stonith_devices_v2: Updating device list from the cib: delete rsc_location[@id='cli-prefer-unicloud-master']
Apr 02 14:36:55 [31772] orana stonith-ng: info: cib_devices_update: Updating devices to version 0.24.0
Apr 02 14:36:55 [31771] orana cib: debug: qb_ipcs_dispatch_connection_request: HUP conn (31771-3155-14)
Apr 02 14:36:55 [31771] orana cib: debug: qb_ipcs_disconnect: qb_ipcs_disconnect(31771-3155-14) state:2
Apr 02 14:36:55 [31771] orana cib: debug: crm_client_destroy: Destroying 0 events
Apr 02 14:36:55 [31771] orana cib: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-31771-3155-14-header
Apr 02 14:36:55 [31771] orana cib: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-31771-3155-14-header
Apr 02 14:36:55 [31772] orana stonith-ng: debug: unpack_config: STONITH timeout: 60000
Apr 02 14:36:55 [31772] orana stonith-ng: debug: unpack_config: STONITH of failed nodes is enabled
Apr 02 14:36:55 [31771] orana cib: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-31771-3155-14-header
Apr 02 14:36:55 [31772] orana stonith-ng: debug: unpack_config: Concurrent fencing is disabled
Apr 02 14:36:55 [31772] orana stonith-ng: debug: unpack_config: Stop all active resources: false
Apr 02 14:36:55 [31772] orana stonith-ng: debug: unpack_config: Cluster is symmetric - resources can run anywhere by default
Apr 02 14:36:55 [31772] orana stonith-ng: debug: unpack_config: Default stickiness: 0
Apr 02 14:36:55 [31772] orana stonith-ng: notice: unpack_config: On loss of CCM Quorum: Ignore
Apr 02 14:36:55 [31772] orana stonith-ng: debug: unpack_config: Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Apr 02 14:36:55 [31772] orana stonith-ng: debug: cib_device_update: Device fence-uc-orana is allowed on orana: score=0
Apr 02 14:36:55 [31772] orana stonith-ng: debug: stonith_device_register: Device 'fence-uc-orana' already existed in device list (2 active devices)
Apr 02 14:36:55 [31772] orana stonith-ng: debug: cib_device_update: Device fence-uc-tigana is allowed on orana: score=0
Apr 02 14:36:55 [31772] orana stonith-ng: debug: stonith_device_register: Device 'fence-uc-tigana' already existed in device list (2 active devices)
Apr 02 14:36:55 [31771] orana cib: info: cib_file_backup: Archived previous version as /var/lib/pacemaker/cib/cib-24.raw
Apr 02 14:36:55 [31771] orana cib: debug: cib_file_write_with_digest: Writing CIB to disk
Apr 02 14:36:55 [31771] orana cib: info: cib_file_write_with_digest: Wrote version 0.24.0 of the CIB to disk (digest: 2ab6f104b6793dd70a926b0eb170da46)
Apr 02 14:36:55 [31771] orana cib: debug: cib_file_write_with_digest: Wrote digest 2ab6f104b6793dd70a926b0eb170da46 to disk
Apr 02 14:36:55 [31771] orana cib: info: cib_file_write_with_digest: Reading cluster configuration file /var/lib/pacemaker/cib/cib.T0mnF0 (digest: /var/lib/pacemaker/cib/cib.N43LtR)
Apr 02 14:36:55 [31771] orana cib: debug: cib_file_write_with_digest: Activating /var/lib/pacemaker/cib/cib.T0mnF0
Apr 02 14:37:00 [31771] orana cib: info: cib_process_ping: Reporting our current digest to orana: 8ef501abd0f1cd229c11fb25d0439690 for 0.24.0 (0x562de6fdf2d0 0)
Apr 02 14:37:01 [31773] orana lrmd: debug: recurring_action_timer: Scheduling another invocation of unicloud_monitor_10000
Apr 02 14:37:01 [3178] orana crm_node: info: crm_xml_cleanup: Cleaning up memory from libxml2
Apr 02 14:37:01 [31773] orana lrmd: debug: operation_finished: unicloud_monitor_10000:3175 - exited with rc=8
Apr 02 14:37:01 [31773] orana lrmd: debug: operation_finished: unicloud_monitor_10000:3175:stderr [ -- empty -- ]
Apr 02 14:37:01 [31773] orana lrmd: debug: operation_finished: unicloud_monitor_10000:3175:stdout [ -- empty -- ]
Apr 02 14:37:01 [31773] orana lrmd: debug: log_finished: finished - rsc:unicloud action:monitor call_id:38 pid:3175 exit-code:8 exec-time:0ms queue-time:0ms
Apr 02 14:37:01 [31776] orana crmd: debug: throttle_cib_load: cib load: 0.003000 (9 ticks in 30s)
Apr 02 14:37:01 [31776] orana crmd: debug: throttle_mode: Current load is 1.060000 across 8 core(s)
Apr 02 14:37:10 [31771] orana cib: debug: crm_client_new: Connecting 0x562de6f3b9c0 for uid=0 gid=0 pid=3206 id=e5922931-9f20-4063-8ec3-20b23b124f3a
Apr 02 14:37:10 [31771] orana cib: debug: handle_new_connection: IPC credentials authenticated (31771-3206-14)
Apr 02 14:37:10 [31771] orana cib: debug: qb_ipcs_shm_connect: connecting to client [3206]
Apr 02 14:37:10 [31771] orana cib: debug: qb_rb_open_2: shm size:524301; real_size:528384; rb->word_size:132096
Apr 02 14:37:10 [31771] orana cib: debug: qb_rb_open_2: shm size:524301; real_size:528384; rb->word_size:132096
Apr 02 14:37:10 [31771] orana cib: debug: qb_rb_open_2: shm size:524301; real_size:528384; rb->word_size:132096
Apr 02 14:37:10 [31771] orana cib: debug: qb_ipcs_dispatch_connection_request: HUP conn (31771-3206-14)
Apr 02 14:37:10 [31771] orana cib: debug: qb_ipcs_disconnect: qb_ipcs_disconnect(31771-3206-14) state:2
Apr 02 14:37:10 [31771] orana cib: debug: crm_client_destroy: Destroying 0 events
Apr 02 14:37:10 [31771] orana cib: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-cib_ro-response-31771-3206-14-header
Apr 02 14:37:10 [31771] orana cib: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-cib_ro-event-31771-3206-14-header
Apr 02 14:37:10 [31771] orana cib: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-cib_ro-request-31771-3206-14-header
Apr 02 14:37:10 [31771] orana cib: debug: crm_client_new: Connecting 0x562de6f3b9c0 for uid=0 gid=0 pid=3208 id=3a2ac7e0-e255-4a96-b955-48ca0a26f514
Apr 02 14:37:10 [31771] orana cib: debug: handle_new_connection: IPC credentials authenticated (31771-3208-14)
Apr 02 14:37:10 [31771] orana cib: debug: qb_ipcs_shm_connect: connecting to client [3208]
Apr 02 14:37:10 [31771] orana cib: debug: qb_rb_open_2: shm size:524301; real_size:528384; rb->word_size:132096
Apr 02 14:37:10 [31771] orana cib: debug: qb_rb_open_2: shm size:524301; real_size:528384; rb->word_size:132096
Apr 02 14:37:10 [31771] orana cib: debug: qb_rb_open_2: shm size:524301; real_size:528384; rb->word_size:132096
Apr 02 14:37:10 [31771] orana cib: debug: qb_ipcs_dispatch_connection_request: HUP conn (31771-3208-14)
Apr 02 14:37:10 [31771] orana cib: debug: qb_ipcs_disconnect: qb_ipcs_disconnect(31771-3208-14) state:2
Apr 02 14:37:10 [31771] orana cib: debug: crm_client_destroy: Destroying 0 events
Apr 02 14:37:10 [31771] orana cib: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-31771-3208-14-header
Apr 02 14:37:10 [31771] orana cib: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-31771-3208-14-header
Apr 02 14:37:10 [31771] orana cib: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-31771-3208-14-header
Apr 02 14:37:10 [31755] orana pacemakerd: debug: crm_client_new: Connecting 0x5622eb12de00 for uid=0 gid=0 pid=3209 id=229225c3-dfc6-43e1-a798-6cdaf72c8471
Apr 02 14:37:10 [31755] orana pacemakerd: debug: handle_new_connection: IPC credentials authenticated (31755-3209-11)
Apr 02 14:37:10 [31755] orana pacemakerd: debug: qb_ipcs_shm_connect: connecting to client [3209]
Apr 02 14:37:10 [31755] orana pacemakerd: debug: qb_rb_open_2: shm size:131085; real_size:135168; rb->word_size:33792
Apr 02 14:37:10 [31755] orana pacemakerd: debug: qb_rb_open_2: shm size:131085; real_size:135168; rb->word_size:33792
Apr 02 14:37:10 [31755] orana pacemakerd: debug: qb_rb_open_2: shm size:131085; real_size:135168; rb->word_size:33792
Apr 02 14:37:10 [31755] orana pacemakerd: debug: qb_ipcs_dispatch_connection_request: HUP conn (31755-3209-11)
Apr 02 14:37:10 [31755] orana pacemakerd: debug: qb_ipcs_disconnect: qb_ipcs_disconnect(31755-3209-11) state:2
Apr 02 14:37:10 [31755] orana pacemakerd: debug: crm_client_destroy: Destroying 0 events
Apr 02 14:37:10 [31755] orana pacemakerd: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-pacemakerd-response-31755-3209-11-header
Apr 02 14:37:10 [31755] orana pacemakerd: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-pacemakerd-event-31755-3209-11-header
Apr 02 14:37:10 [31755] orana pacemakerd: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-pacemakerd-request-31755-3209-11-header
Apr 02 14:37:11 [31773] orana lrmd: debug: recurring_action_timer: Scheduling another invocation of unicloud_monitor_10000
Apr 02 14:37:11 [3226] orana crm_node: info: crm_xml_cleanup: Cleaning up memory from libxml2
Apr 02 14:37:11 [31773] orana lrmd: debug: operation_finished: unicloud_monitor_10000:3223 - exited with rc=8
Apr 02 14:37:11 [31773] orana lrmd: debug: operation_finished: unicloud_monitor_10000:3223:stderr [ -- empty -- ]
Apr 02 14:37:11 [31773] orana lrmd: debug: operation_finished: unicloud_monitor_10000:3223:stdout [ -- empty -- ]
Apr 02 14:37:11 [31773] orana lrmd: debug: log_finished: finished - rsc:unicloud action:monitor call_id:38 pid:3223 exit-code:8 exec-time:0ms queue-time:0ms
Apr 02 14:37:12 [31771] orana cib: debug: crm_client_new: Connecting 0x562de6f3b9c0 for uid=0 gid=0 pid=3242 id=5be6804d-913e-4a7c-968e-73ced1ee08d3
Apr 02 14:37:12 [31771] orana cib: debug: handle_new_connection: IPC credentials authenticated (31771-3242-14)
Apr 02 14:37:12 [31771] orana cib: debug: qb_ipcs_shm_connect: connecting to client [3242]
Apr 02 14:37:12 [31771] orana cib: debug: qb_rb_open_2: shm size:524301; real_size:528384; rb->word_size:132096
Apr 02 14:37:12 [31771] orana cib: debug: qb_rb_open_2: shm size:524301; real_size:528384; rb->word_size:132096
Apr 02 14:37:12 [31771] orana cib: debug: qb_rb_open_2: shm size:524301; real_size:528384; rb->word_size:132096
Apr 02 14:37:12 [31771] orana cib: debug: qb_ipcs_dispatch_connection_request: HUP conn (31771-3242-14)
Apr 02 14:37:12 [31771] orana cib: debug: qb_ipcs_disconnect: qb_ipcs_disconnect(31771-3242-14) state:2
Apr 02 14:37:12 [31771] orana cib: debug: crm_client_destroy: Destroying 0 events
Apr 02 14:37:12 [31771] orana cib: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-cib_ro-response-31771-3242-14-header
Apr 02 14:37:12 [31771] orana cib: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-cib_ro-event-31771-3242-14-header
Apr 02 14:37:12 [31771] orana cib: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-cib_ro-request-31771-3242-14-header
Apr 02 14:37:12 [31771] orana cib: debug: crm_client_new: Connecting 0x562de6f3b9c0 for uid=0 gid=0 pid=3244 id=cda94fc1-cd06-4a4f-9bd5-8228c0e9277d
Apr 02 14:37:12 [31771] orana cib: debug: handle_new_connection: IPC credentials authenticated (31771-3244-14)
Apr 02 14:37:12 [31771] orana cib: debug: qb_ipcs_shm_connect: connecting to client [3244]
Apr 02 14:37:12 [31771] orana cib: debug: qb_rb_open_2: shm size:524301; real_size:528384; rb->word_size:132096
Apr 02 14:37:12 [31771] orana cib: debug: qb_rb_open_2: shm size:524301; real_size:528384; rb->word_size:132096
Apr 02 14:37:12 [31771] orana cib: debug: qb_rb_open_2: shm size:524301; real_size:528384; rb->word_size:132096
Apr 02 14:37:12 [31771] orana cib: debug: qb_ipcs_dispatch_connection_request: HUP conn (31771-3244-14)
Apr 02 14:37:12 [31771] orana cib: debug: qb_ipcs_disconnect: qb_ipcs_disconnect(31771-3244-14) state:2
Apr 02 14:37:12 [31771] orana cib: debug: crm_client_destroy: Destroying 0 events
Apr 02 14:37:12 [31771] orana cib: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-31771-3244-14-header
Apr 02 14:37:12 [31771] orana cib: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-31771-3244-14-header
Apr 02 14:37:12 [31771] orana cib: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-31771-3244-14-header
Apr 02 14:37:12 [31755] orana pacemakerd: debug: crm_client_new: Connecting 0x5622eb12de00 for uid=0 gid=0 pid=3245 id=421171cb-c4c1-411b-8a30-7feafce90112
Apr 02 14:37:12 [31755] orana pacemakerd: debug: handle_new_connection: IPC credentials authenticated (31755-3245-11)
Apr 02 14:37:12 [31755] orana pacemakerd: debug: qb_ipcs_shm_connect: connecting to client [3245]
Apr 02 14:37:12 [31755] orana pacemakerd: debug: qb_rb_open_2: shm size:131085; real_size:135168; rb->word_size:33792
Apr 02 14:37:12 [31755] orana pacemakerd: debug: qb_rb_open_2: shm size:131085; real_size:135168; rb->word_size:33792
Apr 02 14:37:12 [31755] orana pacemakerd: debug: qb_rb_open_2: shm size:131085; real_size:135168; rb->word_size:33792
Apr 02 14:37:12 [31755] orana pacemakerd: debug: qb_ipcs_dispatch_connection_request: HUP conn (31755-3245-11)
Apr 02 14:37:12 [31755] orana pacemakerd: debug: qb_ipcs_disconnect: qb_ipcs_disconnect(31755-3245-11) state:2
Apr 02 14:37:12 [31755] orana pacemakerd: debug: crm_client_destroy: Destroying 0 events
Apr 02 14:37:12 [31755] orana pacemakerd: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-pacemakerd-response-31755-3245-11-header
Apr 02 14:37:12 [31755] orana pacemakerd: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-pacemakerd-event-31755-3245-11-header
Apr 02 14:37:12 [31755] orana pacemakerd: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-pacemakerd-request-31755-3245-11-header
Apr 02 14:37:13 [31771] orana cib: debug: crm_client_new: Connecting 0x562de6f3b9c0 for uid=0 gid=0 pid=3260 id=d93e533e-595f-41f3-9914-4afa4d972558
Apr 02 14:37:13 [31771] orana cib: debug: handle_new_connection: IPC credentials authenticated (31771-3260-14)
Apr 02 14:37:13 [31771] orana cib: debug: qb_ipcs_shm_connect: connecting to client [3260]
Apr 02 14:37:13 [31771] orana cib: debug: qb_rb_open_2: shm size:524301; real_size:528384; rb->word_size:132096
Apr 02 14:37:13 [31771] orana cib: debug: qb_rb_open_2: shm size:524301; real_size:528384; rb->word_size:132096
Apr 02 14:37:13 [31771] orana cib: debug: qb_rb_open_2: shm size:524301; real_size:528384; rb->word_size:132096
Apr 02 14:37:13 [31771] orana cib: debug: qb_ipcs_dispatch_connection_request: HUP conn (31771-3260-14)
Apr 02 14:37:13 [31771] orana cib: debug: qb_ipcs_disconnect: qb_ipcs_disconnect(31771-3260-14) state:2
Apr 02 14:37:13 [31771] orana cib: debug: crm_client_destroy: Destroying 0 events
Apr 02 14:37:13 [31771] orana cib: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-cib_ro-response-31771-3260-14-header
Apr 02 14:37:13 [31771] orana cib: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-cib_ro-event-31771-3260-14-header
Apr 02 14:37:13 [31771] orana cib: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-cib_ro-request-31771-3260-14-header
Apr 02 14:37:13 [31771] orana cib: debug: crm_client_new: Connecting 0x562de6f3b9c0 for uid=0 gid=0 pid=3262 id=a9bca220-6a4c-4645-9161-f90bab8517ef
Apr 02 14:37:13 [31771] orana cib: debug: handle_new_connection: IPC credentials authenticated (31771-3262-14)
Apr 02 14:37:13 [31771] orana cib: debug: qb_ipcs_shm_connect: connecting to client [3262]
Apr 02 14:37:13 [31771] orana cib: debug: qb_rb_open_2: shm size:524301; real_size:528384; rb->word_size:132096
Apr 02 14:37:13 [31771] orana cib: debug: qb_rb_open_2: shm size:524301; real_size:528384; rb->word_size:132096
Apr 02 14:37:13 [31771] orana cib: debug: qb_rb_open_2: shm size:524301; real_size:528384; rb->word_size:132096
Apr 02 14:37:13 [31771] orana cib: debug: qb_ipcs_dispatch_connection_request: HUP conn (31771-3262-14)
Apr 02 14:37:13 [31771] orana cib: debug: qb_ipcs_disconnect: qb_ipcs_disconnect(31771-3262-14) state:2
Apr 02 14:37:13 [31771] orana cib: debug: crm_client_destroy: Destroying 0 events
Apr 02 14:37:13 [31771] orana cib: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-31771-3262-14-header
Apr 02 14:37:13 [31771] orana cib: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-31771-3262-14-header
Apr 02 14:37:13 [31771] orana cib: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-31771-3262-14-header
Apr 02 14:37:13 [31755] orana pacemakerd: debug: crm_client_new: Connecting 0x5622eb12de00 for uid=0 gid=0 pid=3263 id=e7c5c55b-dfcd-41b1-8594-ce09e9b7bdf7
Apr 02 14:37:13 [31755] orana pacemakerd: debug: handle_new_connection: IPC credentials authenticated (31755-3263-11)
Apr 02 14:37:13 [31755] orana pacemakerd: debug: qb_ipcs_shm_connect: connecting to client [3263]
Apr 02 14:37:13 [31755] orana pacemakerd: debug: qb_rb_open_2: shm size:131085; real_size:135168; rb->word_size:33792
Apr 02 14:37:13 [31755] orana pacemakerd: debug: qb_rb_open_2: shm size:131085; real_size:135168; rb->word_size:33792
Apr 02 14:37:13 [31755] orana pacemakerd: debug: qb_rb_open_2: shm size:131085; real_size:135168; rb->word_size:33792
Apr 02 14:37:13 [31755] orana pacemakerd: debug: qb_ipcs_dispatch_connection_request: HUP conn (31755-3263-11)
Apr 02 14:37:13 [31755] orana pacemakerd: debug: qb_ipcs_disconnect: qb_ipcs_disconnect(31755-3263-11) state:2
Apr 02 14:37:13 [31755] orana pacemakerd: debug: crm_client_destroy: Destroying 0 events
Apr 02 14:37:13 [31755] orana pacemakerd: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-pacemakerd-response-31755-3263-11-header
Apr 02 14:37:13 [31755] orana pacemakerd: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-pacemakerd-event-31755-3263-11-header
Apr 02 14:37:13 [31755] orana pacemakerd: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-pacemakerd-request-31755-3263-11-header
Apr 02 14:37:13 [31771] orana cib: debug: crm_client_new: Connecting 0x562de6f3b9c0 for uid=0 gid=0 pid=3278 id=07491ea3-dc76-4cae-8ab5-0c344e4d5aa8
Apr 02 14:37:13 [31771] orana cib: debug: handle_new_connection: IPC credentials authenticated (31771-3278-14)
Apr 02 14:37:13 [31771] orana cib: debug: qb_ipcs_shm_connect: connecting to client [3278]
Apr 02 14:37:13 [31771] orana cib: debug: qb_rb_open_2: shm size:524301; real_size:528384; rb->word_size:132096
Apr 02 14:37:13 [31771] orana cib: debug: qb_rb_open_2: shm size:524301; real_size:528384; rb->word_size:132096
Apr 02 14:37:13 [31771] orana cib: debug: qb_rb_open_2: shm size:524301; real_size:528384; rb->word_size:132096
Apr 02 14:37:13 [31771] orana cib: debug: qb_ipcs_dispatch_connection_request: HUP conn (31771-3278-14)
Apr 02 14:37:13 [31771] orana cib: debug: qb_ipcs_disconnect: qb_ipcs_disconnect(31771-3278-14) state:2
Apr 02 14:37:13 [31771] orana cib: debug: crm_client_destroy: Destroying 0 events
Apr 02 14:37:13 [31771] orana cib: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-cib_ro-response-31771-3278-14-header
Apr 02 14:37:13 [31771] orana cib: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-cib_ro-event-31771-3278-14-header
Apr 02 14:37:13 [31771] orana cib: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-cib_ro-request-31771-3278-14-header
Apr 02 14:37:13 [31771] orana cib: debug: crm_client_new: Connecting 0x562de6f3b9c0 for uid=0 gid=0 pid=3280 id=9ba471ea-cbe5-4ed5-b6ab-8fb3b3190e07
Apr 02 14:37:13 [31771] orana cib: debug: handle_new_connection: IPC credentials authenticated (31771-3280-14)
Apr 02 14:37:13 [31771] orana cib: debug: qb_ipcs_shm_connect: connecting to client [3280]
Apr 02 14:37:13 [31771] orana cib: debug: qb_rb_open_2: shm size:524301; real_size:528384; rb->word_size:132096
Apr 02 14:37:13 [31771] orana cib: debug: qb_rb_open_2: shm size:524301; real_size:528384; rb->word_size:132096
Apr 02 14:37:13 [31771] orana cib: debug: qb_rb_open_2: shm size:524301; real_size:528384; rb->word_size:132096
Apr 02 14:37:13 [31771] orana cib: debug: qb_ipcs_dispatch_connection_request: HUP conn (31771-3280-14)
Apr 02 14:37:13 [31771] orana cib: debug: qb_ipcs_disconnect: qb_ipcs_disconnect(31771-3280-14) state:2
Apr 02 14:37:13 [31771] orana cib: debug: crm_client_destroy: Destroying 0 events
Apr 02 14:37:13 [31771] orana cib: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-cib_rw-response-31771-3280-14-header
Apr 02 14:37:13 [31771] orana cib: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-cib_rw-event-31771-3280-14-header
Apr 02 14:37:13 [31771] orana cib: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-cib_rw-request-31771-3280-14-header
Apr 02 14:37:13 [31755] orana pacemakerd: debug: crm_client_new: Connecting 0x5622eb12de00 for uid=0 gid=0 pid=3281 id=9d7208cb-6397-41ce-83ed-345507556d52
Apr 02 14:37:13 [31755] orana pacemakerd: debug: handle_new_connection: IPC credentials authenticated (31755-3281-11)
Apr 02 14:37:13 [31755] orana pacemakerd: debug: qb_ipcs_shm_connect: connecting to client [3281]
Apr 02 14:37:13 [31755] orana pacemakerd: debug: qb_rb_open_2: shm size:131085; real_size:135168; rb->word_size:33792
Apr 02 14:37:13 [31755] orana pacemakerd: debug: qb_rb_open_2: shm size:131085; real_size:135168; rb->word_size:33792
Apr 02 14:37:13 [31755] orana pacemakerd: debug: qb_rb_open_2: shm size:131085; real_size:135168; rb->word_size:33792
Apr 02 14:37:13 [31755] orana pacemakerd: debug: qb_ipcs_dispatch_connection_request: HUP conn (31755-3281-11)
Apr 02 14:37:13 [31755] orana pacemakerd: debug: qb_ipcs_disconnect: qb_ipcs_disconnect(31755-3281-11) state:2
Apr 02 14:37:13 [31755] orana pacemakerd: debug: crm_client_destroy: Destroying 0 events
Apr 02 14:37:13 [31755] orana pacemakerd: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-pacemakerd-response-31755-3281-11-header
Apr 02 14:37:13 [31755] orana pacemakerd: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-pacemakerd-event-31755-3281-11-header
Apr 02 14:37:13 [31755] orana pacemakerd: debug: qb_rb_close_helper: Free'ing ringbuffer: /dev/shm/qb-pacemakerd-request-31755-3281-11-header
Apr 02 14:37:14 [31772] orana stonith-ng: info: st_child_term: Child 3063 timed out, sending SIGTERM
Apr 02 14:37:14 [31772] orana stonith-ng: notice: stonith_action_async_done: Child process 3063 performing action 'monitor' timed out with signal 15
Apr 02 14:37:14 [31772] orana stonith-ng: debug: st_child_done: Operation 'monitor' on 'fence-uc-orana' completed with rc=-62 (0 remaining)
Apr 02 14:37:14 [31772] orana stonith-ng: notice: log_operation: Operation 'monitor' [3063] for device 'fence-uc-orana' returned: -62 (Timer expired)
Apr 02 14:37:14 [31772] orana stonith-ng: debug: child_waitpid: wait(3125) = 0: Success (0)
Apr 02 14:37:14 [31773] orana lrmd: info: log_finished: finished - rsc:fence-uc-orana action:start call_id:42 exit-code:1 exec-time:20006ms queue-time:0ms
Apr 02 14:37:14 [31776] orana crmd: debug: create_operation_update: do_update_resource: Updating resource fence-uc-orana after start op Timed Out (interval=0)
Apr 02 14:37:14 [31776] orana crmd: debug: stonith_action_create: Initiating action metadata for agent fence_ilo4 (target=(null))
Apr 02 14:37:14 [31776] orana crmd: debug: internal_stonith_action_execute: forking
Apr 02 14:37:14 [31776] orana crmd: debug: internal_stonith_action_execute: sending args
Apr 02 14:37:14 [31776] orana crmd: debug: internal_stonith_action_execute: result = 0
Apr 02 14:37:14 [31776] orana crmd: error: process_lrm_event: Result of start operation for fence-uc-orana on orana: Timed Out | call=42 key=fence-uc-orana_start_0 timeout=20000ms <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
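(The start above is killed at pacemaker's default 20-second operation timeout, timeout=20000ms, while fence_ilo4 is still negotiating the IPMI session with the IPv6 iLO. As a minimal sketch, assuming the resource name from this log and that your pcs release accepts operation options on stonith resources in this form, the per-operation timeout can be raised so a slow session setup no longer fails the start:

# raise the start timeout on the fence device (60s is the value that
# already worked for you; repeat for the second fence device)
pcs resource update fence-uc-orana op start timeout=60s

This only gives the agent more headroom; it does not explain why the IPv6 session setup is slow in the first place, which is what the manual timing test after the end of the log is meant to narrow down.)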
Apr 02 14:37:14 [31776] orana crmd: debug: send_notifications: Sending 'resource' alert to '/var/lib/pacemaker/pw_alert.sh' via '/var/lib/pacemaker/pw_alert.sh'
Apr 02 14:37:14 [31771] orana cib: info: cib_process_request: Forwarding cib_modify operation for section status to all (origin=local/crmd/91)
Apr 02 14:37:14 [31776] orana crmd: debug: update_history_cache: Updating history for 'fence-uc-orana' with start op
Apr 02 14:37:14 [31776] orana crmd: debug: child_waitpid: wait(3300) = 0: Resource temporarily unavailable (11)
Apr 02 14:37:14 [31771] orana cib: info: cib_perform_op: Diff: --- 0.24.0 2
Apr 02 14:37:14 [31771] orana cib: info: cib_perform_op: Diff: +++ 0.24.1 (null)
Apr 02 14:37:14 [31771] orana cib: info: cib_perform_op: + /cib: @num_updates=1
Apr 02 14:37:14 [31771] orana cib: info: cib_perform_op: + /cib/status/node_state[@id='1']/lrm[@id='1']/lrm_resources/lrm_resource[@id='fence-uc-orana']/lrm_rsc_op[@id='fence-uc-orana_last_0']: @operation_key=fence-uc-orana_start_0, @operation=start, @transition-key=6:29:0:80fd8561-42be-4d70-865d-ae837d35e118, @transition-magic=2:1;6:29:0:80fd8561-42be-4d70-865d-ae837d35e118, @call-id=42, @rc-code=1, @op-status=2, @exec-time=20006, @op-digest=6003bb66f1f096d6575040016bcf6afe, @op-secure-digest
Apr 02 14:37:14 [31771] orana cib: info: cib_perform_op: ++ /cib/status/node_state[@id='1']/lrm[@id='1']/lrm_resources/lrm_resource[@id='fence-uc-orana']: <lrm_rsc_op id="fence-uc-orana_last_failure_0" operation_key="fence-uc-orana_start_0" operation="start" crm-debug-origin="do_update_resource" crm_feature_set="3.0.12" transition-key="6:29:0:80fd8561-42be-4d70-865d-ae837d35e118" transition-magic="2:1;6:29:0:80fd8561-42be-4d70-865d-ae837d35e118" on_node="orana" call-id="42" rc-code="1" op-statu
Apr 02 14:37:14 [31771] orana cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=orana/crmd/91, version=0.24.1)
Apr 02 14:37:14 [31776] orana crmd: debug: te_update_diff: Processing (cib_modify) diff: 0.24.0 -> 0.24.1 (S_TRANSITION_ENGINE)
Apr 02 14:37:14 [31776] orana crmd: warning: status_from_rc: Action 6 (fence-uc-orana_start_0) on orana failed (target: 0 vs. rc: 1): Error
Apr 02 14:37:14 [31772] orana stonith-ng: debug: xml_patch_version_check: Can apply patch 0.24.1 to 0.24.0
Apr 02 14:37:14 [31776] orana crmd: info: abort_transition_graph: Transition aborted by operation fence-uc-orana_start_0 'modify' on orana: Event failed | magic=2:1;6:29:0:80fd8561-42be-4d70-865d-ae837d35e118 cib=0.24.1 source=match_graph_event:310 complete=false
Apr 02 14:37:14 [31776] orana crmd: info: match_graph_event: Action fence-uc-orana_start_0 (6) confirmed on orana (rc=1)
Apr 02 14:37:14 [31776] orana crmd: info: update_failcount: Updating failcount for fence-uc-orana on orana after failed start: rc=1 (update=INFINITY, time=1554196034)
Apr 02 14:37:14 [31776] orana crmd: debug: attrd_update_delegate: Asked attrd to update fail-count-fence-uc-orana=INFINITY for orana: OK (0)
Apr 02 14:37:14 [31776] orana crmd: debug: attrd_update_delegate: Asked attrd to update last-failure-fence-uc-orana=1554196034 for orana: OK (0)
Apr 02 14:37:14 [31776] orana crmd: info: process_graph_event: Detected action (29.6) fence-uc-orana_start_0.42=unknown error: failed
Apr 02 14:37:14 [31776] orana crmd: warning: status_from_rc: Action 6 (fence-uc-orana_start_0) on orana failed (target: 0 vs. rc: 1): Error
Apr 02 14:37:14 [31774] orana attrd: debug: attrd_client_update: Broadcasting fail-count-fence-uc-orana[orana] = INFINITY
Apr 02 14:37:14 [31776] orana crmd: info: abort_transition_graph: Transition aborted by operation fence-uc-orana_start_0 'create' on orana: Event failed | magic=2:1;6:29:0:80fd8561-42be-4d70-865d-ae837d35e118 cib=0.24.1 source=match_graph_event:310 complete=false
Apr 02 14:37:14 [31776] orana crmd: info: match_graph_event: Action fence-uc-orana_start_0 (6) confirmed on orana (rc=1)
Apr 02 14:37:14 [31776] orana crmd: info: update_failcount: Updating failcount for fence-uc-orana on orana after failed start: rc=1 (update=INFINITY, time=1554196034)
Apr 02 14:37:14 [31776] orana crmd: debug: attrd_update_delegate: Asked attrd to update fail-count-fence-uc-orana=INFINITY for orana: OK (0)
Apr 02 14:37:14 [31774] orana attrd: debug: attrd_client_update: Broadcasting last-failure-fence-uc-orana[orana] = 1554196034
Apr 02 14:37:14 [31776] orana crmd: debug: attrd_update_delegate: Asked attrd to update last-failure-fence-uc-orana=1554196034 for orana: OK (0)
Apr 02 14:37:14 [31776] orana crmd: info: process_graph_event: Detected action (29.6) fence-uc-orana_start_0.42=unknown error: failed
Apr 02 14:37:14 [31774] orana attrd: info: attrd_peer_update: Setting fail-count-fence-uc-orana[orana]: (null) -> INFINITY from orana
Apr 02 14:37:14 [31776] orana crmd: notice: run_graph: Transition 29 (Complete=7, Pending=0, Fired=0, Skipped=0, Incomplete=8, Source=/var/lib/pacemaker/pengine/pe-input-2753.bz2): Complete
Apr 02 14:37:14 [31774] orana attrd: debug: attrd_client_update: Broadcasting fail-count-fence-uc-orana[orana] = INFINITY
Apr 02 14:37:14 [31776] orana crmd: debug: te_graph_trigger: Transition 29 is now complete
Apr 02 14:37:14 [31776] orana crmd: debug: notify_crmd: Processing transition completion in state S_TRANSITION_ENGINE
Apr 02 14:37:14 [31776] orana crmd: debug: notify_crmd: Transition 29 status: restart - Configuration change
Apr 02 14:37:14 [31776] orana crmd: debug: s_crmd_fsa: Processing I_PE_CALC: [ state=S_TRANSITION_ENGINE cause=C_FSA_INTERNAL origin=notify_crmd ]
Apr 02 14:37:14 [31774] orana attrd: debug: attrd_client_update: Broadcasting last-failure-fence-uc-orana[orana] = 1554196034
Apr 02 14:37:14 [31776] orana crmd: info: do_state_transition: State transition S_TRANSITION_ENGINE -> S_POLICY_ENGINE | input=I_PE_CALC cause=C_FSA_INTERNAL origin=notify_crmd
Apr 02 14:37:14 [31776] orana crmd: debug: do_state_transition: All 2 cluster nodes are eligible to run resources.
Apr 02 14:37:14 [31776] orana crmd: debug: do_pe_invoke: Query 92: Requesting the current CIB: S_POLICY_ENGINE
Apr 02 14:37:14 [31774] orana attrd: info: attrd_peer_update: Setting last-failure-fence-uc-orana[orana]: (null) -> 1554196034 from orana
Apr 02 14:37:14 [31771] orana cib: info: cib_perform_op: Diff: --- 0.24.1 2
Apr 02 14:37:14 [31771] orana cib: info: cib_perform_op: Diff: +++ 0.24.2 (null)
Apr 02 14:37:14 [31771] orana cib: info: cib_perform_op: + /cib: @num_updates=2
Apr 02 14:37:14 [31771] orana cib: info: cib_perform_op: ++ /cib/status/node_state[@id='1']/transient_attributes[@id='1']/instance_attributes[@id='status-1']: <nvpair id="status-1-fail-count-fence-uc-orana" name="fail-count-fence-uc-orana" value="INFINITY"/>
Apr 02 14:37:14 [31771] orana cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=tigana/attrd/9, version=0.24.2)
Apr 02 14:37:14 [31772] orana stonith-ng: debug: xml_patch_version_check: Can apply patch 0.24.2 to 0.24.1
Apr 02 14:37:14 [31771] orana cib: info: cib_perform_op: Diff: --- 0.24.2 2
Apr 02 14:37:14 [31771] orana cib: info: cib_perform_op: Diff: +++ 0.24.3 (null)
Apr 02 14:37:14 [31771] orana cib: info: cib_perform_op: + /cib: @num_updates=3
Apr 02 14:37:14 [31771] orana cib: info: cib_perform_op: ++ /cib/status/node_state[@id='1']/transient_attributes[@id='1']/instance_attributes[@id='status-1']: <nvpair id="status-1-last-failure-fence-uc-orana" name="last-failure-fence-uc-orana" value="1554196034"/>
Apr 02 14:37:14 [31771] orana cib: info: cib_process_request: Completed cib_modify operation for section status: OK (rc=0, origin=tigana/attrd/10, version=0.24.3)
Apr 02 14:37:14 [31772] orana stonith-ng: debug: xml_patch_version_check: Can apply patch 0.24.3 to 0.24.2
Apr 02 14:37:14 [31776] orana crmd: debug: do_pe_invoke_callback: Invoking the PE: query=92, ref=pe_calc-dc-1554196034-112, seq=780, quorate=1
Apr 02 14:37:14 [31776] orana crmd: debug: te_update_diff: Processing (cib_modify) diff: 0.24.1 -> 0.24.2 (S_POLICY_ENGINE)
Apr 02 14:37:14 [31776] orana crmd: info: abort_transition_graph: Transition aborted by status-1-fail-count-fence-uc-orana doing create fail-count-fence-uc-orana=INFINITY: Transient attribute change | cib=0.24.2 source=abort_unless_down:343 path=/cib/status/node_state[@id='1']/transient_attributes[@id='1']/instance_attributes[@id='status-1'] complete=true
Apr 02 14:37:14 [31776] orana crmd: debug: te_update_diff: Processing (cib_modify) diff: 0.24.2 -> 0.24.3 (S_POLICY_ENGINE)
Apr 02 14:37:14 [31776] orana crmd: info: abort_transition_graph: Transition aborted by status-1-last-failure-fence-uc-orana doing create last-failure-fence-uc-orana=1554196034: Transient attribute change | cib=0.24.3 source=abort_unless_down:343 path=/cib/status/node_state[@id='1']/transient_attributes[@id='1']/instance_attributes[@id='status-1'] complete=true
Apr 02 14:37:14 [31776] orana crmd: debug: s_crmd_fsa: Processing I_PE_CALC: [ state=S_POLICY_ENGINE cause=C_FSA_INTERNAL origin=abort_transition_graph ]
Apr 02 14:37:14 [31776] orana crmd: debug: do_pe_invoke: Query 93: Requesting the current CIB: S_POLICY_ENGINE
Apr 02 14:37:14 [31776] orana crmd: debug: s_crmd_fsa: Processing I_PE_CALC: [ state=S_POLICY_ENGINE cause=C_FSA_INTERNAL origin=abort_transition_graph ]
Apr 02 14:37:14 [31776] orana crmd: debug: do_pe_invoke: Query 94: Requesting the current CIB: S_POLICY_ENGINE
Apr 02 14:37:14 [31775] orana pengine: debug: unpack_config: STONITH timeout: 60000
Apr 02 14:37:14 [31775] orana pengine: debug: unpack_config: STONITH of failed nodes is enabled
Apr 02 14:37:14 [31775] orana pengine: debug: unpack_config: Concurrent fencing is disabled
Apr 02 14:37:14 [31775] orana pengine: debug: unpack_config: Stop all active resources: false
Apr 02 14:37:14 [31775] orana pengine: debug: unpack_config: Cluster is symmetric - resources can run anywhere by default
Apr 02 14:37:14 [31775] orana pengine: debug: unpack_config: Default stickiness: 0
Apr 02 14:37:14 [31775] orana pengine: notice: unpack_config: On loss of CCM Quorum: Ignore
Apr 02 14:37:14 [31775] orana pengine: debug: unpack_config: Node scores: 'red' = -INFINITY, 'yellow' = 0, 'green' = 0
Apr 02 14:37:14 [31775] orana pengine: info: determine_online_status_fencing: Node tigana is active
Apr 02 14:37:14 [31775] orana pengine: info: determine_online_status: Node tigana is online
Apr 02 14:37:14 [31775] orana pengine: info: determine_online_status_fencing: Node orana is active
Apr 02 14:37:14 [31775] orana pengine: info: determine_online_status: Node orana is online
Apr 02 14:37:14 [31775] orana pengine: debug: find_anonymous_clone: Internally renamed unicloud on tigana to unicloud:0
Apr 02 14:37:14 [31775] orana pengine: debug: unpack_rsc_op: Expired operation '6:29:0:80fd8561-42be-4d70-865d-ae837d35e118' on orana returned 'unknown error' (1) instead of the expected value: 'ok' (0)
Apr 02 14:37:14 [31775] orana pengine: notice: unpack_rsc_op: Ignoring expired calculated failure fence-uc-orana_start_0 (rc=1, magic=2:1;6:29:0:80fd8561-42be-4d70-865d-ae837d35e118) on orana
Apr 02 14:37:14 [31775] orana pengine: debug: unpack_rsc_op: Expired operation '6:29:0:80fd8561-42be-4d70-865d-ae837d35e118' on orana returned 'unknown error' (1) instead of the expected value: 'ok' (0)
Apr 02 14:37:14 [31775] orana pengine: notice: unpack_rsc_op: Ignoring expired calculated failure fence-uc-orana_start_0 (rc=1, magic=2:1;6:29:0:80fd8561-42be-4d70-865d-ae837d35e118) on orana
Apr 02 14:37:14 [31775] orana pengine: debug: find_anonymous_clone: Internally renamed unicloud on orana to unicloud:1
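(Since raising the timeout only masks the symptom, it may help to time the agent by hand from the node where the start just timed out, outside pacemaker, to see whether the delay is in the agent/IPMI exchange itself or in something that only happens right after the resource moves. A sketch with placeholders; substitute your own iLO address and credentials, and note the options shown are the usual fence-agents / ipmitool command-line options:

# time the fence agent directly against the IPv6 iLO
time fence_ilo4 --ip=<ilo-ipv6-address> --username=<user> --password=<pass> \
    --lanplus --action=monitor --verbose

# roughly the same exchange the agent performs underneath
time ipmitool -I lanplus -H <ilo-ipv6-address> -U <user> -P <pass> chassis power status

If the manual runs complete in a second or two but the start under pacemaker still needs most of its timeout, the extra time is being spent on the node at the moment the stonith resource starts there, e.g. address selection or neighbour discovery, rather than inside the fence agent itself.)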