<div dir="auto">Hi,<div dir="auto"><br></div><div dir="auto">Yes, it is.</div><div dir="auto"><br></div><div dir="auto">The qemu-kvm process is executed by the oneadmin user.</div><div dir="auto"><br></div><div dir="auto">When the cluster attempts the live migration, which of these users is involved?</div><div dir="auto"><br></div><div dir="auto">oneadmin</div><div dir="auto">root</div><div dir="auto">hacluster</div><div dir="auto"><br></div><div dir="auto">So far I have only configured a passwordless ssh connection for the oneadmin user.</div><div dir="auto"><br></div><div dir="auto">Do I need to configure passwordless ssh for any other user as well?</div><div dir="auto"><br></div><div dir="auto">Which user executes the virsh migrate --live command?</div><div dir="auto"><br></div><div dir="auto">Is there any way to check the ssh keys?</div><div dir="auto"><br></div><div dir="auto">Sorry for all these questions.</div><div dir="auto"><br></div><div dir="auto">Thanks a lot</div><div dir="auto"><br></div><div dir="auto"><br></div><br><div class="gmail_extra" dir="auto"><br><div class="gmail_quote">On 1 Sept 2017 at 0:12, &quot;Ken Gaillot&quot; &lt;<a href="mailto:kgaillot@redhat.com">kgaillot@redhat.com</a>&gt; wrote:<br type="attribution"><blockquote class="quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div class="elided-text">On Thu, 2017-08-31 at 23:45 +0200, Oscar Segarra wrote:<br>
&gt; Hi Ken,<br>
&gt;<br>
&gt;<br>
&gt; Thanks a lot for you quick answer:<br>
&gt;<br>
&gt;<br>
&gt; Regarding to selinux, it is disabled. The FW is disabled as well.<br>
&gt;<br>
&gt;<br>
&gt; [root@vdicnode01 ~]# sestatus<br>
&gt; SELinux status:                 disabled<br>
&gt;<br>
&gt;<br>
&gt; [root@vdicnode01 ~]# service firewalld status<br>
&gt; Redirecting to /bin/systemctl status  firewalld.service<br>
&gt; ● firewalld.service - firewalld - dynamic firewall daemon<br>
&gt;    Loaded: loaded (/usr/lib/systemd/system/<wbr>firewalld.service;<br>
&gt; disabled; vendor preset: enabled)<br>
&gt;    Active: inactive (dead)<br>
&gt;      Docs: man:firewalld(1)<br>
&gt;<br>
&gt;<br>
&gt; On migration, it performs a gracefully shutdown and a start on the new<br>
&gt; node.<br>
&gt;<br>
&gt;<br>
&gt; I attach the logs when trying to migrate from vdicnode02 to<br>
&gt; vdicnode01:<br>
&gt;<br>
&gt;<br>
&gt; vdicnode02 corosync.log:<br>
&gt; Aug 31 23:38:17 [1521] vdicnode02        cib:     info:<br>
&gt; cib_perform_op: Diff: --- 0.161.2 2<br>
&gt; Aug 31 23:38:17 [1521] vdicnode02        cib:     info:<br>
&gt; cib_perform_op: Diff: +++ 0.162.0 (null)<br>
&gt; Aug 31 23:38:17 [1521] vdicnode02        cib:     info:<br>
&gt; cib_perform_op:<br>
&gt; -- /cib/configuration/<wbr>constraints/rsc_location[@id=&#39;<wbr>location-vm-vdicdb01-<wbr>vdicnode01--INFINITY&#39;]<br>
&gt; Aug 31 23:38:17 [1521] vdicnode02        cib:     info:<br>
&gt; cib_perform_op: +  /cib:  @epoch=162, @num_updates=0<br>
&gt; Aug 31 23:38:17 [1521] vdicnode02        cib:     info:<br>
&gt; cib_process_request:    Completed cib_replace operation for section<br>
&gt; configuration: OK (rc=0, origin=vdicnode01/cibadmin/2,<br>
&gt; version=0.162.0)<br>
&gt; Aug 31 23:38:17 [1521] vdicnode02        cib:     info:<br>
&gt; cib_file_backup:        Archived previous version<br>
&gt; as /var/lib/pacemaker/cib/cib-65.<wbr>raw<br>
&gt; Aug 31 23:38:17 [1521] vdicnode02        cib:     info:<br>
&gt; cib_file_write_with_digest:     Wrote version 0.162.0 of the CIB to<br>
&gt; disk (digest: 1f87611b60cd7c48b95b6b788b47f6<wbr>5f)<br>
&gt; Aug 31 23:38:17 [1521] vdicnode02        cib:     info:<br>
&gt; cib_file_write_with_digest:     Reading cluster configuration<br>
&gt; file /var/lib/pacemaker/cib/cib.<wbr>jt2KPw<br>
&gt; (digest: /var/lib/pacemaker/cib/cib.<wbr>Kwqfpl)<br>
&gt; Aug 31 23:38:22 [1521] vdicnode02        cib:     info:<br>
&gt; cib_process_ping:       Reporting our current digest to vdicnode01:<br>
&gt; dace3a23264934279d439420d5a716<wbr>cc for 0.162.0 (0x7f96bb26c5c0 0)<br>
&gt; Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
&gt; cib_perform_op: Diff: --- 0.162.0 2<br>
&gt; Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
&gt; cib_perform_op: Diff: +++ 0.163.0 (null)<br>
&gt; Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
&gt; cib_perform_op: +  /cib:  @epoch=163<br>
&gt; Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
&gt; cib_perform_op: ++ /cib/configuration/<wbr>constraints:  &lt;rsc_location<br>
&gt; id=&quot;location-vm-vdicdb01-<wbr>vdicnode02--INFINITY&quot; node=&quot;vdicnode02&quot;<br>
&gt; rsc=&quot;vm-vdicdb01&quot; score=&quot;-INFINITY&quot;/&gt;<br>
&gt; Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
&gt; cib_process_request:    Completed cib_replace operation for section<br>
&gt; configuration: OK (rc=0, origin=vdicnode01/cibadmin/2,<br>
&gt; version=0.163.0)<br>
&gt; Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
&gt; cib_file_backup:        Archived previous version<br>
&gt; as /var/lib/pacemaker/cib/cib-66.<wbr>raw<br>
&gt; Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
&gt; cib_file_write_with_digest:     Wrote version 0.163.0 of the CIB to<br>
&gt; disk (digest: 47a548b36746de9275d66cc6aeb0fd<wbr>c4)<br>
&gt; Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
&gt; cib_file_write_with_digest:     Reading cluster configuration<br>
&gt; file /var/lib/pacemaker/cib/cib.<wbr>rcgXiT<br>
&gt; (digest: /var/lib/pacemaker/cib/cib.<wbr>7geMfi)<br>
&gt; Aug 31 23:38:27 [1523] vdicnode02       lrmd:     info:<br>
&gt; cancel_recurring_action:        Cancelling ocf operation<br>
&gt; vm-vdicdb01_monitor_10000<br>
&gt; Aug 31 23:38:27 [1526] vdicnode02       crmd:     info:<br>
&gt; do_lrm_rsc_op:  Performing<br>
&gt; key=6:6:0:fe1a9b0a-816c-4b97-<wbr>96cb-b90dbf71417a<br>
&gt; op=vm-vdicdb01_migrate_to_0<br>
&gt; Aug 31 23:38:27 [1523] vdicnode02       lrmd:     info: log_execute:<br>
&gt;    executing - rsc:vm-vdicdb01 action:migrate_to call_id:9<br>
&gt; Aug 31 23:38:27 [1526] vdicnode02       crmd:     info:<br>
&gt; process_lrm_event:      Result of monitor operation for vm-vdicdb01 on<br>
&gt; vdicnode02: Cancelled | call=7 key=vm-vdicdb01_monitor_10000<br>
&gt; confirmed=true<br>
&gt; Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
&gt; cib_perform_op: Diff: --- 0.163.0 2<br>
&gt; Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
&gt; cib_perform_op: Diff: +++ 0.163.1 (null)<br>
&gt; Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
&gt; cib_perform_op: +  /cib:  @num_updates=1<br>
&gt; Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
&gt; cib_perform_op: +<br>
&gt;  /cib/status/node_state[@id=&#39;2&#39;<wbr>]/lrm[@id=&#39;2&#39;]/lrm_resources/<wbr>lrm_resource[@id=&#39;vm-vdicdb01&#39;<wbr>]/lrm_rsc_op[@id=&#39;vm-vdicdb01_<wbr>last_0&#39;]:  @operation_key=vm-vdicdb01_<wbr>migrate_to_0, @operation=migrate_to, @crm-debug-origin=cib_action_<wbr>update, @transition-key=6:6:0:<wbr>fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a, @transition-magic=-1:193;6:6:<wbr>0:fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a, @call-id=-1, @rc-code=193, @op-status=-1, @last-run=1504215507, @last-rc-cha<br>
&gt; Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
&gt; cib_process_request:    Completed cib_modify operation for section<br>
&gt; status: OK (rc=0, origin=vdicnode01/crmd/41, version=0.163.1)<br>
&gt; VirtualDomain(vm-vdicdb01)[<wbr>5241]:       2017/08/31_23:38:27 INFO:<br>
&gt; vdicdb01: Starting live migration to vdicnode01 (using: virsh<br>
&gt; --connect=qemu:///system --quiet migrate --live  vdicdb01 qemu<br>
&gt; +ssh://vdicnode01/system ).<br>
&gt; VirtualDomain(vm-vdicdb01)[<wbr>5241]:       2017/08/31_23:38:27 ERROR:<br>
&gt; vdicdb01: live migration to vdicnode01 failed: 1<br>
&gt; Aug 31 23:38:27 [1523] vdicnode02       lrmd:   notice:<br>
&gt; operation_finished:     vm-vdicdb01_migrate_to_0:5241:<wbr>stderr [ error:<br>
&gt; Cannot recv data: Host key verification failed.: Connection reset by<br>
&gt; peer ]<br>
<br>
</div>^^^ There you go. Sounds like the ssh key isn&#39;t being accepted. No idea<br>
why though.<br>
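One thing worth knowing here: the VirtualDomain agent runs `virsh migrate` as the user Pacemaker executes resource agents under (normally root), not as oneadmin, so it is root's known_hosts on the source node that has to trust the target's host key. A minimal sketch of how to check, where the hostname vdicnode01 is taken from the logs above and the scratch key is only a local demo:

```shell
# Non-interactive test of the connection the agent actually makes
# (will fail rather than prompt if the host key is not yet trusted):
#
#   ssh -o BatchMode=yes root@vdicnode01 true
#
# ssh-keygen -F searches a known_hosts file without connecting.
# Demo against a scratch known_hosts with a freshly generated key:
tmp=$(mktemp -d)
ssh-keygen -q -t ed25519 -N '' -f "$tmp/hostkey"   # fake host key for the demo
printf 'vdicnode01 %s\n' "$(cut -d' ' -f1,2 "$tmp/hostkey.pub")" \
    > "$tmp/known_hosts"
# Look the host up; prints "# Host vdicnode01 found: ..." plus the key line
result=$(ssh-keygen -F vdicnode01 -f "$tmp/known_hosts")
echo "$result"
rm -rf "$tmp"
```

If the BatchMode test fails with a host-key error, running `ssh-keyscan vdicnode01 >> /root/.ssh/known_hosts` on the source node (and the reverse on the target) is one way to populate the entry, though verifying the fetched key out of band is safer.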
<div class="elided-text"><br>
<br>
<br>
&gt; Aug 31 23:38:27 [1523] vdicnode02       lrmd:   notice:<br>
&gt; operation_finished:     vm-vdicdb01_migrate_to_0:5241:<wbr>stderr<br>
&gt; [ ocf-exit-reason:vdicdb01: live migration to vdicnode01 failed: 1 ]<br>
&gt; Aug 31 23:38:27 [1523] vdicnode02       lrmd:     info: log_finished:<br>
&gt; finished - rsc:vm-vdicdb01 action:migrate_to call_id:9 pid:5241<br>
&gt; exit-code:1 exec-time:78ms queue-time:0ms<br>
&gt; Aug 31 23:38:27 [1526] vdicnode02       crmd:   notice:<br>
&gt; process_lrm_event:      Result of migrate_to operation for vm-vdicdb01<br>
&gt; on vdicnode02: 1 (unknown error) | call=9 key=vm-vdicdb01_migrate_to_0<br>
&gt; confirmed=true cib-update=14<br>
&gt; Aug 31 23:38:27 [1526] vdicnode02       crmd:   notice:<br>
&gt; process_lrm_event:      vdicnode02-vm-vdicdb01_<wbr>migrate_to_0:9 [ error:<br>
&gt; Cannot recv data: Host key verification failed.: Connection reset by<br>
&gt; peer\nocf-exit-reason:<wbr>vdicdb01: live migration to vdicnode01 failed: 1<br>
&gt; \n ]<br>
&gt; Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
&gt; cib_process_request:    Forwarding cib_modify operation for section<br>
&gt; status to all (origin=local/crmd/14)<br>
&gt; Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
&gt; cib_perform_op: Diff: --- 0.163.1 2<br>
&gt; Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
&gt; cib_perform_op: Diff: +++ 0.163.2 (null)<br>
&gt; Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
&gt; cib_perform_op: +  /cib:  @num_updates=2<br>
&gt; Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
&gt; cib_perform_op: +<br>
&gt;  /cib/status/node_state[@id=&#39;2&#39;<wbr>]/lrm[@id=&#39;2&#39;]/lrm_resources/<wbr>lrm_resource[@id=&#39;vm-vdicdb01&#39;<wbr>]/lrm_rsc_op[@id=&#39;vm-vdicdb01_<wbr>last_0&#39;]:  @crm-debug-origin=do_update_<wbr>resource, @transition-magic=0:1;6:6:0:<wbr>fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a, @call-id=9, @rc-code=1, @op-status=0, @exec-time=78, @exit-reason=vdicdb01: live migration to vdicnode01 failed: 1<br>
&gt; Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
&gt; cib_perform_op:<br>
&gt; ++ /cib/status/node_state[@id=&#39;2&#39;<wbr>]/lrm[@id=&#39;2&#39;]/lrm_resources/<wbr>lrm_resource[@id=&#39;vm-vdicdb01&#39;<wbr>]:  &lt;lrm_rsc_op id=&quot;vm-vdicdb01_last_failure_<wbr>0&quot; operation_key=&quot;vm-vdicdb01_<wbr>migrate_to_0&quot; operation=&quot;migrate_to&quot; crm-debug-origin=&quot;do_update_<wbr>resource&quot; crm_feature_set=&quot;3.0.10&quot; transition-key=&quot;6:6:0:<wbr>fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a&quot; transition-magic=&quot;0:1;6:6:0:<wbr>fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a&quot; exit-reason=&quot;vdicdb01: live migration to vdicn<br>
&gt; Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
&gt; cib_process_request:    Completed cib_modify operation for section<br>
&gt; status: OK (rc=0, origin=vdicnode02/crmd/14, version=0.163.2)<br>
&gt; Aug 31 23:38:27 [1526] vdicnode02       crmd:     info:<br>
&gt; do_lrm_rsc_op:  Performing<br>
&gt; key=2:7:0:fe1a9b0a-816c-4b97-<wbr>96cb-b90dbf71417a op=vm-vdicdb01_stop_0<br>
&gt; Aug 31 23:38:27 [1523] vdicnode02       lrmd:     info: log_execute:<br>
&gt;    executing - rsc:vm-vdicdb01 action:stop call_id:10<br>
&gt; VirtualDomain(vm-vdicdb01)[<wbr>5285]:       2017/08/31_23:38:27 INFO:<br>
&gt; Issuing graceful shutdown request for domain vdicdb01.<br>
&gt; Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
&gt; cib_perform_op: Diff: --- 0.163.2 2<br>
&gt; Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
&gt; cib_perform_op: Diff: +++ 0.163.3 (null)<br>
&gt; Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
&gt; cib_perform_op: +  /cib:  @num_updates=3<br>
&gt; Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
&gt; cib_perform_op: +<br>
&gt;  /cib/status/node_state[@id=&#39;1&#39;<wbr>]/lrm[@id=&#39;1&#39;]/lrm_resources/<wbr>lrm_resource[@id=&#39;vm-vdicdb01&#39;<wbr>]/lrm_rsc_op[@id=&#39;vm-vdicdb01_<wbr>last_0&#39;]:  @operation_key=vm-vdicdb01_<wbr>stop_0, @operation=stop, @transition-key=4:7:0:<wbr>fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a, @transition-magic=0:0;4:7:0:<wbr>fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a, @call-id=6, @rc-code=0, @last-run=1504215507, @last-rc-change=1504215507, @exec-time=57<br>
&gt; Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
&gt; cib_process_request:    Completed cib_modify operation for section<br>
&gt; status: OK (rc=0, origin=vdicnode01/crmd/43, version=0.163.3)<br>
&gt; Aug 31 23:38:30 [1523] vdicnode02       lrmd:     info: log_finished:<br>
&gt; finished - rsc:vm-vdicdb01 action:stop call_id:10 pid:5285 exit-code:0<br>
&gt; exec-time:3159ms queue-time:0ms<br>
&gt; Aug 31 23:38:30 [1526] vdicnode02       crmd:   notice:<br>
&gt; process_lrm_event:      Result of stop operation for vm-vdicdb01 on<br>
&gt; vdicnode02: 0 (ok) | call=10 key=vm-vdicdb01_stop_0 confirmed=true<br>
&gt; cib-update=15<br>
&gt; Aug 31 23:38:30 [1521] vdicnode02        cib:     info:<br>
&gt; cib_process_request:    Forwarding cib_modify operation for section<br>
&gt; status to all (origin=local/crmd/15)<br>
&gt; Aug 31 23:38:30 [1521] vdicnode02        cib:     info:<br>
&gt; cib_perform_op: Diff: --- 0.163.3 2<br>
&gt; Aug 31 23:38:30 [1521] vdicnode02        cib:     info:<br>
&gt; cib_perform_op: Diff: +++ 0.163.4 (null)<br>
&gt; Aug 31 23:38:30 [1521] vdicnode02        cib:     info:<br>
&gt; cib_perform_op: +  /cib:  @num_updates=4<br>
&gt; Aug 31 23:38:30 [1521] vdicnode02        cib:     info:<br>
&gt; cib_perform_op: +<br>
&gt;  /cib/status/node_state[@id=&#39;2&#39;<wbr>]/lrm[@id=&#39;2&#39;]/lrm_resources/<wbr>lrm_resource[@id=&#39;vm-vdicdb01&#39;<wbr>]/lrm_rsc_op[@id=&#39;vm-vdicdb01_<wbr>last_0&#39;]:  @operation_key=vm-vdicdb01_<wbr>stop_0, @operation=stop, @transition-key=2:7:0:<wbr>fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a, @transition-magic=0:0;2:7:0:<wbr>fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a, @call-id=10, @rc-code=0, @exec-time=3159<br>
&gt; Aug 31 23:38:30 [1521] vdicnode02        cib:     info:<br>
&gt; cib_process_request:    Completed cib_modify operation for section<br>
&gt; status: OK (rc=0, origin=vdicnode02/crmd/15, version=0.163.4)<br>
&gt; Aug 31 23:38:31 [1521] vdicnode02        cib:     info:<br>
&gt; cib_perform_op: Diff: --- 0.163.4 2<br>
&gt; Aug 31 23:38:31 [1521] vdicnode02        cib:     info:<br>
&gt; cib_perform_op: Diff: +++ 0.163.5 (null)<br>
&gt; Aug 31 23:38:31 [1521] vdicnode02        cib:     info:<br>
&gt; cib_perform_op: +  /cib:  @num_updates=5<br>
&gt; Aug 31 23:38:31 [1521] vdicnode02        cib:     info:<br>
&gt; cib_perform_op: +<br>
&gt;  /cib/status/node_state[@id=&#39;1&#39;<wbr>]/lrm[@id=&#39;1&#39;]/lrm_resources/<wbr>lrm_resource[@id=&#39;vm-vdicdb01&#39;<wbr>]/lrm_rsc_op[@id=&#39;vm-vdicdb01_<wbr>last_0&#39;]:  @operation_key=vm-vdicdb01_<wbr>start_0, @operation=start, @transition-key=5:7:0:<wbr>fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a, @transition-magic=0:0;5:7:0:<wbr>fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a, @call-id=7, @last-run=1504215510, @last-rc-change=1504215510, @exec-time=528<br>
&gt; Aug 31 23:38:31 [1521] vdicnode02        cib:     info:<br>
&gt; cib_process_request:    Completed cib_modify operation for section<br>
&gt; status: OK (rc=0, origin=vdicnode01/crmd/44, version=0.163.5)<br>
&gt; Aug 31 23:38:31 [1521] vdicnode02        cib:     info:<br>
&gt; cib_perform_op: Diff: --- 0.163.5 2<br>
&gt; Aug 31 23:38:31 [1521] vdicnode02        cib:     info:<br>
&gt; cib_perform_op: Diff: +++ 0.163.6 (null)<br>
&gt; Aug 31 23:38:31 [1521] vdicnode02        cib:     info:<br>
&gt; cib_perform_op: +  /cib:  @num_updates=6<br>
&gt; Aug 31 23:38:31 [1521] vdicnode02        cib:     info:<br>
&gt; cib_perform_op:<br>
&gt; ++ /cib/status/node_state[@id=&#39;1&#39;<wbr>]/lrm[@id=&#39;1&#39;]/lrm_resources/<wbr>lrm_resource[@id=&#39;vm-vdicdb01&#39;<wbr>]:  &lt;lrm_rsc_op id=&quot;vm-vdicdb01_monitor_10000&quot; operation_key=&quot;vm-vdicdb01_<wbr>monitor_10000&quot; operation=&quot;monitor&quot; crm-debug-origin=&quot;do_update_<wbr>resource&quot; crm_feature_set=&quot;3.0.10&quot; transition-key=&quot;6:7:0:<wbr>fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a&quot; transition-magic=&quot;0:0;6:7:0:<wbr>fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a&quot; on_node=&quot;vdicnode01&quot; call-id=&quot;8&quot; rc-code=&quot;0&quot; op-s<br>
&gt; Aug 31 23:38:31 [1521] vdicnode02        cib:     info:<br>
&gt; cib_process_request:    Completed cib_modify operation for section<br>
&gt; status: OK (rc=0, origin=vdicnode01/crmd/45, version=0.163.6)<br>
&gt; Aug 31 23:38:36 [1521] vdicnode02        cib:     info:<br>
&gt; cib_process_ping:       Reporting our current digest to vdicnode01:<br>
&gt; 9141ea9880f5a44b133003982d863b<wbr>c8 for 0.163.6 (0x7f96bb2625a0 0)<br>
&gt;<br>
&gt;<br>
&gt;<br>
&gt;<br>
&gt;<br>
&gt;<br>
&gt; vdicnode01 - corosync.log<br>
&gt; Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
&gt; cib_process_request:    Forwarding cib_replace operation for section<br>
&gt; configuration to all (origin=local/cibadmin/2)<br>
&gt; Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
&gt; cib_perform_op: Diff: --- 0.162.0 2<br>
&gt; Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
&gt; cib_perform_op: Diff: +++ 0.163.0 (null)<br>
&gt; Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
&gt; cib_perform_op: +  /cib:  @epoch=163<br>
&gt; Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
&gt; cib_perform_op: ++ /cib/configuration/<wbr>constraints:  &lt;rsc_location<br>
&gt; id=&quot;location-vm-vdicdb01-<wbr>vdicnode02--INFINITY&quot; node=&quot;vdicnode02&quot;<br>
&gt; rsc=&quot;vm-vdicdb01&quot; score=&quot;-INFINITY&quot;/&gt;<br>
&gt; Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
&gt; cib_process_request:    Completed cib_replace operation for section<br>
&gt; configuration: OK (rc=0, origin=vdicnode01/cibadmin/2,<br>
&gt; version=0.163.0)<br>
&gt; Aug 31 23:38:27 [1536] vdicnode01       crmd:     info:<br>
&gt; abort_transition_graph: Transition aborted by<br>
&gt; rsc_location.location-vm-<wbr>vdicdb01-vdicnode02--INFINITY &#39;create&#39;:<br>
&gt; Non-status change | cib=0.163.0 source=te_update_diff:436<br>
&gt; path=/cib/configuration/<wbr>constraints complete=true<br>
&gt; Aug 31 23:38:27 [1536] vdicnode01       crmd:   notice:<br>
&gt; do_state_transition:    State transition S_IDLE -&gt; S_POLICY_ENGINE |<br>
&gt; input=I_PE_CALC cause=C_FSA_INTERNAL origin=abort_transition_graph<br>
&gt; Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
&gt; cib_file_backup:        Archived previous version<br>
&gt; as /var/lib/pacemaker/cib/cib-85.<wbr>raw<br>
&gt; Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
&gt; cib_file_write_with_digest:     Wrote version 0.163.0 of the CIB to<br>
&gt; disk (digest: 47a548b36746de9275d66cc6aeb0fd<wbr>c4)<br>
&gt; Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
&gt; cib_file_write_with_digest:     Reading cluster configuration<br>
&gt; file /var/lib/pacemaker/cib/cib.<wbr>npBIW2<br>
&gt; (digest: /var/lib/pacemaker/cib/cib.<wbr>bDogoB)<br>
&gt; Aug 31 23:38:27 [1535] vdicnode01    pengine:     info:<br>
&gt; determine_online_status:        Node vdicnode02 is online<br>
&gt; Aug 31 23:38:27 [1535] vdicnode01    pengine:     info:<br>
&gt; determine_online_status:        Node vdicnode01 is online<br>
&gt; Aug 31 23:38:27 [1535] vdicnode01    pengine:     info: native_print:<br>
&gt; vm-vdicdb01     (ocf::heartbeat:VirtualDomain)<wbr>: Started vdicnode02<br>
&gt; Aug 31 23:38:27 [1535] vdicnode01    pengine:     info: RecurringOp:<br>
&gt; Start recurring monitor (10s) for vm-vdicdb01 on vdicnode01<br>
&gt; Aug 31 23:38:27 [1535] vdicnode01    pengine:   notice: LogActions:<br>
&gt; Migrate vm-vdicdb01     (Started vdicnode02 -&gt; vdicnode01)<br>
&gt; Aug 31 23:38:27 [1535] vdicnode01    pengine:   notice:<br>
&gt; process_pe_message:     Calculated transition 6, saving inputs<br>
&gt; in /var/lib/pacemaker/pengine/pe-<wbr>input-96.bz2<br>
&gt; Aug 31 23:38:27 [1536] vdicnode01       crmd:     info:<br>
&gt; do_state_transition:    State transition S_POLICY_ENGINE -&gt;<br>
&gt; S_TRANSITION_ENGINE | input=I_PE_SUCCESS cause=C_IPC_MESSAGE<br>
&gt; origin=handle_response<br>
&gt; Aug 31 23:38:27 [1536] vdicnode01       crmd:     info: do_te_invoke:<br>
&gt; Processing graph 6 (ref=pe_calc-dc-1504215507-24) derived<br>
&gt; from /var/lib/pacemaker/pengine/pe-<wbr>input-96.bz2<br>
&gt; Aug 31 23:38:27 [1536] vdicnode01       crmd:   notice:<br>
&gt; te_rsc_command: Initiating migrate_to operation<br>
&gt; vm-vdicdb01_migrate_to_0 on vdicnode02 | action 6<br>
&gt; Aug 31 23:38:27 [1536] vdicnode01       crmd:     info:<br>
&gt; create_operation_update:        cib_action_update: Updating resource<br>
&gt; vm-vdicdb01 after migrate_to op pending (interval=0)<br>
&gt; Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
&gt; cib_process_request:    Forwarding cib_modify operation for section<br>
&gt; status to all (origin=local/crmd/41)<br>
&gt; Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
&gt; cib_perform_op: Diff: --- 0.163.0 2<br>
&gt; Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
&gt; cib_perform_op: Diff: +++ 0.163.1 (null)<br>
&gt; Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
&gt; cib_perform_op: +  /cib:  @num_updates=1<br>
&gt; Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
&gt; cib_perform_op: +<br>
&gt;  /cib/status/node_state[@id=&#39;2&#39;<wbr>]/lrm[@id=&#39;2&#39;]/lrm_resources/<wbr>lrm_resource[@id=&#39;vm-vdicdb01&#39;<wbr>]/lrm_rsc_op[@id=&#39;vm-vdicdb01_<wbr>last_0&#39;]:  @operation_key=vm-vdicdb01_<wbr>migrate_to_0, @operation=migrate_to, @crm-debug-origin=cib_action_<wbr>update, @transition-key=6:6:0:<wbr>fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a, @transition-magic=-1:193;6:6:<wbr>0:fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a, @call-id=-1, @rc-code=193, @op-status=-1, @last-run=1504215507, @last-rc-cha<br>
&gt; Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
&gt; cib_process_request:    Completed cib_modify operation for section<br>
&gt; status: OK (rc=0, origin=vdicnode01/crmd/41, version=0.163.1)<br>
&gt; Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
&gt; cib_perform_op: Diff: --- 0.163.1 2<br>
&gt; Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
&gt; cib_perform_op: Diff: +++ 0.163.2 (null)<br>
&gt; Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
&gt; cib_perform_op: +  /cib:  @num_updates=2<br>
&gt; Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
&gt; cib_perform_op: +<br>
&gt;  /cib/status/node_state[@id=&#39;2&#39;<wbr>]/lrm[@id=&#39;2&#39;]/lrm_resources/<wbr>lrm_resource[@id=&#39;vm-vdicdb01&#39;<wbr>]/lrm_rsc_op[@id=&#39;vm-vdicdb01_<wbr>last_0&#39;]:  @crm-debug-origin=do_update_<wbr>resource, @transition-magic=0:1;6:6:0:<wbr>fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a, @call-id=9, @rc-code=1, @op-status=0, @exec-time=78, @exit-reason=vdicdb01: live migration to vdicnode01 failed: 1<br>
&gt; Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
&gt; cib_perform_op:<br>
&gt; ++ /cib/status/node_state[@id=&#39;2&#39;<wbr>]/lrm[@id=&#39;2&#39;]/lrm_resources/<wbr>lrm_resource[@id=&#39;vm-vdicdb01&#39;<wbr>]:  &lt;lrm_rsc_op id=&quot;vm-vdicdb01_last_failure_<wbr>0&quot; operation_key=&quot;vm-vdicdb01_<wbr>migrate_to_0&quot; operation=&quot;migrate_to&quot; crm-debug-origin=&quot;do_update_<wbr>resource&quot; crm_feature_set=&quot;3.0.10&quot; transition-key=&quot;6:6:0:<wbr>fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a&quot; transition-magic=&quot;0:1;6:6:0:<wbr>fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a&quot; exit-reason=&quot;vdicdb01: live migration to vdicn<br>
&gt; Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
&gt; cib_process_request:    Completed cib_modify operation for section<br>
&gt; status: OK (rc=0, origin=vdicnode02/crmd/14, version=0.163.2)<br>
&gt; Aug 31 23:38:27 [1536] vdicnode01       crmd:  warning:<br>
&gt; status_from_rc: Action 6 (vm-vdicdb01_migrate_to_0) on vdicnode02<br>
&gt; failed (target: 0 vs. rc: 1): Error<br>
&gt; Aug 31 23:38:27 [1536] vdicnode01       crmd:   notice:<br>
&gt; abort_transition_graph: Transition aborted by operation<br>
&gt; vm-vdicdb01_migrate_to_0 &#39;modify&#39; on vdicnode02: Event failed |<br>
&gt; magic=0:1;6:6:0:fe1a9b0a-816c-<wbr>4b97-96cb-b90dbf71417a cib=0.163.2<br>
&gt; source=match_graph_event:310 complete=false<br>
&gt; Aug 31 23:38:27 [1536] vdicnode01       crmd:     info:<br>
&gt; match_graph_event:      Action vm-vdicdb01_migrate_to_0 (6) confirmed<br>
&gt; on vdicnode02 (rc=1)<br>
&gt; Aug 31 23:38:27 [1536] vdicnode01       crmd:     info:<br>
&gt; process_graph_event:    Detected action (6.6)<br>
&gt; vm-vdicdb01_migrate_to_0.9=<wbr>unknown error: failed<br>
&gt; Aug 31 23:38:27 [1536] vdicnode01       crmd:  warning:<br>
&gt; status_from_rc: Action 6 (vm-vdicdb01_migrate_to_0) on vdicnode02<br>
&gt; failed (target: 0 vs. rc: 1): Error<br>
&gt; Aug 31 23:38:27 [1536] vdicnode01       crmd:     info:<br>
&gt; abort_transition_graph: Transition aborted by operation<br>
&gt; vm-vdicdb01_migrate_to_0 &#39;create&#39; on vdicnode02: Event failed |<br>
&gt; magic=0:1;6:6:0:fe1a9b0a-816c-<wbr>4b97-96cb-b90dbf71417a cib=0.163.2<br>
&gt; source=match_graph_event:310 complete=false<br>
&gt; Aug 31 23:38:27 [1536] vdicnode01       crmd:     info:<br>
&gt; match_graph_event:      Action vm-vdicdb01_migrate_to_0 (6) confirmed<br>
&gt; on vdicnode02 (rc=1)<br>
&gt; Aug 31 23:38:27 [1536] vdicnode01       crmd:     info:<br>
&gt; process_graph_event:    Detected action (6.6)<br>
&gt; vm-vdicdb01_migrate_to_0.9=<wbr>unknown error: failed<br>
&gt; Aug 31 23:38:27 [1536] vdicnode01       crmd:   notice: run_graph:<br>
&gt;    Transition 6 (Complete=1, Pending=0, Fired=0, Skipped=0,<br>
&gt; Incomplete=5, Source=/var/lib/pacemaker/<wbr>pengine/pe-input-96.bz2):<br>
&gt; Complete<br>
&gt; Aug 31 23:38:27 [1536] vdicnode01       crmd:     info:<br>
&gt; do_state_transition:    State transition S_TRANSITION_ENGINE -&gt;<br>
&gt; S_POLICY_ENGINE | input=I_PE_CALC cause=C_FSA_INTERNAL<br>
&gt; origin=notify_crmd<br>
&gt; Aug 31 23:38:27 [1535] vdicnode01    pengine:     info:<br>
&gt; determine_online_status:        Node vdicnode02 is online<br>
&gt; Aug 31 23:38:27 [1535] vdicnode01    pengine:     info:<br>
&gt; determine_online_status:        Node vdicnode01 is online<br>
&gt; Aug 31 23:38:27 [1535] vdicnode01    pengine:  warning:<br>
&gt; unpack_rsc_op_failure:  Processing failed op migrate_to for<br>
&gt; vm-vdicdb01 on vdicnode02: unknown error (1)<br>
&gt; Aug 31 23:38:27 [1535] vdicnode01    pengine:  warning:<br>
&gt; unpack_rsc_op_failure:  Processing failed op migrate_to for<br>
&gt; vm-vdicdb01 on vdicnode02: unknown error (1)<br>
&gt; Aug 31 23:38:27 [1535] vdicnode01    pengine:     info: native_print:<br>
&gt; vm-vdicdb01     (ocf::heartbeat:VirtualDomain)<wbr>: FAILED<br>
&gt; Aug 31 23:38:27 [1535] vdicnode01    pengine:     info: native_print:<br>
&gt; 1 : vdicnode01<br>
&gt; Aug 31 23:38:27 [1535] vdicnode01    pengine:     info: native_print:<br>
&gt; 2 : vdicnode02<br>
&gt; Aug 31 23:38:27 [1535] vdicnode01    pengine:    error:<br>
&gt; native_create_actions:  Resource vm-vdicdb01 (ocf::VirtualDomain) is<br>
&gt; active on 2 nodes attempting recovery<br>
&gt; Aug 31 23:38:27 [1535] vdicnode01    pengine:  warning:<br>
&gt; native_create_actions:  See<br>
&gt; <a href="http://clusterlabs.org/wiki/FAQ#Resource_is_Too_Active" rel="noreferrer" target="_blank">http://clusterlabs.org/wiki/<wbr>FAQ#Resource_is_Too_Active</a> for more<br>
&gt; information.<br>
&gt; Aug 31 23:38:27 [1535] vdicnode01    pengine:     info: RecurringOp:<br>
&gt; Start recurring monitor (10s) for vm-vdicdb01 on vdicnode01<br>
&gt; Aug 31 23:38:27 [1535] vdicnode01    pengine:   notice: LogActions:<br>
&gt; Recover vm-vdicdb01     (Started vdicnode01)<br>
&gt; Aug 31 23:38:27 [1535] vdicnode01    pengine:    error:<br>
&gt; process_pe_message:     Calculated transition 7 (with errors), saving<br>
&gt; inputs in /var/lib/pacemaker/pengine/pe-<wbr>error-8.bz2<br>
&gt; Aug 31 23:38:27 [1536] vdicnode01       crmd:     info:<br>
&gt; do_state_transition:    State transition S_POLICY_ENGINE -&gt;<br>
&gt; S_TRANSITION_ENGINE | input=I_PE_SUCCESS cause=C_IPC_MESSAGE<br>
&gt; origin=handle_response<br>
&gt; Aug 31 23:38:27 [1536] vdicnode01       crmd:     info: do_te_invoke:<br>
&gt; Processing graph 7 (ref=pe_calc-dc-1504215507-26) derived<br>
&gt; from /var/lib/pacemaker/pengine/pe-<wbr>error-8.bz2<br>
&gt; Aug 31 23:38:27 [1536] vdicnode01       crmd:   notice:<br>
&gt; te_rsc_command: Initiating stop operation vm-vdicdb01_stop_0 locally<br>
&gt; on vdicnode01 | action 4<br>
&gt; Aug 31 23:38:27 [1536] vdicnode01       crmd:     info:<br>
&gt; do_lrm_rsc_op:  Performing<br>
&gt; key=4:7:0:fe1a9b0a-816c-4b97-<wbr>96cb-b90dbf71417a op=vm-vdicdb01_stop_0<br>
&gt; Aug 31 23:38:27 [1533] vdicnode01       lrmd:     info: log_execute:<br>
&gt;    executing - rsc:vm-vdicdb01 action:stop call_id:6<br>
&gt; Aug 31 23:38:27 [1536] vdicnode01       crmd:   notice:<br>
&gt; te_rsc_command: Initiating stop operation vm-vdicdb01_stop_0 on<br>
&gt; vdicnode02 | action 2<br>
&gt; VirtualDomain(vm-vdicdb01)[<wbr>5268]:       2017/08/31_23:38:27 INFO:<br>
&gt; Domain vdicdb01 already stopped.<br>
&gt; Aug 31 23:38:27 [1533] vdicnode01       lrmd:     info: log_finished:<br>
&gt; finished - rsc:vm-vdicdb01 action:stop call_id:6 pid:5268 exit-code:0<br>
&gt; exec-time:57ms queue-time:0ms<br>
&gt; Aug 31 23:38:27 [1536] vdicnode01       crmd:   notice:<br>
&gt; process_lrm_event:      Result of stop operation for vm-vdicdb01 on<br>
&gt; vdicnode01: 0 (ok) | call=6 key=vm-vdicdb01_stop_0 confirmed=true<br>
&gt; cib-update=43<br>
&gt; Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
&gt; cib_process_request:    Forwarding cib_modify operation for section<br>
&gt; status to all (origin=local/crmd/43)<br>
&gt; Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
&gt; cib_perform_op: Diff: --- 0.163.2 2<br>
&gt; Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
&gt; cib_perform_op: Diff: +++ 0.163.3 (null)<br>
&gt; Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
&gt; cib_perform_op: +  /cib:  @num_updates=3<br>
&gt; Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
&gt; cib_perform_op: +<br>
&gt;  /cib/status/node_state[@id=&#39;1&#39;<wbr>]/lrm[@id=&#39;1&#39;]/lrm_resources/<wbr>lrm_resource[@id=&#39;vm-vdicdb01&#39;<wbr>]/lrm_rsc_op[@id=&#39;vm-vdicdb01_<wbr>last_0&#39;]:  @operation_key=vm-vdicdb01_<wbr>stop_0, @operation=stop, @transition-key=4:7:0:<wbr>fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a, @transition-magic=0:0;4:7:0:<wbr>fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a, @call-id=6, @rc-code=0, @last-run=1504215507, @last-rc-change=1504215507, @exec-time=57<br>
&gt; Aug 31 23:38:27 [1536] vdicnode01       crmd:     info:<br>
&gt; match_graph_event:      Action vm-vdicdb01_stop_0 (4) confirmed on<br>
&gt; vdicnode01 (rc=0)<br>
&gt; Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
&gt; cib_process_request:    Completed cib_modify operation for section<br>
&gt; status: OK (rc=0, origin=vdicnode01/crmd/43, version=0.163.3)<br>
&gt; Aug 31 23:38:30 [1531] vdicnode01        cib:     info:<br>
&gt; cib_perform_op: Diff: --- 0.163.3 2<br>
&gt; Aug 31 23:38:30 [1531] vdicnode01        cib:     info:<br>
&gt; cib_perform_op: Diff: +++ 0.163.4 (null)<br>
&gt; Aug 31 23:38:30 [1531] vdicnode01        cib:     info:<br>
&gt; cib_perform_op: +  /cib:  @num_updates=4<br>
&gt; Aug 31 23:38:30 [1531] vdicnode01        cib:     info:<br>
&gt; cib_perform_op: +<br>
&gt;  /cib/status/node_state[@id=&#39;2&#39;<wbr>]/lrm[@id=&#39;2&#39;]/lrm_resources/<wbr>lrm_resource[@id=&#39;vm-vdicdb01&#39;<wbr>]/lrm_rsc_op[@id=&#39;vm-vdicdb01_<wbr>last_0&#39;]:  @operation_key=vm-vdicdb01_<wbr>stop_0, @operation=stop, @transition-key=2:7:0:<wbr>fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a, @transition-magic=0:0;2:7:0:<wbr>fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a, @call-id=10, @rc-code=0, @exec-time=3159<br>
&gt; Aug 31 23:38:30 [1531] vdicnode01        cib:     info:<br>
&gt; cib_process_request:    Completed cib_modify operation for section<br>
&gt; status: OK (rc=0, origin=vdicnode02/crmd/15, version=0.163.4)<br>
&gt; Aug 31 23:38:30 [1536] vdicnode01       crmd:     info:<br>
&gt; match_graph_event:      Action vm-vdicdb01_stop_0 (2) confirmed on<br>
&gt; vdicnode02 (rc=0)<br>
&gt; Aug 31 23:38:30 [1536] vdicnode01       crmd:   notice:<br>
&gt; te_rsc_command: Initiating start operation vm-vdicdb01_start_0 locally<br>
&gt; on vdicnode01 | action 5<br>
&gt; Aug 31 23:38:30 [1536] vdicnode01       crmd:     info:<br>
&gt; do_lrm_rsc_op:  Performing<br>
&gt; key=5:7:0:fe1a9b0a-816c-4b97-<wbr>96cb-b90dbf71417a op=vm-vdicdb01_start_0<br>
&gt; Aug 31 23:38:30 [1533] vdicnode01       lrmd:     info: log_execute:<br>
&gt;    executing - rsc:vm-vdicdb01 action:start call_id:7<br>
&gt; Aug 31 23:38:31 [1533] vdicnode01       lrmd:     info: log_finished:<br>
&gt; finished - rsc:vm-vdicdb01 action:start call_id:7 pid:5401 exit-code:0<br>
&gt; exec-time:528ms queue-time:0ms<br>
&gt; Aug 31 23:38:31 [1536] vdicnode01       crmd:     info:<br>
&gt; action_synced_wait:     Managed VirtualDomain_meta-data_0 process 5486<br>
&gt; exited with rc=0<br>
&gt; Aug 31 23:38:31 [1536] vdicnode01       crmd:   notice:<br>
&gt; process_lrm_event:      Result of start operation for vm-vdicdb01 on<br>
&gt; vdicnode01: 0 (ok) | call=7 key=vm-vdicdb01_start_0 confirmed=true<br>
&gt; cib-update=44<br>
&gt; Aug 31 23:38:31 [1531] vdicnode01        cib:     info:<br>
&gt; cib_process_request:    Forwarding cib_modify operation for section<br>
&gt; status to all (origin=local/crmd/44)<br>
&gt; Aug 31 23:38:31 [1531] vdicnode01        cib:     info:<br>
&gt; cib_perform_op: Diff: --- 0.163.4 2<br>
&gt; Aug 31 23:38:31 [1531] vdicnode01        cib:     info:<br>
&gt; cib_perform_op: Diff: +++ 0.163.5 (null)<br>
&gt; Aug 31 23:38:31 [1531] vdicnode01        cib:     info:<br>
&gt; cib_perform_op: +  /cib:  @num_updates=5<br>
&gt; Aug 31 23:38:31 [1531] vdicnode01        cib:     info:<br>
&gt; cib_perform_op: +<br>
&gt;  /cib/status/node_state[@id=&#39;1&#39;<wbr>]/lrm[@id=&#39;1&#39;]/lrm_resources/<wbr>lrm_resource[@id=&#39;vm-vdicdb01&#39;<wbr>]/lrm_rsc_op[@id=&#39;vm-vdicdb01_<wbr>last_0&#39;]:  @operation_key=vm-vdicdb01_<wbr>start_0, @operation=start, @transition-key=5:7:0:<wbr>fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a, @transition-magic=0:0;5:7:0:<wbr>fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a, @call-id=7, @last-run=1504215510, @last-rc-change=1504215510, @exec-time=528<br>
&gt; Aug 31 23:38:31 [1531] vdicnode01        cib:     info:<br>
&gt; cib_process_request:    Completed cib_modify operation for section<br>
&gt; status: OK (rc=0, origin=vdicnode01/crmd/44, version=0.163.5)<br>
&gt; Aug 31 23:38:31 [1536] vdicnode01       crmd:     info:<br>
&gt; match_graph_event:      Action vm-vdicdb01_start_0 (5) confirmed on<br>
&gt; vdicnode01 (rc=0)<br>
&gt; Aug 31 23:38:31 [1536] vdicnode01       crmd:   notice:<br>
&gt; te_rsc_command: Initiating monitor operation vm-vdicdb01_monitor_10000<br>
&gt; locally on vdicnode01 | action 6<br>
&gt; Aug 31 23:38:31 [1536] vdicnode01       crmd:     info:<br>
&gt; do_lrm_rsc_op:  Performing<br>
&gt; key=6:7:0:fe1a9b0a-816c-4b97-<wbr>96cb-b90dbf71417a<br>
&gt; op=vm-vdicdb01_monitor_10000<br>
&gt; Aug 31 23:38:31 [1536] vdicnode01       crmd:     info:<br>
&gt; process_lrm_event:      Result of monitor operation for vm-vdicdb01 on<br>
&gt; vdicnode01: 0 (ok) | call=8 key=vm-vdicdb01_monitor_10000<br>
&gt; confirmed=false cib-update=45<br>
&gt; Aug 31 23:38:31 [1531] vdicnode01        cib:     info:<br>
&gt; cib_process_request:    Forwarding cib_modify operation for section<br>
&gt; status to all (origin=local/crmd/45)<br>
&gt; Aug 31 23:38:31 [1531] vdicnode01        cib:     info:<br>
&gt; cib_perform_op: Diff: --- 0.163.5 2<br>
&gt; Aug 31 23:38:31 [1531] vdicnode01        cib:     info:<br>
&gt; cib_perform_op: Diff: +++ 0.163.6 (null)<br>
&gt; Aug 31 23:38:31 [1531] vdicnode01        cib:     info:<br>
&gt; cib_perform_op: +  /cib:  @num_updates=6<br>
&gt; Aug 31 23:38:31 [1531] vdicnode01        cib:     info:<br>
&gt; cib_perform_op:<br>
&gt; ++ /cib/status/node_state[@id=&#39;1&#39;<wbr>]/lrm[@id=&#39;1&#39;]/lrm_resources/<wbr>lrm_resource[@id=&#39;vm-vdicdb01&#39;<wbr>]:  &lt;lrm_rsc_op id=&quot;vm-vdicdb01_monitor_10000&quot; operation_key=&quot;vm-vdicdb01_<wbr>monitor_10000&quot; operation=&quot;monitor&quot; crm-debug-origin=&quot;do_update_<wbr>resource&quot; crm_feature_set=&quot;3.0.10&quot; transition-key=&quot;6:7:0:<wbr>fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a&quot; transition-magic=&quot;0:0;6:7:0:<wbr>fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a&quot; on_node=&quot;vdicnode01&quot; call-id=&quot;8&quot; rc-code=&quot;0&quot; op-s<br>
&gt; Aug 31 23:38:31 [1531] vdicnode01        cib:     info:<br>
&gt; cib_process_request:    Completed cib_modify operation for section<br>
&gt; status: OK (rc=0, origin=vdicnode01/crmd/45, version=0.163.6)<br>
&gt; Aug 31 23:38:31 [1536] vdicnode01       crmd:     info:<br>
&gt; match_graph_event:      Action vm-vdicdb01_monitor_10000 (6) confirmed<br>
&gt; on vdicnode01 (rc=0)<br>
&gt; Aug 31 23:38:31 [1536] vdicnode01       crmd:   notice: run_graph:<br>
&gt;    Transition 7 (Complete=5, Pending=0, Fired=0, Skipped=0,<br>
&gt; Incomplete=0, Source=/var/lib/pacemaker/<wbr>pengine/pe-error-8.bz2):<br>
&gt; Complete<br>
&gt; Aug 31 23:38:31 [1536] vdicnode01       crmd:     info: do_log: Input<br>
&gt; I_TE_SUCCESS received in state S_TRANSITION_ENGINE from notify_crmd<br>
&gt; Aug 31 23:38:31 [1536] vdicnode01       crmd:   notice:<br>
&gt; do_state_transition:    State transition S_TRANSITION_ENGINE -&gt; S_IDLE<br>
&gt; | input=I_TE_SUCCESS cause=C_FSA_INTERNAL origin=notify_crmd<br>
&gt; Aug 31 23:38:36 [1531] vdicnode01        cib:     info:<br>
&gt; cib_process_ping:       Reporting our current digest to vdicnode01:<br>
&gt; 9141ea9880f5a44b133003982d863b<wbr>c8 for 0.163.6 (0x7f61cec09270 0)<br>
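(Side note: the Source= file named in the transition summary above can be replayed offline with crm_simulate to see why the cluster chose a full stop/start instead of a live migration. The path is taken from the log line; run this on the node that logged it.)

```shell
# Replay the saved scheduler input referenced in the log above.
# crm_simulate reads the compressed pe-*.bz2 files directly.
crm_simulate --simulate --xml-file /var/lib/pacemaker/pengine/pe-error-8.bz2

# Add placement scores to see why the resource was placed where it was:
crm_simulate --show-scores --xml-file /var/lib/pacemaker/pengine/pe-error-8.bz2
```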
&gt;<br>
&gt;<br>
&gt; Thanks a lot<br>
&gt;<br>
&gt; 2017-08-31 16:20 GMT+02:00 Ken Gaillot &lt;<a href="mailto:kgaillot@redhat.com">kgaillot@redhat.com</a>&gt;:<br>
&gt;         On Thu, 2017-08-31 at 01:13 +0200, Oscar Segarra wrote:<br>
&gt;         &gt; Hi,<br>
&gt;         &gt;<br>
&gt;         &gt;<br>
&gt;         &gt; In my environment, I have just two hosts, where the qemu-kvm<br>
&gt;         &gt; process is launched by a regular user (oneadmin) - OpenNebula -<br>
&gt;         &gt;<br>
&gt;         &gt;<br>
&gt;         &gt; I have created a VirtualDomain resource that starts and<br>
&gt;         &gt; stops the VM perfectly. Nevertheless, when I change the<br>
&gt;         &gt; location weight in order to force the migration, it raises a<br>
&gt;         &gt; migration failure &quot;error: 1&quot;.<br>
&gt;         &gt;<br>
&gt;         &gt;<br>
&gt;         &gt; If I execute the virsh migrate command (that appears in<br>
&gt;         &gt; corosync.log) from the command line, it works perfectly.<br>
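(The exact command from corosync.log is not quoted here; as a hypothetical illustration, a VirtualDomain-style live migration looks like the sketch below - the domain and host names are placeholders. Note that Pacemaker runs resource agents as root, so a command that succeeds for an interactive user can still fail when the daemon runs it.)

```shell
# Example only -- domain name and target host are placeholders.
# The VirtualDomain agent runs roughly this, as root:
virsh migrate --live vdicdb01 qemu+ssh://vdicnode02/system
```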
&gt;         &gt;<br>
&gt;         &gt;<br>
&gt;         &gt; Has anybody experienced the same issue?<br>
&gt;         &gt;<br>
&gt;         &gt;<br>
&gt;         &gt; Thanks in advance for your help<br>
&gt;<br>
&gt;<br>
&gt;         If something works from the command line but not when run by a<br>
&gt;         daemon,<br>
&gt;         my first suspicion is SELinux. Check the audit log for denials<br>
&gt;         around<br>
&gt;         that time.<br>
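(For example, denials logged around the failure can be listed with ausearch; the paths below are the usual CentOS defaults.)

```shell
# Recent AVC (SELinux) denials:
ausearch -m avc -ts recent

# Or scan the raw audit log directly:
grep denied /var/log/audit/audit.log
```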
&gt;<br>
&gt;         I&#39;d also check the system log and Pacemaker detail log around<br>
&gt;         that time<br>
&gt;         to see if there is any more information.<br>
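(For instance - log locations below are common CentOS/Pacemaker defaults; the detail log is set by PCMK_logfile in /etc/sysconfig/pacemaker, so adjust if yours differs.)

```shell
# Pacemaker detail log entries for the resource:
grep -i "vm-vdicdb01" /var/log/cluster/corosync.log

# System journal around the time of the failed migration:
journalctl -u pacemaker --since "2017-08-31 23:38"
```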
&gt;         --<br>
&gt;         Ken Gaillot &lt;<a href="mailto:kgaillot@redhat.com">kgaillot@redhat.com</a>&gt;<br>
&gt;<br>
&gt;<br>
&gt;<br>
&gt;<br>
&gt;<br>
&gt;         ______________________________<wbr>_________________<br>
&gt;         Users mailing list: <a href="mailto:Users@clusterlabs.org">Users@clusterlabs.org</a><br>
&gt;         <a href="http://lists.clusterlabs.org/mailman/listinfo/users" rel="noreferrer" target="_blank">http://lists.clusterlabs.org/<wbr>mailman/listinfo/users</a><br>
&gt;<br>
&gt;         Project Home: <a href="http://www.clusterlabs.org" rel="noreferrer" target="_blank">http://www.clusterlabs.org</a><br>
&gt;         Getting started:<br>
&gt;         <a href="http://www.clusterlabs.org/doc/Cluster_from_Scratch.pdf" rel="noreferrer" target="_blank">http://www.clusterlabs.org/<wbr>doc/Cluster_from_Scratch.pdf</a><br>
&gt;         Bugs: <a href="http://bugs.clusterlabs.org" rel="noreferrer" target="_blank">http://bugs.clusterlabs.org</a><br>
&gt;<br>
&gt;<br>
<br>
</div><font color="#888888">--<br>
Ken Gaillot &lt;<a href="mailto:kgaillot@redhat.com">kgaillot@redhat.com</a>&gt;<br>
<br>
<br>
<br>
<br>
</font></blockquote></div><br></div></div>