[ClusterLabs] All IP resources deleted once a fenced node rejoins

Arjun Pandey apandepublic at gmail.com
Thu Jan 14 07:18:12 UTC 2016


Hi

I am running a 2-node cluster on CentOS 6.6 with this configuration:

Master/Slave Set: foo-master [foo]
   Masters: [ kamet ]
   Slaves: [ orana ]
fence-uc-orana (stonith:fence_ilo4): Started kamet
fence-uc-kamet (stonith:fence_ilo4): Started orana
C-3 (ocf::pw:IPaddr): Started kamet
C-FLT (ocf::pw:IPaddr): Started kamet
C-FLT2 (ocf::pw:IPaddr): Started kamet
E-3 (ocf::pw:IPaddr): Started kamet
MGMT-FLT (ocf::pw:IPaddr): Started kamet
M-FLT (ocf::pw:IPaddr): Started kamet
M-FLT2 (ocf::pw:IPaddr): Started kamet
S-FLT (ocf::pw:IPaddr): Started kamet
S-FLT2 (ocf::pw:IPaddr): Started kamet


Here, foo is a multi-state resource running in master/slave mode, and the
IPaddr RA is just a modified IPaddr2 RA. I also have colocation constraints
so that each IP address is colocated with the master. Fencing is configured
as well, and when I unplug the redundancy interface, fencing is triggered
correctly. However, once the fenced node (kamet) rejoins, I see that all my
floating IP resources are deleted and the system ends up in the state shown
below. Also, if I log into kamet, I see that the floating IP addresses are
actually still available on it.
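
For reference, the intent per IP is roughly the following (a hypothetical pcs
sketch for C-FLT only; the actual constraints as they ended up in the CIB are
in the dump further below):

# keep the floating IP together with the master instance of foo-master
pcs constraint colocation add C-FLT with master foo-master INFINITY
# start the IP before foo-master is promoted on that node
pcs constraint order start C-FLT then promote foo-master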

Based on the logs, the IP resources are first marked unrunnable and later
marked as orphaned.
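
In case it helps, the decision the policy engine made should be replayable
from the saved pe-input files referenced in the log (assuming crm_simulate is
available on this Pacemaker build), e.g.:

# replay a saved policy-engine input and show the resulting actions/scores
crm_simulate -S -s -x /var/lib/pacemaker/pengine/pe-input-1448.bz2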


Master/Slave Set: foo-master [foo]
   Masters: [ orana ]
   Slaves: [ kamet ]
fence-uc-orana (stonith:fence_ilo4): Started orana
fence-uc-kamet (stonith:fence_ilo4): Started orana

CIB state after kamet was fenced:
<cib admin_epoch="0" cib-last-written="Wed Jan 13 19:28:22 2016"
crm_feature_set="3.0.9" epoch="72" num_updates="7"
validate-with="pacemaker-2.0" have-quorum="1" dc-uuid="orana">
  <configuration>
    <crm_config>
      <cluster_property_set id="cib-bootstrap-options">
        <nvpair id="cib-bootstrap-options-no-quorum-policy"
name="no-quorum-policy" value="ignore"/>
        <nvpair id="cib-bootstrap-options-cluster-recheck-interval"
name="cluster-recheck-interval" value="30s"/>
        <nvpair id="cib-bootstrap-options-stonith-enabled"
name="stonith-enabled" value="true"/>
        <nvpair id="cib-bootstrap-options-dc-version"
name="dc-version" value="1.1.11-97629de"/>
        <nvpair id="cib-bootstrap-options-cluster-infrastructure"
name="cluster-infrastructure" value="cman"/>
        <nvpair id="cib-bootstrap-options-last-lrm-refresh"
name="last-lrm-refresh" value="1452665061"/>
      </cluster_property_set>
    </crm_config>
    <nodes>
      <node id="kamet" uname="kamet"/>
      <node id="orana" uname="orana"/>
    </nodes>
    <resources>
      <master id="foo-master">
        <primitive class="ocf" id="foo" provider="pw" type="uc">
          <instance_attributes id="foo-instance_attributes">
            <nvpair id="foo-instance_attributes-state" name="state"
value="/var/run/uc/role"/>
          </instance_attributes>
          <operations>
            <op id="foo-start-interval-0s" interval="0s" name="start"
on-fail="restart" timeout="100s"/>
            <op id="foo-monitor-interval-10s-role-Master"
interval="10s" name="monitor" on-fail="restart" role="Master"
timeout="100s"/>
            <op id="foo-monitor-interval-11s-role-Slave"
interval="11s" name="monitor" on-fail="restart" role="Slave"
timeout="100s"/>
            <op id="foo-promote-interval-0s" interval="0s"
name="promote" on-fail="restart" timeout="100s"/>
            <op id="foo-demote-interval-0s" interval="0s"
name="demote" on-fail="restart" timeout="100s"/>
            <op id="foo-stop-interval-0s" interval="0s" name="stop"
on-fail="restart" timeout="100s"/>
          </operations>
        </primitive>
        <meta_attributes id="foo-master-meta_attributes">
          <nvpair id="foo-master-meta_attributes-master-max"
name="master-max" value="1"/>
          <nvpair id="foo-master-meta_attributes-master-node-max"
name="master-node-max" value="1"/>
          <nvpair id="foo-master-meta_attributes-clone-max"
name="clone-max" value="2"/>
          <nvpair id="foo-master-meta_attributes-clone-node-max"
name="clone-node-max" value="1"/>
          <nvpair id="foo-master-meta_attributes-notify" name="notify"
value="true"/>
          <nvpair id="foo-master-meta_attributes-ordered"
name="ordered" value="true"/>
        </meta_attributes>
      </master>
      <primitive class="stonith" id="fence-uc-orana" type="fence_ilo4">
        <instance_attributes id="fence-uc-orana-instance_attributes">
          <nvpair id="fence-uc-orana-instance_attributes-login"
name="login" value="parallel"/>
          <nvpair id="fence-uc-orana-instance_attributes-passwd"
name="passwd" value="wireless"/>
          <nvpair id="fence-uc-orana-instance_attributes-ipaddr"
name="ipaddr" value="10.11.10.30"/>
          <nvpair
id="fence-uc-orana-instance_attributes-pcmk_host_list"
name="pcmk_host_list" value="orana"/>
          <nvpair id="fence-uc-orana-instance_attributes-action"
name="action" value="off"/>
          <nvpair id="fence-uc-orana-instance_attributes-lanplus"
name="lanplus" value="1"/>
          <nvpair id="fence-uc-orana-instance_attributes-delay"
name="delay" value="10"/>
        </instance_attributes>
        <operations>
          <op id="fence-uc-orana-start-interval-0s" interval="0s"
name="start" on-fail="restart"/>
          <op id="fence-uc-orana-monitor-interval-5s" interval="5s"
name="monitor" on-fail="restart"/>
        </operations>
        <meta_attributes id="fence-uc-orana-meta_attributes">
          <nvpair id="fence-uc-orana-meta_attributes-failure-timeout"
name="failure-timeout" value="3s"/>
        </meta_attributes>
      </primitive>
      <primitive class="stonith" id="fence-uc-kamet" type="fence_ilo4">
        <instance_attributes id="fence-uc-kamet-instance_attributes">
          <nvpair id="fence-uc-kamet-instance_attributes-login"
name="login" value="parallel"/>
          <nvpair id="fence-uc-kamet-instance_attributes-passwd"
name="passwd" value="wireless"/>
          <nvpair id="fence-uc-kamet-instance_attributes-ipaddr"
name="ipaddr" value="10.11.10.21"/>
          <nvpair
id="fence-uc-kamet-instance_attributes-pcmk_host_list"
name="pcmk_host_list" value="kamet"/>
          <nvpair id="fence-uc-kamet-instance_attributes-action"
name="action" value="off"/>
          <nvpair id="fence-uc-kamet-instance_attributes-lanplus"
name="lanplus" value="1"/>
          <nvpair id="fence-uc-kamet-instance_attributes-delay"
name="delay" value="0"/>
        </instance_attributes>
        <operations>
          <op id="fence-uc-kamet-start-interval-0s" interval="0s"
name="start" on-fail="restart"/>
          <op id="fence-uc-kamet-monitor-interval-5s" interval="5s"
name="monitor" on-fail="restart"/>
        </operations>
        <meta_attributes id="fence-uc-kamet-meta_attributes"/>
      </primitive>
      <primitive class="ocf" id="C-3" provider="pw" type="IPaddr">
        <instance_attributes id="C-3-instance_attributes">
          <nvpair id="C-3-instance_attributes-ip" name="ip" value="10.41.1.22"/>
          <nvpair id="C-3-instance_attributes-nic" name="nic" value="v40"/>
          <nvpair id="C-3-instance_attributes-cidr_netmask"
name="cidr_netmask" value="32"/>
          <nvpair id="C-3-instance_attributes-iflabel" name="iflabel"
value="C-3"/>
        </instance_attributes>
        <operations>
          <op id="C-3-start-timeout-20s" interval="0s" name="start"
timeout="20s"/>
          <op id="C-3-stop-timeout-20s" interval="0s" name="stop"
timeout="20s"/>
          <op id="C-3-monitor-interval-200ms" interval="200ms" name="monitor"/>
        </operations>
        <meta_attributes id="C-3-meta_attributes">
          <nvpair id="C-3-meta_attributes-failure-timeout"
name="failure-timeout" value="3s"/>
          <nvpair id="C-3-meta_attributes-migration-threshold"
name="migration-threshold" value="2"/>
        </meta_attributes>
      </primitive>
      <primitive class="ocf" id="C-FLT" provider="pw" type="IPaddr">
        <instance_attributes id="C-FLT-instance_attributes">
          <nvpair id="C-FLT-instance_attributes-ip" name="ip"
value="10.41.0.108"/>
          <nvpair id="C-FLT-instance_attributes-nic" name="nic" value="v40"/>
          <nvpair id="C-FLT-instance_attributes-cidr_netmask"
name="cidr_netmask" value="32"/>
          <nvpair id="C-FLT-instance_attributes-iflabel"
name="iflabel" value="C-FLT"/>
        </instance_attributes>
        <operations>
          <op id="C-FLT-start-timeout-20s" interval="0s" name="start"
timeout="20s"/>
          <op id="C-FLT-stop-timeout-20s" interval="0s" name="stop"
timeout="20s"/>
          <op id="C-FLT-monitor-interval-200ms" interval="200ms"
name="monitor"/>
        </operations>
        <meta_attributes id="C-FLT-meta_attributes">
          <nvpair id="C-FLT-meta_attributes-failure-timeout"
name="failure-timeout" value="3s"/>
          <nvpair id="C-FLT-meta_attributes-migration-threshold"
name="migration-threshold" value="2"/>
        </meta_attributes>
      </primitive>
      <primitive class="ocf" id="C-FLT2" provider="pw" type="IPaddr">
        <instance_attributes id="C-FLT2-instance_attributes">
          <nvpair id="C-FLT2-instance_attributes-ip" name="ip"
value="10.41.1.15"/>
          <nvpair id="C-FLT2-instance_attributes-nic" name="nic" value="v40"/>
          <nvpair id="C-FLT2-instance_attributes-cidr_netmask"
name="cidr_netmask" value="32"/>
          <nvpair id="C-FLT2-instance_attributes-iflabel"
name="iflabel" value="C-FLT2"/>
        </instance_attributes>
        <operations>
          <op id="C-FLT2-start-timeout-20s" interval="0s" name="start"
timeout="20s"/>
          <op id="C-FLT2-stop-timeout-20s" interval="0s" name="stop"
timeout="20s"/>
          <op id="C-FLT2-monitor-interval-200ms" interval="200ms"
name="monitor"/>
        </operations>
        <meta_attributes id="C-FLT2-meta_attributes">
          <nvpair id="C-FLT2-meta_attributes-failure-timeout"
name="failure-timeout" value="3s"/>
          <nvpair id="C-FLT2-meta_attributes-migration-threshold"
name="migration-threshold" value="2"/>
        </meta_attributes>
      </primitive>
      <primitive class="ocf" id="E-3" provider="pw" type="IPaddr">
        <instance_attributes id="E-3-instance_attributes">
          <nvpair id="E-3-instance_attributes-ip" name="ip" value="10.21.1.22"/>
          <nvpair id="E-3-instance_attributes-nic" name="nic" value="v20"/>
          <nvpair id="E-3-instance_attributes-cidr_netmask"
name="cidr_netmask" value="32"/>
          <nvpair id="E-3-instance_attributes-iflabel" name="iflabel"
value="E-3"/>
        </instance_attributes>
        <operations>
          <op id="E-3-start-timeout-20s" interval="0s" name="start"
timeout="20s"/>
          <op id="E-3-stop-timeout-20s" interval="0s" name="stop"
timeout="20s"/>
          <op id="E-3-monitor-interval-200ms" interval="200ms" name="monitor"/>
        </operations>
        <meta_attributes id="E-3-meta_attributes">
          <nvpair id="E-3-meta_attributes-failure-timeout"
name="failure-timeout" value="3s"/>
          <nvpair id="E-3-meta_attributes-migration-threshold"
name="migration-threshold" value="2"/>
        </meta_attributes>
      </primitive>
      <primitive class="ocf" id="MGMT-FLT" provider="pw" type="IPaddr">
        <instance_attributes id="MGMT-FLT-instance_attributes">
          <nvpair id="MGMT-FLT-instance_attributes-ip" name="ip"
value="10.61.0.227"/>
          <nvpair id="MGMT-FLT-instance_attributes-nic" name="nic"
value="eth0"/>
          <nvpair id="MGMT-FLT-instance_attributes-cidr_netmask"
name="cidr_netmask" value="32"/>
          <nvpair id="MGMT-FLT-instance_attributes-iflabel"
name="iflabel" value="MGMT-FLT"/>
        </instance_attributes>
        <operations>
          <op id="MGMT-FLT-start-timeout-20s" interval="0s"
name="start" timeout="20s"/>
          <op id="MGMT-FLT-stop-timeout-20s" interval="0s" name="stop"
timeout="20s"/>
          <op id="MGMT-FLT-monitor-interval-200ms" interval="200ms"
name="monitor"/>
        </operations>
        <meta_attributes id="MGMT-FLT-meta_attributes">
          <nvpair id="MGMT-FLT-meta_attributes-failure-timeout"
name="failure-timeout" value="3s"/>
          <nvpair id="MGMT-FLT-meta_attributes-migration-threshold"
name="migration-threshold" value="2"/>
        </meta_attributes>
      </primitive>
      <primitive class="ocf" id="M-FLT" provider="pw" type="IPaddr">
        <instance_attributes id="M-FLT-instance_attributes">
          <nvpair id="M-FLT-instance_attributes-ip" name="ip"
value="10.21.0.108"/>
          <nvpair id="M-FLT-instance_attributes-nic" name="nic" value="v20"/>
          <nvpair id="M-FLT-instance_attributes-cidr_netmask"
name="cidr_netmask" value="32"/>
          <nvpair id="M-FLT-instance_attributes-iflabel"
name="iflabel" value="M-FLT"/>
        </instance_attributes>
        <operations>
          <op id="M-FLT-start-timeout-20s" interval="0s" name="start"
timeout="20s"/>
          <op id="M-FLT-stop-timeout-20s" interval="0s" name="stop"
timeout="20s"/>
          <op id="M-FLT-monitor-interval-200ms" interval="200ms"
name="monitor"/>
        </operations>
        <meta_attributes id="M-FLT-meta_attributes">
          <nvpair id="M-FLT-meta_attributes-failure-timeout"
name="failure-timeout" value="3s"/>
          <nvpair id="M-FLT-meta_attributes-migration-threshold"
name="migration-threshold" value="2"/>
        </meta_attributes>
      </primitive>
      <primitive class="ocf" id="M-FLT2" provider="pw" type="IPaddr">
        <instance_attributes id="M-FLT2-instance_attributes">
          <nvpair id="M-FLT2-instance_attributes-ip" name="ip"
value="10.21.1.15"/>
          <nvpair id="M-FLT2-instance_attributes-nic" name="nic" value="v20"/>
          <nvpair id="M-FLT2-instance_attributes-cidr_netmask"
name="cidr_netmask" value="32"/>
          <nvpair id="M-FLT2-instance_attributes-iflabel"
name="iflabel" value="M-FLT2"/>
        </instance_attributes>
        <operations>
          <op id="M-FLT2-start-timeout-20s" interval="0s" name="start"
timeout="20s"/>
          <op id="M-FLT2-stop-timeout-20s" interval="0s" name="stop"
timeout="20s"/>
          <op id="M-FLT2-monitor-interval-200ms" interval="200ms"
name="monitor"/>
        </operations>
        <meta_attributes id="M-FLT2-meta_attributes">
          <nvpair id="M-FLT2-meta_attributes-failure-timeout"
name="failure-timeout" value="3s"/>
          <nvpair id="M-FLT2-meta_attributes-migration-threshold"
name="migration-threshold" value="2"/>
        </meta_attributes>
      </primitive>
      <primitive class="ocf" id="S-FLT" provider="pw" type="IPaddr">
        <instance_attributes id="S-FLT-instance_attributes">
          <nvpair id="S-FLT-instance_attributes-ip" name="ip"
value="10.31.0.108"/>
          <nvpair id="S-FLT-instance_attributes-nic" name="nic" value="v30"/>
          <nvpair id="S-FLT-instance_attributes-cidr_netmask"
name="cidr_netmask" value="32"/>
          <nvpair id="S-FLT-instance_attributes-iflabel"
name="iflabel" value="S-FLT"/>
        </instance_attributes>
        <operations>
          <op id="S-FLT-start-timeout-20s" interval="0s" name="start"
timeout="20s"/>
          <op id="S-FLT-stop-timeout-20s" interval="0s" name="stop"
timeout="20s"/>
          <op id="S-FLT-monitor-interval-200ms" interval="200ms"
name="monitor"/>
        </operations>
        <meta_attributes id="S-FLT-meta_attributes">
          <nvpair id="S-FLT-meta_attributes-failure-timeout"
name="failure-timeout" value="3s"/>
          <nvpair id="S-FLT-meta_attributes-migration-threshold"
name="migration-threshold" value="2"/>
        </meta_attributes>
      </primitive>
      <primitive class="ocf" id="S-FLT2" provider="pw" type="IPaddr">
        <instance_attributes id="S-FLT2-instance_attributes">
          <nvpair id="S-FLT2-instance_attributes-ip" name="ip"
value="10.31.1.15"/>
          <nvpair id="S-FLT2-instance_attributes-nic" name="nic" value="v30"/>
          <nvpair id="S-FLT2-instance_attributes-cidr_netmask"
name="cidr_netmask" value="32"/>
          <nvpair id="S-FLT2-instance_attributes-iflabel"
name="iflabel" value="S-FLT2"/>
        </instance_attributes>
        <operations>
          <op id="S-FLT2-start-timeout-20s" interval="0s" name="start"
timeout="20s"/>
          <op id="S-FLT2-stop-timeout-20s" interval="0s" name="stop"
timeout="20s"/>
          <op id="S-FLT2-monitor-interval-200ms" interval="200ms"
name="monitor"/>
        </operations>
        <meta_attributes id="S-FLT2-meta_attributes">
          <nvpair id="S-FLT2-meta_attributes-failure-timeout"
name="failure-timeout" value="3s"/>
          <nvpair id="S-FLT2-meta_attributes-migration-threshold"
name="migration-threshold" value="2"/>
        </meta_attributes>
      </primitive>
    </resources>
    <constraints>
      <rsc_colocation id="colocation-C-3-foo-master-INFINITY"
rsc="C-3" rsc-role="Started" score="INFINITY" with-rsc="foo-master"
with-rsc-role="Master"/>
      <rsc_order first="C-3" first-action="start"
id="order-C-3-foo-master-mandatory" then="foo-master"
then-action="promote"/>
      <rsc_colocation id="colocation-C-FLT-foo-master-INFINITY"
rsc="C-FLT" rsc-role="Started" score="INFINITY" with-rsc="foo-master"
with-rsc-role="Master"/>
      <rsc_order first="C-FLT" first-action="start"
id="order-C-FLT-foo-master-mandatory" then="foo-master"
then-action="promote"/>
      <rsc_colocation id="colocation-C-FLT2-foo-master-INFINITY"
rsc="C-FLT2" rsc-role="Started" score="INFINITY" with-rsc="foo-master"
with-rsc-role="Master"/>
      <rsc_order first="C-FLT2" first-action="start"
id="order-C-FLT2-foo-master-mandatory" then="foo-master"
then-action="promote"/>
      <rsc_colocation id="colocation-E-3-foo-master-INFINITY"
rsc="E-3" rsc-role="Started" score="INFINITY" with-rsc="foo-master"
with-rsc-role="Master"/>
      <rsc_order first="E-3" first-action="start"
id="order-E-3-foo-master-mandatory" then="foo-master"
then-action="promote"/>
      <rsc_colocation id="colocation-MGMT-FLT-foo-master-INFINITY"
rsc="MGMT-FLT" rsc-role="Started" score="INFINITY"
with-rsc="foo-master" with-rsc-role="Master"/>
      <rsc_order first="MGMT-FLT" first-action="start"
id="order-MGMT-FLT-foo-master-mandatory" then="foo-master"
then-action="promote"/>
      <rsc_colocation id="colocation-M-FLT-foo-master-INFINITY"
rsc="M-FLT" rsc-role="Started" score="INFINITY" with-rsc="foo-master"
with-rsc-role="Master"/>
      <rsc_order first="M-FLT" first-action="start"
id="order-M-FLT-foo-master-mandatory" then="foo-master"
then-action="promote"/>
      <rsc_colocation id="colocation-M-FLT2-foo-master-INFINITY"
rsc="M-FLT2" rsc-role="Started" score="INFINITY" with-rsc="foo-master"
with-rsc-role="Master"/>
      <rsc_order first="M-FLT2" first-action="start"
id="order-M-FLT2-foo-master-mandatory" then="foo-master"
then-action="promote"/>
      <rsc_colocation id="colocation-S-FLT-foo-master-INFINITY"
rsc="S-FLT" rsc-role="Started" score="INFINITY" with-rsc="foo-master"
with-rsc-role="Master"/>
      <rsc_order first="S-FLT" first-action="start"
id="order-S-FLT-foo-master-mandatory" then="foo-master"
then-action="promote"/>
      <rsc_colocation id="colocation-S-FLT2-foo-master-INFINITY"
rsc="S-FLT2" rsc-role="Started" score="INFINITY" with-rsc="foo-master"
with-rsc-role="Master"/>
      <rsc_order first="S-FLT2" first-action="start"
id="order-S-FLT2-foo-master-mandatory" then="foo-master"
then-action="promote"/>
      <rsc_colocation
id="colocation-fence-uc-orana-foo-master-INFINITY"
rsc="fence-uc-orana" rsc-role="Started" score="INFINITY"
with-rsc="foo-master" with-rsc-role="Master"/>
      <rsc_colocation
id="colocation-fence-uc-kamet-foo-master-INFINITY"
rsc="fence-uc-kamet" rsc-role="Started" score="INFINITY"
with-rsc="foo-master" with-rsc-role="Master"/>
      <rsc_order first="fence-uc-kamet" first-action="start"
id="order-fence-uc-kamet-foo-master-mandatory" then="foo-master"
then-action="promote"/>
      <rsc_order first="fence-uc-orana" first-action="start"
id="order-fence-uc-orana-foo-master-mandatory" then="foo-master"
then-action="promote"/>
    </constraints>
  </configuration>
  <status>
    <node_state id="kamet" uname="kamet" in_ccm="false" crmd="offline"
crm-debug-origin="send_stonith_update" join="down" expected="down"/>
    <node_state id="orana" uname="orana" in_ccm="true" crmd="online"
crm-debug-origin="do_update_resource" join="member" expected="member">
      <lrm id="orana">
        <lrm_resources>
          <lrm_resource id="fence-uc-kamet" type="fence_ilo4" class="stonith">
            <lrm_rsc_op id="fence-uc-kamet_last_0"
operation_key="fence-uc-kamet_start_0" operation="start"
crm-debug-origin="do_update_resource" crm_feature_set="3.0.9"
transition-key="13:2304:0:70de24fe-0f17-47ec-a193-a2cdd9f9e5c7"
transition-magic="0:0;13:2304:0:70de24fe-0f17-47ec-a193-a2cdd9f9e5c7"
call-id="131" rc-code="0" op-status="0" interval="0"
last-run="1452731302" last-rc-change="1452731302" exec-time="80"
queue-time="0" op-digest="93013a04ac64e00066a4f7b080b31515"
on_node="orana"/>
            <lrm_rsc_op id="fence-uc-kamet_monitor_5000"
operation_key="fence-uc-kamet_monitor_5000" operation="monitor"
crm-debug-origin="do_update_resource" crm_feature_set="3.0.9"
transition-key="1:2304:0:70de24fe-0f17-47ec-a193-a2cdd9f9e5c7"
transition-magic="0:0;1:2304:0:70de24fe-0f17-47ec-a193-a2cdd9f9e5c7"
call-id="132" rc-code="0" op-status="0" interval="5000"
last-rc-change="1452731303" exec-time="128" queue-time="0"
op-digest="e884a8e46b98e8563278b8117eb4db82" on_node="orana"/>
          </lrm_resource>
          <lrm_resource id="foo" type="uc" class="ocf" provider="pw">
            <lrm_rsc_op id="foo_last_0" operation_key="foo_promote_0"
operation="promote" crm-debug-origin="do_update_resource"
crm_feature_set="3.0.9"
transition-key="19:2304:0:70de24fe-0f17-47ec-a193-a2cdd9f9e5c7"
transition-magic="0:0;19:2304:0:70de24fe-0f17-47ec-a193-a2cdd9f9e5c7"
call-id="133" rc-code="0" op-status="0" interval="0"
last-run="1452731303" last-rc-change="1452731303" exec-time="51"
queue-time="0" op-digest="96d68f528fd950fa93acae8f44e75df5"
on_node="orana"/>
            <lrm_rsc_op id="foo_monitor_10000"
operation_key="foo_monitor_10000" operation="monitor"
crm-debug-origin="do_update_resource" crm_feature_set="3.0.9"
transition-key="20:2304:8:70de24fe-0f17-47ec-a193-a2cdd9f9e5c7"
transition-magic="0:8;20:2304:8:70de24fe-0f17-47ec-a193-a2cdd9f9e5c7"
call-id="135" rc-code="8" op-status="0" interval="10000"
last-rc-change="1452731303" exec-time="39" queue-time="1"
op-digest="f4e31338d1a8837389f1948c9c05d8a8" on_node="orana"/>
          </lrm_resource>
          <lrm_resource id="C-3" type="IPaddr" class="ocf" provider="pw">
            <lrm_rsc_op id="C-3_last_0" operation_key="C-3_start_0"
operation="start" crm-debug-origin="do_update_resource"
crm_feature_set="3.0.9"
transition-key="36:2301:0:70de24fe-0f17-47ec-a193-a2cdd9f9e5c7"
transition-magic="0:0;36:2301:0:70de24fe-0f17-47ec-a193-a2cdd9f9e5c7"
call-id="96" rc-code="0" op-status="0" interval="0"
last-run="1452731294" last-rc-change="1452731294" exec-time="123"
queue-time="0" op-digest="8dd2987e2ec7dfb4c2955839593b8e9f"
on_node="orana"/>
            <lrm_rsc_op id="C-3_monitor_200"
operation_key="C-3_monitor_200" operation="monitor"
crm-debug-origin="do_update_resource" crm_feature_set="3.0.9"
transition-key="37:2301:0:70de24fe-0f17-47ec-a193-a2cdd9f9e5c7"
transition-magic="0:0;37:2301:0:70de24fe-0f17-47ec-a193-a2cdd9f9e5c7"
call-id="113" rc-code="0" op-status="0" interval="200"
last-rc-change="1452731295" exec-time="93" queue-time="0"
op-digest="8ede7902b002a044cd67ffc05a799bdc" on_node="orana"/>
          </lrm_resource>
          <lrm_resource id="C-FLT" type="IPaddr" class="ocf" provider="pw">
            <lrm_rsc_op id="C-FLT_last_0"
operation_key="C-FLT_start_0" operation="start"
crm-debug-origin="do_update_resource" crm_feature_set="3.0.9"
transition-key="38:2301:0:70de24fe-0f17-47ec-a193-a2cdd9f9e5c7"
transition-magic="0:0;38:2301:0:70de24fe-0f17-47ec-a193-a2cdd9f9e5c7"
call-id="97" rc-code="0" op-status="0" interval="0"
last-run="1452731294" last-rc-change="1452731294" exec-time="111"
queue-time="0" op-digest="789f55f594a2d2a911e2c1b9a41e103d"
on_node="orana"/>
            <lrm_rsc_op id="C-FLT_monitor_200"
operation_key="C-FLT_monitor_200" operation="monitor"
crm-debug-origin="do_update_resource" crm_feature_set="3.0.9"
transition-key="39:2301:0:70de24fe-0f17-47ec-a193-a2cdd9f9e5c7"
transition-magic="0:0;39:2301:0:70de24fe-0f17-47ec-a193-a2cdd9f9e5c7"
call-id="108" rc-code="0" op-status="0" interval="200"
last-rc-change="1452731295" exec-time="75" queue-time="0"
op-digest="b9db07a00f66e57a44226b503d43889f" on_node="orana"/>
          </lrm_resource>
          <lrm_resource id="C-FLT2" type="IPaddr" class="ocf" provider="pw">
            <lrm_rsc_op id="C-FLT2_last_0"
operation_key="C-FLT2_start_0" operation="start"
crm-debug-origin="do_update_resource" crm_feature_set="3.0.9"
transition-key="40:2301:0:70de24fe-0f17-47ec-a193-a2cdd9f9e5c7"
transition-magic="0:0;40:2301:0:70de24fe-0f17-47ec-a193-a2cdd9f9e5c7"
call-id="98" rc-code="0" op-status="0" interval="0"
last-run="1452731294" last-rc-change="1452731294" exec-time="113"
queue-time="0" op-digest="236ab7fb7ace80f0eebd8edf4bd9224b"
on_node="orana"/>
            <lrm_rsc_op id="C-FLT2_monitor_200"
operation_key="C-FLT2_monitor_200" operation="monitor"
crm-debug-origin="do_update_resource" crm_feature_set="3.0.9"
transition-key="41:2301:0:70de24fe-0f17-47ec-a193-a2cdd9f9e5c7"
transition-magic="0:0;41:2301:0:70de24fe-0f17-47ec-a193-a2cdd9f9e5c7"
call-id="109" rc-code="0" op-status="0" interval="200"
last-rc-change="1452731295" exec-time="89" queue-time="0"
op-digest="b6c51881a4e04213406b712d950fe34c" on_node="orana"/>
          </lrm_resource>
          <lrm_resource id="E-3" type="IPaddr" class="ocf" provider="pw">
            <lrm_rsc_op id="E-3_last_0" operation_key="E-3_start_0"
operation="start" crm-debug-origin="do_update_resource"
crm_feature_set="3.0.9"
transition-key="42:2301:0:70de24fe-0f17-47ec-a193-a2cdd9f9e5c7"
transition-magic="0:0;42:2301:0:70de24fe-0f17-47ec-a193-a2cdd9f9e5c7"
call-id="99" rc-code="0" op-status="0" interval="0"
last-run="1452731294" last-rc-change="1452731294" exec-time="126"
queue-time="0" op-digest="f3631c868f7539a7fb24804bfccd48f8"
on_node="orana"/>
            <lrm_rsc_op id="E-3_monitor_200"
operation_key="E-3_monitor_200" operation="monitor"
crm-debug-origin="do_update_resource" crm_feature_set="3.0.9"
transition-key="43:2301:0:70de24fe-0f17-47ec-a193-a2cdd9f9e5c7"
transition-magic="0:0;43:2301:0:70de24fe-0f17-47ec-a193-a2cdd9f9e5c7"
call-id="114" rc-code="0" op-status="0" interval="200"
last-rc-change="1452731295" exec-time="75" queue-time="0"
op-digest="a99e7c4d084a719699ad5bd04237e8d2" on_node="orana"/>
          </lrm_resource>
          <lrm_resource id="MGMT-FLT" type="IPaddr" class="ocf" provider="pw">
            <lrm_rsc_op id="MGMT-FLT_last_0"
operation_key="MGMT-FLT_start_0" operation="start"
crm-debug-origin="do_update_resource" crm_feature_set="3.0.9"
transition-key="44:2301:0:70de24fe-0f17-47ec-a193-a2cdd9f9e5c7"
transition-magic="0:0;44:2301:0:70de24fe-0f17-47ec-a193-a2cdd9f9e5c7"
call-id="100" rc-code="0" op-status="0" interval="0"
last-run="1452731294" last-rc-change="1452731294" exec-time="114"
queue-time="0" op-digest="a78c07f01d90a62d8f139658bc7ac3c9"
on_node="orana"/>
            <lrm_rsc_op id="MGMT-FLT_monitor_200"
operation_key="MGMT-FLT_monitor_200" operation="monitor"
crm-debug-origin="do_update_resource" crm_feature_set="3.0.9"
transition-key="45:2301:0:70de24fe-0f17-47ec-a193-a2cdd9f9e5c7"
transition-magic="0:0;45:2301:0:70de24fe-0f17-47ec-a193-a2cdd9f9e5c7"
call-id="110" rc-code="0" op-status="0" interval="200"
last-rc-change="1452731295" exec-time="94" queue-time="0"
op-digest="72c18513de4f6b9105d664dd90d265e8" on_node="orana"/>
          </lrm_resource>
          <lrm_resource id="M-FLT" type="IPaddr" class="ocf" provider="pw">
            <lrm_rsc_op id="M-FLT_last_0"
operation_key="M-FLT_start_0" operation="start"
crm-debug-origin="do_update_resource" crm_feature_set="3.0.9"
transition-key="46:2301:0:70de24fe-0f17-47ec-a193-a2cdd9f9e5c7"
transition-magic="0:0;46:2301:0:70de24fe-0f17-47ec-a193-a2cdd9f9e5c7"
call-id="101" rc-code="0" op-status="0" interval="0"
last-run="1452731294" last-rc-change="1452731294" exec-time="122"
queue-time="0" op-digest="77947e7e39ef2e8073312201deb6d7e7"
on_node="orana"/>
            <lrm_rsc_op id="M-FLT_monitor_200"
operation_key="M-FLT_monitor_200" operation="monitor"
crm-debug-origin="do_update_resource" crm_feature_set="3.0.9"
transition-key="47:2301:0:70de24fe-0f17-47ec-a193-a2cdd9f9e5c7"
transition-magic="0:0;47:2301:0:70de24fe-0f17-47ec-a193-a2cdd9f9e5c7"
call-id="115" rc-code="0" op-status="0" interval="200"
last-rc-change="1452731295" exec-time="80" queue-time="0"
op-digest="a61b31bfa2661c96d41eed752b748f8a" on_node="orana"/>
          </lrm_resource>
          <lrm_resource id="M-FLT2" type="IPaddr" class="ocf" provider="pw">
            <lrm_rsc_op id="M-FLT2_last_0"
operation_key="M-FLT2_start_0" operation="start"
crm-debug-origin="do_update_resource" crm_feature_set="3.0.9"
transition-key="48:2301:0:70de24fe-0f17-47ec-a193-a2cdd9f9e5c7"
transition-magic="0:0;48:2301:0:70de24fe-0f17-47ec-a193-a2cdd9f9e5c7"
call-id="102" rc-code="0" op-status="0" interval="0"
last-run="1452731294" last-rc-change="1452731294" exec-time="112"
queue-time="0" op-digest="72fcbee579c9168ec7a8b329c0b1e8cc"
on_node="orana"/>
            <lrm_rsc_op id="M-FLT2_monitor_200"
operation_key="M-FLT2_monitor_200" operation="monitor"
crm-debug-origin="do_update_resource" crm_feature_set="3.0.9"
transition-key="49:2301:0:70de24fe-0f17-47ec-a193-a2cdd9f9e5c7"
transition-magic="0:0;49:2301:0:70de24fe-0f17-47ec-a193-a2cdd9f9e5c7"
call-id="111" rc-code="0" op-status="0" interval="200"
last-rc-change="1452731295" exec-time="89" queue-time="0"
op-digest="8ee1fce96d23a284b277c3b8fdee8b95" on_node="orana"/>
          </lrm_resource>
          <lrm_resource id="S-FLT" type="IPaddr" class="ocf" provider="pw">
            <lrm_rsc_op id="S-FLT_last_0"
operation_key="S-FLT_start_0" operation="start"
crm-debug-origin="do_update_resource" crm_feature_set="3.0.9"
transition-key="50:2301:0:70de24fe-0f17-47ec-a193-a2cdd9f9e5c7"
transition-magic="0:0;50:2301:0:70de24fe-0f17-47ec-a193-a2cdd9f9e5c7"
call-id="103" rc-code="0" op-status="0" interval="0"
last-run="1452731294" last-rc-change="1452731294" exec-time="123"
queue-time="0" op-digest="6d479331421af055b010c182b992957c"
on_node="orana"/>
            <lrm_rsc_op id="S-FLT_monitor_200"
operation_key="S-FLT_monitor_200" operation="monitor"
crm-debug-origin="do_update_resource" crm_feature_set="3.0.9"
transition-key="51:2301:0:70de24fe-0f17-47ec-a193-a2cdd9f9e5c7"
transition-magic="0:0;51:2301:0:70de24fe-0f17-47ec-a193-a2cdd9f9e5c7"
call-id="116" rc-code="0" op-status="0" interval="200"
last-rc-change="1452731295" exec-time="91" queue-time="0"
op-digest="a88069f185532a1f62424b120846ff71" on_node="orana"/>
          </lrm_resource>
          <lrm_resource id="S-FLT2" type="IPaddr" class="ocf" provider="pw">
            <lrm_rsc_op id="S-FLT2_last_0"
operation_key="S-FLT2_start_0" operation="start"
crm-debug-origin="do_update_resource" crm_feature_set="3.0.9"
transition-key="52:2301:0:70de24fe-0f17-47ec-a193-a2cdd9f9e5c7"
transition-magic="0:0;52:2301:0:70de24fe-0f17-47ec-a193-a2cdd9f9e5c7"
call-id="104" rc-code="0" op-status="0" interval="0"
last-run="1452731294" last-rc-change="1452731294" exec-time="112"
queue-time="0" op-digest="d0aa64d1a80785f83fca3ed7a7b58527"
on_node="orana"/>
            <lrm_rsc_op id="S-FLT2_monitor_200"
operation_key="S-FLT2_monitor_200" operation="monitor"
crm-debug-origin="do_update_resource" crm_feature_set="3.0.9"
transition-key="53:2301:0:70de24fe-0f17-47ec-a193-a2cdd9f9e5c7"
transition-magic="0:0;53:2301:0:70de24fe-0f17-47ec-a193-a2cdd9f9e5c7"
call-id="112" rc-code="0" op-status="0" interval="200"
last-rc-change="1452731295" exec-time="92" queue-time="0"
op-digest="5843ced8789230e395f5cb1f81e27e92" on_node="orana"/>
          </lrm_resource>
          <lrm_resource id="fence-uc-orana" type="fence_ilo4" class="stonith">
            <lrm_rsc_op id="fence-uc-orana_last_0"
operation_key="fence-uc-orana_start_0" operation="start"
crm-debug-origin="do_update_resource" crm_feature_set="3.0.9"
transition-key="14:2303:0:70de24fe-0f17-47ec-a193-a2cdd9f9e5c7"
transition-magic="0:0;14:2303:0:70de24fe-0f17-47ec-a193-a2cdd9f9e5c7"
call-id="123" rc-code="0" op-status="0" interval="0"
last-run="1452731302" last-rc-change="1452731302" exec-time="108"
queue-time="0" op-digest="363cbd5d9ec840db46e5e25e2f8b6652"
on_node="orana"/>
            <lrm_rsc_op id="fence-uc-orana_monitor_5000"
operation_key="fence-uc-orana_monitor_5000" operation="monitor"
crm-debug-origin="do_update_resource" crm_feature_set="3.0.9"
transition-key="12:2303:0:70de24fe-0f17-47ec-a193-a2cdd9f9e5c7"
transition-magic="0:0;12:2303:0:70de24fe-0f17-47ec-a193-a2cdd9f9e5c7"
call-id="124" rc-code="0" op-status="0" interval="5000"
last-rc-change="1452731302" exec-time="94" queue-time="0"
op-digest="8024392825e4a1e81b7f628e6dee69a6" on_node="orana"/>
          </lrm_resource>
        </lrm_resources>
      </lrm>
      <transient_attributes id="orana">
        <instance_attributes id="status-orana">
          <nvpair id="status-orana-shutdown" name="shutdown" value="0"/>
          <nvpair id="status-orana-probe_complete"
name="probe_complete" value="true"/>
          <nvpair id="status-orana-master-foo" name="master-foo" value="10"/>
        </instance_attributes>
      </transient_attributes>
    </node_state>
  </status>
</cib>


I am attaching the full corosync.log from orana.

The interesting parts of the log are quoted below.
Jan 13 19:32:44 corosync [TOTEM ] A processor joined or left the
membership and a new membership was formed.
Jan 13 19:32:44 corosync [QUORUM] Members[2]: 1 2
Jan 13 19:32:44 corosync [QUORUM] Members[2]: 1 2
Jan 13 19:32:44 [4296] orana       crmd:     info:
cman_event_callback: Membership 7044: quorum retained
Jan 13 19:32:44 [4296] orana       crmd:   notice:
crm_update_peer_state: cman_event_callback: Node kamet[2] - state is
now member (was lost)
Jan 13 19:32:44 [4296] orana       crmd:     info:
peer_update_callback: kamet is now member (was lost)
Jan 13 19:32:44 [4291] orana        cib:     info:
cib_process_request: Forwarding cib_modify operation for section
status to master (origin=local/crmd/2482)
Jan 13 19:32:44 [4296] orana       crmd:     info: crm_cs_flush: Sent
0 CPG messages  (1 remaining, last=122): Try again (6)
Jan 13 19:32:44 [4291] orana        cib:     info: crm_cs_flush: Sent
0 CPG messages  (1 remaining, last=240): Try again (6)
Jan 13 19:32:44 [4296] orana       crmd:     info:
cman_event_callback: Membership 7044: quorum retained
Jan 13 19:32:44 [4291] orana        cib:     info:
cib_process_request: Forwarding cib_modify operation for section nodes
to master (origin=local/crmd/2483)
Jan 13 19:32:44 [4291] orana        cib:     info:
cib_process_request: Forwarding cib_modify operation for section
status to master (origin=local/crmd/2484)
Jan 13 19:32:44 [4291] orana        cib:     info:
cib_process_request: Forwarding cib_modify operation for section nodes
to master (origin=local/crmd/2485)
Jan 13 19:32:44 [4291] orana        cib:     info:
cib_process_request: Forwarding cib_modify operation for section
status to master (origin=local/crmd/2486)
Jan 13 19:32:44 corosync [CPG   ] chosen downlist: sender r(0)
ip(7.7.7.1) ; members(old:1 left:0)
Jan 13 19:32:44 corosync [MAIN  ] Completed service synchronization,
ready to provide service.
Jan 13 19:32:44 [4291] orana        cib:     info: crm_cs_flush: Sent
5 CPG messages  (0 remaining, last=245): OK (1)
Jan 13 19:32:44 [4296] orana       crmd:     info: crm_cs_flush: Sent
2 CPG messages  (0 remaining, last=124): OK (1)
Jan 13 19:32:44 [4291] orana        cib:     info: cib_perform_op:
Diff: --- 0.72.7 2
Jan 13 19:32:44 [4291] orana        cib:     info: cib_perform_op:
Diff: +++ 0.72.8 (null)
Jan 13 19:32:44 [4291] orana        cib:     info: cib_perform_op: +
/cib:  @num_updates=8
Jan 13 19:32:44 [4291] orana        cib:     info: cib_perform_op: +
/cib/status/node_state[@id='kamet']:
@crm-debug-origin=peer_update_callback
Jan 13 19:32:44 [4291] orana        cib:     info:
cib_process_request: Completed cib_modify operation for section
status: OK (rc=0, origin=orana/crmd/2482, version=0.72.8)
Jan 13 19:32:44 [4291] orana        cib:     info:
cib_process_request: Completed cib_modify operation for section nodes:
OK (rc=0, origin=orana/crmd/2483, version=0.72.8)
Jan 13 19:32:44 [4291] orana        cib:     info: cib_perform_op:
Diff: --- 0.72.8 2
Jan 13 19:32:44 [4291] orana        cib:     info: cib_perform_op:
Diff: +++ 0.72.9 (null)
Jan 13 19:32:44 [4291] orana        cib:     info: cib_perform_op: +
/cib:  @num_updates=9
Jan 13 19:32:44 [4291] orana        cib:     info: cib_perform_op: +
/cib/status/node_state[@id='kamet']:  @in_ccm=true,
@crm-debug-origin=post_cache_update
Jan 13 19:32:44 [4291] orana        cib:     info: cib_perform_op: +
/cib/status/node_state[@id='orana']:
@crm-debug-origin=post_cache_update
Jan 13 19:32:44 [4291] orana        cib:     info:
cib_process_request: Completed cib_modify operation for section
status: OK (rc=0, origin=orana/crmd/2484, version=0.72.9)
Jan 13 19:32:44 [4291] orana        cib:     info:
cib_process_request: Completed cib_modify operation for section nodes:
OK (rc=0, origin=orana/crmd/2485, version=0.72.9)
Jan 13 19:32:44 [4291] orana        cib:     info:
cib_process_request: Completed cib_modify operation for section
status: OK (rc=0, origin=orana/crmd/2486, version=0.72.9)
Jan 13 19:32:49 [4291] orana        cib:     info: cib_process_ping:
Reporting our current digest to orana:
9cc493f69fca04421704855b29514829 for 0.72.9 (0x147cad0 0)
Jan 13 19:32:53 [4296] orana       crmd:     info: crm_timer_popped:
PEngine Recheck Timer (I_PE_CALC) just popped (30000ms)
Jan 13 19:32:53 [4296] orana       crmd:   notice:
do_state_transition: State transition S_IDLE -> S_POLICY_ENGINE [
input=I_PE_CALC cause=C_TIMER_POPPED origin=crm_timer_popped ]
Jan 13 19:32:53 [4296] orana       crmd:     info:
do_state_transition: Progressed to state S_POLICY_ENGINE after
C_TIMER_POPPED
Jan 13 19:32:53 [4295] orana    pengine:   notice: unpack_config: On
loss of CCM Quorum: Ignore
Jan 13 19:32:53 [4295] orana    pengine:     info:
determine_online_status_fencing: - Node kamet is not ready to run
resources
Jan 13 19:32:53 [4295] orana    pengine:     info:
determine_online_status: Node kamet is pending
Jan 13 19:32:53 [4295] orana    pengine:     info:
determine_online_status_fencing: Node orana is active
Jan 13 19:32:53 [4295] orana    pengine:     info:
determine_online_status: Node orana is online
Jan 13 19:32:53 [4295] orana    pengine:     info: clone_print:
Master/Slave Set: foo-master [foo]
Jan 13 19:32:53 [4295] orana    pengine:     info: short_print:
Masters: [ orana ]
Jan 13 19:32:53 [4295] orana    pengine:     info: short_print:
Stopped: [ kamet ]
Jan 13 19:32:53 [4295] orana    pengine:     info: native_print:
fence-uc-orana (stonith:fence_ilo4): Started orana
Jan 13 19:32:53 [4295] orana    pengine:     info: native_print:
fence-uc-kamet (stonith:fence_ilo4): Started orana
Jan 13 19:32:53 [4295] orana    pengine:     info: native_print: C-3
(ocf::pw:IPaddr): Started orana
Jan 13 19:32:53 [4295] orana    pengine:     info: native_print: C-FLT
(ocf::pw:IPaddr): Started orana
Jan 13 19:32:53 [4295] orana    pengine:     info: native_print:
C-FLT2 (ocf::pw:IPaddr): Started orana
Jan 13 19:32:53 [4295] orana    pengine:     info: native_print: E-3
(ocf::pw:IPaddr): Started orana
Jan 13 19:32:53 [4295] orana    pengine:     info: native_print:
MGMT-FLT (ocf::pw:IPaddr): Started orana
Jan 13 19:32:53 [4295] orana    pengine:     info: native_print: M-FLT
(ocf::pw:IPaddr): Started orana
Jan 13 19:32:53 [4295] orana    pengine:     info: native_print:
M-FLT2 (ocf::pw:IPaddr): Started orana
Jan 13 19:32:53 [4295] orana    pengine:     info: native_print: S-FLT
(ocf::pw:IPaddr): Started orana
Jan 13 19:32:53 [4295] orana    pengine:     info: native_print:
S-FLT2 (ocf::pw:IPaddr): Started orana
Jan 13 19:32:53 [4295] orana    pengine:     info: native_color:
Resource foo:1 cannot run anywhere
Jan 13 19:32:53 [4295] orana    pengine:     info: master_color:
Promoting foo:0 (Master orana)
Jan 13 19:32:53 [4295] orana    pengine:     info: master_color:
foo-master: Promoted 1 instances of a possible 1 to master
Jan 13 19:32:53 [4295] orana    pengine:     info: probe_resources:
Action probe_complete-kamet on kamet is unrunnable (pending)
Jan 13 19:32:53 [4295] orana    pengine:  warning: custom_action:
Action foo:0_monitor_0 on kamet is unrunnable (pending)
Jan 13 19:32:53 [4295] orana    pengine:  warning: custom_action:
Action fence-uc-orana_monitor_0 on kamet is unrunnable (pending)
Jan 13 19:32:53 [4295] orana    pengine:  warning: custom_action:
Action fence-uc-kamet_monitor_0 on kamet is unrunnable (pending)
Jan 13 19:32:53 [4295] orana    pengine:  warning: custom_action:
Action C-3_monitor_0 on kamet is unrunnable (pending)
Jan 13 19:32:53 [4295] orana    pengine:  warning: custom_action:
Action C-FLT_monitor_0 on kamet is unrunnable (pending)
Jan 13 19:32:53 [4295] orana    pengine:  warning: custom_action:
Action C-FLT2_monitor_0 on kamet is unrunnable (pending)
Jan 13 19:32:53 [4295] orana    pengine:  warning: custom_action:
Action E-3_monitor_0 on kamet is unrunnable (pending)
Jan 13 19:32:53 [4295] orana    pengine:  warning: custom_action:
Action MGMT-FLT_monitor_0 on kamet is unrunnable (pending)
Jan 13 19:32:53 [4295] orana    pengine:  warning: custom_action:
Action M-FLT_monitor_0 on kamet is unrunnable (pending)
Jan 13 19:32:53 [4295] orana    pengine:  warning: custom_action:
Action M-FLT2_monitor_0 on kamet is unrunnable (pending)
Jan 13 19:32:53 [4295] orana    pengine:  warning: custom_action:
Action S-FLT_monitor_0 on kamet is unrunnable (pending)
Jan 13 19:32:53 [4295] orana    pengine:  warning: custom_action:
Action S-FLT2_monitor_0 on kamet is unrunnable (pending)
Jan 13 19:32:53 [4295] orana    pengine:     info: LogActions: Leave
foo:0 (Master orana)
Jan 13 19:32:53 [4295] orana    pengine:     info: LogActions: Leave
foo:1 (Stopped)
Jan 13 19:32:53 [4295] orana    pengine:     info: LogActions: Leave
fence-uc-orana (Started orana)
Jan 13 19:32:53 [4295] orana    pengine:     info: LogActions: Leave
fence-uc-kamet (Started orana)
Jan 13 19:32:53 [4295] orana    pengine:     info: LogActions: Leave
C-3 (Started orana)
Jan 13 19:32:53 [4295] orana    pengine:     info: LogActions: Leave
C-FLT (Started orana)
Jan 13 19:32:53 [4295] orana    pengine:     info: LogActions: Leave
C-FLT2 (Started orana)
Jan 13 19:32:53 [4295] orana    pengine:     info: LogActions: Leave
E-3 (Started orana)
Jan 13 19:32:53 [4295] orana    pengine:     info: LogActions: Leave
MGMT-FLT (Started orana)
Jan 13 19:32:53 [4295] orana    pengine:     info: LogActions: Leave
M-FLT (Started orana)
Jan 13 19:32:53 [4295] orana    pengine:     info: LogActions: Leave
M-FLT2 (Started orana)
Jan 13 19:32:53 [4295] orana    pengine:     info: LogActions: Leave
S-FLT (Started orana)
Jan 13 19:32:53 [4295] orana    pengine:     info: LogActions: Leave
S-FLT2 (Started orana)
Jan 13 19:32:53 [4295] orana    pengine:   notice: process_pe_message:
Calculated Transition 2313:
/var/lib/pacemaker/pengine/pe-input-1448.bz2
Jan 13 19:32:53 [4296] orana       crmd:     info:
do_state_transition: State transition S_POLICY_ENGINE ->
S_TRANSITION_ENGINE [ input=I_PE_SUCCESS cause=C_IPC_MESSAGE
origin=handle_response ]
Jan 13 19:32:53 [4296] orana       crmd:     info: do_te_invoke:
Processing graph 2313 (ref=pe_calc-dc-1452731573-2544) derived from
/var/lib/pacemaker/pengine/pe-input-1448.bz2
Jan 13 19:32:53 [4296] orana       crmd:   notice: run_graph:
Transition 2313 (Complete=0, Pending=0, Fired=0, Skipped=0,
Incomplete=0, Source=/var/lib/pacemaker/pengine/pe-input-1448.bz2):
Complete
Jan 13 19:32:53 [4296] orana       crmd:     info: do_log: FSA: Input
I_TE_SUCCESS from notify_crmd() received in state S_TRANSITION_ENGINE
Jan 13 19:32:53 [4296] orana       crmd:   notice:
do_state_transition: State transition S_TRANSITION_ENGINE -> S_IDLE [
input=I_TE_SUCCESS cause=C_FSA_INTERNAL origin=notify_crmd ]
Jan 13 19:32:59 [4292] orana stonith-ng:     info:
pcmk_cpg_membership: Joined[2.0] stonith-ng.2
Jan 13 19:32:59 [4292] orana stonith-ng:     info:
pcmk_cpg_membership: Member[2.0] stonith-ng.1
Jan 13 19:32:59 [4292] orana stonith-ng:     info:
pcmk_cpg_membership: Member[2.1] stonith-ng.2
Jan 13 19:32:59 [4292] orana stonith-ng:     info:
crm_update_peer_proc: pcmk_cpg_membership: Node kamet[2] -
corosync-cpg is now online
Jan 13 19:32:59 [4291] orana        cib:     info:
pcmk_cpg_membership: Joined[2.0] cib.2
Jan 13 19:32:59 [4291] orana        cib:     info:
pcmk_cpg_membership: Member[2.0] cib.1
Jan 13 19:32:59 [4291] orana        cib:     info:
pcmk_cpg_membership: Member[2.1] cib.2
Jan 13 19:32:59 [4291] orana        cib:     info:
crm_update_peer_proc: pcmk_cpg_membership: Node kamet[2] -
corosync-cpg is now online
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op:
Diff: --- 0.72.9 2
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op:
Diff: +++ 0.73.0 (null)
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op: --
/cib/configuration/crm_config/cluster_property_set[@id='cib-bootstrap-options']/nvpair[@id='cib-bootstrap-options-cluster-recheck-interval']
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op: --
/cib/configuration/crm_config/cluster_property_set[@id='cib-bootstrap-options']/nvpair[@id='cib-bootstrap-options-stonith-enabled']
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op: --
/cib/configuration/crm_config/cluster_property_set[@id='cib-bootstrap-options']/nvpair[@id='cib-bootstrap-options-dc-version']
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op: --
/cib/configuration/crm_config/cluster_property_set[@id='cib-bootstrap-options']/nvpair[@id='cib-bootstrap-options-cluster-infrastructure']
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op: --
/cib/configuration/crm_config/cluster_property_set[@id='cib-bootstrap-options']/nvpair[@id='cib-bootstrap-options-last-lrm-refresh']
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op: +
/cib:  @epoch=73, @num_updates=0
Jan 13 19:33:00 [4291] orana        cib:     info:
cib_process_request: Completed cib_replace operation for section
configuration: OK (rc=0, origin=kamet/cibadmin/2, version=0.73.0)
Jan 13 19:33:00 [4296] orana       crmd:     info:
abort_transition_graph: Transition aborted by deletion of
nvpair[@id='cib-bootstrap-options-cluster-recheck-interval']:
Non-status change (cib=0.73.0, source=te_update_diff:383,
path=/cib/configuration/crm_config/cluster_property_set[@id='cib-bootstrap-options']/nvpair[@id='cib-bootstrap-options-cluster-recheck-interval'],
1)
Jan 13 19:33:00 [4296] orana       crmd:   notice:
do_state_transition: State transition S_IDLE -> S_POLICY_ENGINE [
input=I_PE_CALC cause=C_FSA_INTERNAL origin=abort_transition_graph ]
Jan 13 19:33:00 [4291] orana        cib:     info: write_cib_contents:
Archived previous version as /var/lib/pacemaker/cib/cib-83.raw
Jan 13 19:33:00 [4291] orana        cib:     info: write_cib_contents:
Wrote version 0.73.0 of the CIB to disk (digest:
3e63c3f9772e422777137db356f7e41d)
Jan 13 19:33:00 [4291] orana        cib:     info: retrieveCib:
Reading cluster configuration from: /var/lib/pacemaker/cib/cib.K8qdX0
(digest: /var/lib/pacemaker/cib/cib.3INvqH)
Jan 13 19:33:00 [4295] orana    pengine:   notice: unpack_config: On
loss of CCM Quorum: Ignore
Jan 13 19:33:00 [4295] orana    pengine:     info:
determine_online_status_fencing: - Node kamet is not ready to run
resources
Jan 13 19:33:00 [4295] orana    pengine:     info:
determine_online_status: Node kamet is pending
Jan 13 19:33:00 [4295] orana    pengine:     info:
determine_online_status_fencing: Node orana is active
Jan 13 19:33:00 [4295] orana    pengine:     info:
determine_online_status: Node orana is online
Jan 13 19:33:00 [4295] orana    pengine:     info: clone_print:
Master/Slave Set: foo-master [foo]
Jan 13 19:33:00 [4295] orana    pengine:     info: short_print:
Masters: [ orana ]
Jan 13 19:33:00 [4295] orana    pengine:     info: short_print:
Stopped: [ kamet ]
Jan 13 19:33:00 [4295] orana    pengine:     info: native_print:
fence-uc-orana (stonith:fence_ilo4): Started orana
Jan 13 19:33:00 [4295] orana    pengine:     info: native_print:
fence-uc-kamet (stonith:fence_ilo4): Started orana
Jan 13 19:33:00 [4295] orana    pengine:     info: native_print: C-3
(ocf::pw:IPaddr): Started orana
Jan 13 19:33:00 [4295] orana    pengine:     info: native_print: C-FLT
(ocf::pw:IPaddr): Started orana
Jan 13 19:33:00 [4295] orana    pengine:     info: native_print:
C-FLT2 (ocf::pw:IPaddr): Started orana
Jan 13 19:33:00 [4295] orana    pengine:     info: native_print: E-3
(ocf::pw:IPaddr): Started orana
Jan 13 19:33:00 [4295] orana    pengine:     info: native_print:
MGMT-FLT (ocf::pw:IPaddr): Started orana
Jan 13 19:33:00 [4295] orana    pengine:     info: native_print: M-FLT
(ocf::pw:IPaddr): Started orana
Jan 13 19:33:00 [4295] orana    pengine:     info: native_print:
M-FLT2 (ocf::pw:IPaddr): Started orana
Jan 13 19:33:00 [4295] orana    pengine:     info: native_print: S-FLT
(ocf::pw:IPaddr): Started orana
Jan 13 19:33:00 [4295] orana    pengine:     info: native_print:
S-FLT2 (ocf::pw:IPaddr): Started orana
Jan 13 19:33:00 [4295] orana    pengine:     info: native_color:
Resource foo:1 cannot run anywhere
Jan 13 19:33:00 [4295] orana    pengine:     info: master_color:
Promoting foo:0 (Master orana)
Jan 13 19:33:00 [4295] orana    pengine:     info: master_color:
foo-master: Promoted 1 instances of a possible 1 to master
Jan 13 19:33:00 [4295] orana    pengine:     info: probe_resources:
Action probe_complete-kamet on kamet is unrunnable (pending)
Jan 13 19:33:00 [4295] orana    pengine:  warning: custom_action:
Action foo:0_monitor_0 on kamet is unrunnable (pending)
Jan 13 19:33:00 [4295] orana    pengine:  warning: custom_action:
Action fence-uc-orana_monitor_0 on kamet is unrunnable (pending)
Jan 13 19:33:00 [4295] orana    pengine:  warning: custom_action:
Action fence-uc-kamet_monitor_0 on kamet is unrunnable (pending)
Jan 13 19:33:00 [4295] orana    pengine:  warning: custom_action:
Action C-3_monitor_0 on kamet is unrunnable (pending)
Jan 13 19:33:00 [4295] orana    pengine:  warning: custom_action:
Action C-FLT_monitor_0 on kamet is unrunnable (pending)
Jan 13 19:33:00 [4295] orana    pengine:  warning: custom_action:
Action C-FLT2_monitor_0 on kamet is unrunnable (pending)
Jan 13 19:33:00 [4295] orana    pengine:  warning: custom_action:
Action E-3_monitor_0 on kamet is unrunnable (pending)
Jan 13 19:33:00 [4295] orana    pengine:  warning: custom_action:
Action MGMT-FLT_monitor_0 on kamet is unrunnable (pending)
Jan 13 19:33:00 [4295] orana    pengine:  warning: custom_action:
Action M-FLT_monitor_0 on kamet is unrunnable (pending)
Jan 13 19:33:00 [4295] orana    pengine:  warning: custom_action:
Action M-FLT2_monitor_0 on kamet is unrunnable (pending)
Jan 13 19:33:00 [4295] orana    pengine:  warning: custom_action:
Action S-FLT_monitor_0 on kamet is unrunnable (pending)
Jan 13 19:33:00 [4295] orana    pengine:  warning: custom_action:
Action S-FLT2_monitor_0 on kamet is unrunnable (pending)
Jan 13 19:33:00 [4295] orana    pengine:     info: LogActions: Leave
foo:0 (Master orana)
Jan 13 19:33:00 [4295] orana    pengine:     info: LogActions: Leave
foo:1 (Stopped)
Jan 13 19:33:00 [4295] orana    pengine:     info: LogActions: Leave
fence-uc-orana (Started orana)
Jan 13 19:33:00 [4295] orana    pengine:     info: LogActions: Leave
fence-uc-kamet (Started orana)
Jan 13 19:33:00 [4295] orana    pengine:     info: LogActions: Leave
C-3 (Started orana)
Jan 13 19:33:00 [4295] orana    pengine:     info: LogActions: Leave
C-FLT (Started orana)
Jan 13 19:33:00 [4295] orana    pengine:     info: LogActions: Leave
C-FLT2 (Started orana)
Jan 13 19:33:00 [4295] orana    pengine:     info: LogActions: Leave
E-3 (Started orana)
Jan 13 19:33:00 [4295] orana    pengine:     info: LogActions: Leave
MGMT-FLT (Started orana)
Jan 13 19:33:00 [4295] orana    pengine:     info: LogActions: Leave
M-FLT (Started orana)
Jan 13 19:33:00 [4295] orana    pengine:     info: LogActions: Leave
M-FLT2 (Started orana)
Jan 13 19:33:00 [4295] orana    pengine:     info: LogActions: Leave
S-FLT (Started orana)
Jan 13 19:33:00 [4295] orana    pengine:     info: LogActions: Leave
S-FLT2 (Started orana)
Jan 13 19:33:00 [4296] orana       crmd:     info:
do_state_transition: State transition S_POLICY_ENGINE ->
S_TRANSITION_ENGINE [ input=I_PE_SUCCESS cause=C_IPC_MESSAGE
origin=handle_response ]
Jan 13 19:33:00 [4296] orana       crmd:     info: do_te_invoke:
Processing graph 2314 (ref=pe_calc-dc-1452731580-2545) derived from
/var/lib/pacemaker/pengine/pe-input-1449.bz2
Jan 13 19:33:00 [4296] orana       crmd:   notice: run_graph:
Transition 2314 (Complete=0, Pending=0, Fired=0, Skipped=0,
Incomplete=0, Source=/var/lib/pacemaker/pengine/pe-input-1449.bz2):
Complete
Jan 13 19:33:00 [4296] orana       crmd:     info: do_log: FSA: Input
I_TE_SUCCESS from notify_crmd() received in state S_TRANSITION_ENGINE
Jan 13 19:33:00 [4296] orana       crmd:   notice:
do_state_transition: State transition S_TRANSITION_ENGINE -> S_IDLE [
input=I_TE_SUCCESS cause=C_FSA_INTERNAL origin=notify_crmd ]
Jan 13 19:33:00 [4295] orana    pengine:   notice: process_pe_message:
Calculated Transition 2314:
/var/lib/pacemaker/pengine/pe-input-1449.bz2
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op:
Diff: --- 0.73.0 2
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op:
Diff: +++ 0.74.0 (null)
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op: +
/cib:  @epoch=74
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op: ++
/cib/configuration/crm_config/cluster_property_set[@id='cib-bootstrap-options']:
 <nvpair id="cib-bootstrap-options-cluster-recheck-interval"
name="cluster-recheck-interval" value="30s"/>
Jan 13 19:33:00 [4291] orana        cib:     info:
cib_process_request: Completed cib_replace operation for section
configuration: OK (rc=0, origin=kamet/cibadmin/2, version=0.74.0)
Jan 13 19:33:00 [4296] orana       crmd:     info:
abort_transition_graph: Transition aborted by
cib-bootstrap-options-cluster-recheck-interval,
cluster-recheck-interval=30s: Non-status change (create cib=0.74.0,
source=te_update_diff:383,
path=/cib/configuration/crm_config/cluster_property_set[@id='cib-bootstrap-options'],
1)
Jan 13 19:33:00 [4296] orana       crmd:   notice:
do_state_transition: State transition S_IDLE -> S_POLICY_ENGINE [
input=I_PE_CALC cause=C_FSA_INTERNAL origin=abort_transition_graph ]
Jan 13 19:33:00 [4291] orana        cib:     info: write_cib_contents:
Archived previous version as /var/lib/pacemaker/cib/cib-84.raw
Jan 13 19:33:00 [4291] orana        cib:     info: write_cib_contents:
Wrote version 0.74.0 of the CIB to disk (digest:
ac133341560d33343b2410d8833adfb8)
Jan 13 19:33:00 [4291] orana        cib:     info: retrieveCib:
Reading cluster configuration from: /var/lib/pacemaker/cib/cib.yXjrc1
(digest: /var/lib/pacemaker/cib/cib.zRGbVH)
Jan 13 19:33:00 [4295] orana    pengine:   notice: unpack_config: On
loss of CCM Quorum: Ignore
Jan 13 19:33:00 [4295] orana    pengine:     info:
determine_online_status_fencing: - Node kamet is not ready to run
resources
Jan 13 19:33:00 [4295] orana    pengine:     info:
determine_online_status: Node kamet is pending
Jan 13 19:33:00 [4295] orana    pengine:     info:
determine_online_status_fencing: Node orana is active
Jan 13 19:33:00 [4295] orana    pengine:     info:
determine_online_status: Node orana is online
Jan 13 19:33:00 [4295] orana    pengine:     info: clone_print:
Master/Slave Set: foo-master [foo]
Jan 13 19:33:00 [4295] orana    pengine:     info: short_print:
Masters: [ orana ]
Jan 13 19:33:00 [4295] orana    pengine:     info: short_print:
Stopped: [ kamet ]
Jan 13 19:33:00 [4295] orana    pengine:     info: native_print:
fence-uc-orana (stonith:fence_ilo4): Started orana
Jan 13 19:33:00 [4295] orana    pengine:     info: native_print:
fence-uc-kamet (stonith:fence_ilo4): Started orana
Jan 13 19:33:00 [4295] orana    pengine:     info: native_print: C-3
(ocf::pw:IPaddr): Started orana
Jan 13 19:33:00 [4295] orana    pengine:     info: native_print: C-FLT
(ocf::pw:IPaddr): Started orana
Jan 13 19:33:00 [4295] orana    pengine:     info: native_print:
C-FLT2 (ocf::pw:IPaddr): Started orana
Jan 13 19:33:00 [4295] orana    pengine:     info: native_print: E-3
(ocf::pw:IPaddr): Started orana
Jan 13 19:33:00 [4295] orana    pengine:     info: native_print:
MGMT-FLT (ocf::pw:IPaddr): Started orana
Jan 13 19:33:00 [4295] orana    pengine:     info: native_print: M-FLT
(ocf::pw:IPaddr): Started orana
Jan 13 19:33:00 [4295] orana    pengine:     info: native_print:
M-FLT2 (ocf::pw:IPaddr): Started orana
Jan 13 19:33:00 [4295] orana    pengine:     info: native_print: S-FLT
(ocf::pw:IPaddr): Started orana
Jan 13 19:33:00 [4295] orana    pengine:     info: native_print:
S-FLT2 (ocf::pw:IPaddr): Started orana
Jan 13 19:33:00 [4295] orana    pengine:     info: native_color:
Resource foo:1 cannot run anywhere
Jan 13 19:33:00 [4295] orana    pengine:     info: master_color:
Promoting foo:0 (Master orana)
Jan 13 19:33:00 [4295] orana    pengine:     info: master_color:
foo-master: Promoted 1 instances of a possible 1 to master
Jan 13 19:33:00 [4295] orana    pengine:     info: probe_resources:
Action probe_complete-kamet on kamet is unrunnable (pending)
Jan 13 19:33:00 [4295] orana    pengine:  warning: custom_action:
Action foo:0_monitor_0 on kamet is unrunnable (pending)
Jan 13 19:33:00 [4295] orana    pengine:  warning: custom_action:
Action fence-uc-orana_monitor_0 on kamet is unrunnable (pending)
Jan 13 19:33:00 [4295] orana    pengine:  warning: custom_action:
Action fence-uc-kamet_monitor_0 on kamet is unrunnable (pending)
Jan 13 19:33:00 [4295] orana    pengine:  warning: custom_action:
Action C-3_monitor_0 on kamet is unrunnable (pending)
Jan 13 19:33:00 [4295] orana    pengine:  warning: custom_action:
Action C-FLT_monitor_0 on kamet is unrunnable (pending)
Jan 13 19:33:00 [4295] orana    pengine:  warning: custom_action:
Action C-FLT2_monitor_0 on kamet is unrunnable (pending)
Jan 13 19:33:00 [4295] orana    pengine:  warning: custom_action:
Action E-3_monitor_0 on kamet is unrunnable (pending)
Jan 13 19:33:00 [4295] orana    pengine:  warning: custom_action:
Action MGMT-FLT_monitor_0 on kamet is unrunnable (pending)
Jan 13 19:33:00 [4295] orana    pengine:  warning: custom_action:
Action M-FLT_monitor_0 on kamet is unrunnable (pending)
Jan 13 19:33:00 [4295] orana    pengine:  warning: custom_action:
Action M-FLT2_monitor_0 on kamet is unrunnable (pending)
Jan 13 19:33:00 [4295] orana    pengine:  warning: custom_action:
Action S-FLT_monitor_0 on kamet is unrunnable (pending)
Jan 13 19:33:00 [4295] orana    pengine:  warning: custom_action:
Action S-FLT2_monitor_0 on kamet is unrunnable (pending)
Jan 13 19:33:00 [4295] orana    pengine:     info: LogActions: Leave
foo:0 (Master orana)
Jan 13 19:33:00 [4295] orana    pengine:     info: LogActions: Leave
foo:1 (Stopped)
Jan 13 19:33:00 [4295] orana    pengine:     info: LogActions: Leave
fence-uc-orana (Started orana)
Jan 13 19:33:00 [4295] orana    pengine:     info: LogActions: Leave
fence-uc-kamet (Started orana)
Jan 13 19:33:00 [4295] orana    pengine:     info: LogActions: Leave
C-3 (Started orana)
Jan 13 19:33:00 [4295] orana    pengine:     info: LogActions: Leave
C-FLT (Started orana)
Jan 13 19:33:00 [4295] orana    pengine:     info: LogActions: Leave
C-FLT2 (Started orana)
Jan 13 19:33:00 [4295] orana    pengine:     info: LogActions: Leave
E-3 (Started orana)
Jan 13 19:33:00 [4295] orana    pengine:     info: LogActions: Leave
MGMT-FLT (Started orana)
Jan 13 19:33:00 [4295] orana    pengine:     info: LogActions: Leave
M-FLT (Started orana)
Jan 13 19:33:00 [4295] orana    pengine:     info: LogActions: Leave
M-FLT2 (Started orana)
Jan 13 19:33:00 [4295] orana    pengine:     info: LogActions: Leave
S-FLT (Started orana)
Jan 13 19:33:00 [4295] orana    pengine:     info: LogActions: Leave
S-FLT2 (Started orana)
Jan 13 19:33:00 [4296] orana       crmd:     info:
do_state_transition: State transition S_POLICY_ENGINE ->
S_TRANSITION_ENGINE [ input=I_PE_SUCCESS cause=C_IPC_MESSAGE
origin=handle_response ]
Jan 13 19:33:00 [4296] orana       crmd:     info: do_te_invoke:
Processing graph 2315 (ref=pe_calc-dc-1452731580-2546) derived from
/var/lib/pacemaker/pengine/pe-input-1450.bz2
Jan 13 19:33:00 [4296] orana       crmd:   notice: run_graph:
Transition 2315 (Complete=0, Pending=0, Fired=0, Skipped=0,
Incomplete=0, Source=/var/lib/pacemaker/pengine/pe-input-1450.bz2):
Complete
Jan 13 19:33:00 [4296] orana       crmd:     info: do_log: FSA: Input
I_TE_SUCCESS from notify_crmd() received in state S_TRANSITION_ENGINE
Jan 13 19:33:00 [4296] orana       crmd:   notice:
do_state_transition: State transition S_TRANSITION_ENGINE -> S_IDLE [
input=I_TE_SUCCESS cause=C_FSA_INTERNAL origin=notify_crmd ]
Jan 13 19:33:00 [4295] orana    pengine:   notice: process_pe_message:
Calculated Transition 2315:
/var/lib/pacemaker/pengine/pe-input-1450.bz2
Jan 13 19:33:00 [4296] orana       crmd:     info:
pcmk_cpg_membership: Joined[2.0] crmd.2
Jan 13 19:33:00 [4296] orana       crmd:     info:
pcmk_cpg_membership: Member[2.0] crmd.1
Jan 13 19:33:00 [4296] orana       crmd:     info:
pcmk_cpg_membership: Member[2.1] crmd.2
Jan 13 19:33:00 [4296] orana       crmd:     info:
crm_update_peer_proc: pcmk_cpg_membership: Node kamet[2] -
corosync-cpg is now online
Jan 13 19:33:00 [4296] orana       crmd:     info:
peer_update_callback: Client kamet/peer now has status [online]
(DC=true, changed=4000000)
Jan 13 19:33:00 [4296] orana       crmd:   notice:
do_state_transition: State transition S_IDLE -> S_INTEGRATION [
input=I_NODE_JOIN cause=C_FSA_INTERNAL origin=peer_update_callback ]
Jan 13 19:33:00 [4296] orana       crmd:     info:
do_dc_join_offer_one: An unknown node joined - (re-)offer to any
unconfirmed nodes
Jan 13 19:33:00 [4296] orana       crmd:     info: join_make_offer:
Making join offers based on membership 7044
Jan 13 19:33:00 [4296] orana       crmd:     info: join_make_offer:
join-2: Sending offer to kamet
Jan 13 19:33:00 [4291] orana        cib:     info:
cib_process_request: Forwarding cib_modify operation for section
status to master (origin=local/crmd/2492)
Jan 13 19:33:00 [4296] orana       crmd:     info:
crm_update_peer_join: join_make_offer: Node kamet[2] - join-2 phase 0
-> 1
Jan 13 19:33:00 [4296] orana       crmd:     info: join_make_offer:
Skipping orana: already known 4
Jan 13 19:33:00 [4296] orana       crmd:     info:
abort_transition_graph: Transition aborted: Peer Halt
(source=do_te_invoke:158, 1)
Jan 13 19:33:00 [4291] orana        cib:     info:
cib_process_request: Completed cib_modify operation for section nodes:
OK (rc=0, origin=kamet/crmd/3, version=0.74.0)
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op:
Diff: --- 0.74.0 2
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op:
Diff: +++ 0.74.1 (null)
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op: +
/cib:  @num_updates=1
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op: +
/cib/status/node_state[@id='kamet']:  @crmd=online,
@crm-debug-origin=peer_update_callback
Jan 13 19:33:00 [4291] orana        cib:     info:
cib_process_request: Completed cib_modify operation for section
status: OK (rc=0, origin=orana/crmd/2492, version=0.74.1)
Jan 13 19:33:00 [4291] orana        cib:     info:
cib_process_replace: Replacement 0.4.0 from kamet not applied to
0.74.1: current epoch is greater than the replacement
Jan 13 19:33:00 [4291] orana        cib:  warning:
cib_process_request: Completed cib_replace operation for section
'all': Update was older than existing configuration (rc=-205,
origin=kamet/cibadmin/2, version=0.74.1)
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op:
Diff: --- 0.74.1 2
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op:
Diff: +++ 0.75.0 (null)
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op: --
/cib/configuration/nodes/node[@id='kamet']
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op: --
/cib/configuration/nodes/node[@id='orana']
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op: --
/cib/configuration/resources/primitive[@id='fence-uc-orana']/meta_attributes[@id='fence-uc-orana-meta_attributes']
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op: --
/cib/configuration/resources/primitive[@id='fence-uc-kamet']
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op: --
/cib/configuration/resources/primitive[@id='C-3']
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op: --
/cib/configuration/resources/primitive[@id='C-FLT']
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op: --
/cib/configuration/resources/primitive[@id='C-FLT2']
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op: --
/cib/configuration/resources/primitive[@id='E-3']
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op: --
/cib/configuration/resources/primitive[@id='MGMT-FLT']
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op: --
/cib/configuration/resources/primitive[@id='M-FLT']
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op: --
/cib/configuration/resources/primitive[@id='M-FLT2']
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op: --
/cib/configuration/resources/primitive[@id='S-FLT']
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op: --
/cib/configuration/resources/primitive[@id='S-FLT2']
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op: --
/cib/configuration/constraints/rsc_colocation[@id='colocation-C-3-foo-master-INFINITY']
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op: --
/cib/configuration/constraints/rsc_order[@id='order-C-3-foo-master-mandatory']
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op: --
/cib/configuration/constraints/rsc_colocation[@id='colocation-C-FLT-foo-master-INFINITY']
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op: --
/cib/configuration/constraints/rsc_order[@id='order-C-FLT-foo-master-mandatory']
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op: --
/cib/configuration/constraints/rsc_colocation[@id='colocation-C-FLT2-foo-master-INFINITY']
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op: --
/cib/configuration/constraints/rsc_order[@id='order-C-FLT2-foo-master-mandatory']
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op: --
/cib/configuration/constraints/rsc_colocation[@id='colocation-E-3-foo-master-INFINITY']
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op: --
/cib/configuration/constraints/rsc_order[@id='order-E-3-foo-master-mandatory']
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op: --
/cib/configuration/constraints/rsc_colocation[@id='colocation-MGMT-FLT-foo-master-INFINITY']
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op: --
/cib/configuration/constraints/rsc_order[@id='order-MGMT-FLT-foo-master-mandatory']
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op: --
/cib/configuration/constraints/rsc_colocation[@id='colocation-M-FLT-foo-master-INFINITY']
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op: --
/cib/configuration/constraints/rsc_order[@id='order-M-FLT-foo-master-mandatory']
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op: --
/cib/configuration/constraints/rsc_colocation[@id='colocation-M-FLT2-foo-master-INFINITY']
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op: --
/cib/configuration/constraints/rsc_order[@id='order-M-FLT2-foo-master-mandatory']
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op: --
/cib/configuration/constraints/rsc_colocation[@id='colocation-S-FLT-foo-master-INFINITY']
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op: --
/cib/configuration/constraints/rsc_order[@id='order-S-FLT-foo-master-mandatory']
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op: --
/cib/configuration/constraints/rsc_colocation[@id='colocation-S-FLT2-foo-master-INFINITY']
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op: --
/cib/configuration/constraints/rsc_order[@id='order-S-FLT2-foo-master-mandatory']
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op: --
/cib/configuration/constraints/rsc_colocation[@id='colocation-fence-uc-orana-foo-master-INFINITY']
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op: --
/cib/configuration/constraints/rsc_colocation[@id='colocation-fence-uc-kamet-foo-master-INFINITY']
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op: --
/cib/configuration/constraints/rsc_order[@id='order-fence-uc-kamet-foo-master-mandatory']
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op: --
/cib/configuration/constraints/rsc_order[@id='order-fence-uc-orana-foo-master-mandatory']
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op: +
/cib:  @epoch=75, @num_updates=0
Jan 13 19:33:00 [4291] orana        cib:     info: cib_perform_op: +
/cib/configuration/resources/primitive[@id='fence-uc-orana']/instance_attributes[@id='fence-uc-orana-instance_attributes']/nvpair[@id='fence-uc-orana-instance_attributes-delay']:
 @value=0
Jan 13 19:33:00 [4291] orana        cib:     info:
cib_process_request: Completed cib_replace operation for section
configuration: OK (rc=0, origin=kamet/cibadmin/2, version=0.75.0)
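
The cib_replace just above is what actually removes the IP resources: it arrives with origin=kamet/cibadmin/2, so something on kamet pushed a whole "configuration" section into the CIB right as the node rejoined, and every primitive and constraint missing from that pushed copy was dropped. I am not sure yet what issued that cibadmin call on kamet. Purely as an illustration (the command and the file path are my assumption, not taken from the logs), a startup script doing a whole-section replace from a stale snapshot would produce exactly this kind of diff, whereas a merge would not:

    # hypothetical command on kamet; old-config.xml is an assumed stale copy
    cibadmin --replace --scope configuration --xml-file /some/path/old-config.xml

    # merging the change instead does not delete elements missing from the file
    cibadmin --modify --scope configuration --xml-file /some/path/new-fragment.xml
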
Jan 13 19:33:00 [4292] orana stonith-ng:     info:
stonith_device_remove: Device 'C-3' not found (2 active devices)
Jan 13 19:33:00 [4292] orana stonith-ng:     info:
stonith_device_remove: Device 'C-FLT' not found (2 active devices)
Jan 13 19:33:00 [4292] orana stonith-ng:     info:
stonith_device_remove: Device 'C-FLT2' not found (2 active devices)
Jan 13 19:33:00 [4292] orana stonith-ng:     info:
stonith_device_remove: Device 'E-3' not found (2 active devices)
Jan 13 19:33:00 [4292] orana stonith-ng:     info:
stonith_device_remove: Device 'MGMT-FLT' not found (2 active devices)
Jan 13 19:33:00 [4292] orana stonith-ng:     info:
stonith_device_remove: Device 'M-FLT' not found (2 active devices)
Jan 13 19:33:00 [4292] orana stonith-ng:     info:
stonith_device_remove: Device 'M-FLT2' not found (2 active devices)
Jan 13 19:33:00 [4292] orana stonith-ng:     info:
stonith_device_remove: Device 'S-FLT' not found (2 active devices)
Jan 13 19:33:00 [4292] orana stonith-ng:     info:
stonith_device_remove: Device 'S-FLT2' not found (2 active devices)
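
In case it helps with the analysis, the pengine input files from around this time are still on orana and can be replayed. Something like the following (crm_simulate from the pacemaker CLI; the file name is the one logged above) shows what each transition decided:

    # replay the transition calculated just before the resources disappeared
    crm_simulate --simulate --xml-file /var/lib/pacemaker/pengine/pe-input-1450.bz2
    # the same replay with allocation scores, if placement is in question
    crm_simulate --simulate --show-scores --xml-file /var/lib/pacemaker/pengine/pe-input-1450.bz2
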
Thanks in advance.

Regards
Arjun
-------------- next part --------------
A non-text attachment was scrubbed...
Name: corosync.log
Type: application/octet-stream
Size: 496222 bytes
Desc: not available
URL: <https://lists.clusterlabs.org/pipermail/users/attachments/20160114/98bb0d81/attachment-0003.obj>

