Any Erlang node that shares the cookie can connect to a Riak cluster, even if it isn't running Riak. This may be useful for monitoring purposes.
Generally, Riak determines which nodes are active in the cluster, from Riak's perspective, by using riak_core (and the ring), e.g. the node watcher.

However, riak_kv_entropy_manager uses the Erlang-level nodes/0 function:
https://github.com/basho/riak_kv/blob/develop-2.9/src/riak_kv_entropy_manager.erl#L1083-L1089
This causes log noise, and it is misleading log noise: it seems to indicate a problem with the aae_throttle, and it tells the operator nothing about which node actually has the problem:

2019-11-13 19:53:00.527 [info] <0.623.0>@riak_kv_entropy_manager:query_and_set_aae_throttle3:1097 Could not determine mailbox sizes for nodes: [bad_rpc_result]
2019-11-13 19:53:15.523 [info] <0.623.0>@riak_kv_entropy_manager:query_and_set_aae_throttle3:1097 Could not determine mailbox sizes for nodes: [bad_rpc_result]
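For illustration, a minimal sketch of why a non-Riak node returned by nodes/0 surfaces as an opaque bad RPC result. This is not the actual riak_kv implementation; get_aae_mailbox_size/0 is a made-up stand-in for whatever the entropy manager really calls on each node.

%% Sketch only: the remote function name below is hypothetical.
-module(aae_throttle_nodes_sketch).
-export([mailbox_sizes_via_nodes/0]).

mailbox_sizes_via_nodes() ->
    %% nodes/0 is the Erlang VM's view of the cluster: every connected node
    %% that shares the cookie, including nodes not running Riak at all
    %% (e.g. a monitoring node).
    Targets = nodes(),
    {Replies, DownNodes} =
        rpc:multicall(Targets, riak_kv_entropy_manager, get_aae_mailbox_size, []),
    %% A connected non-Riak node has no riak_kv_entropy_manager module, so its
    %% entry in Replies is a {badrpc, ...} tuple rather than a mailbox size,
    %% and the caller can only log an opaque failure.
    {Replies, DownNodes}.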
Why was nodes/0 used rather than consulting the node watcher? There is no clue in the commit log. There could be scenarios, if we were to use the node watcher, where the max throttle would no longer be applied in some failure cases.
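For comparison, the node-watcher view would look roughly like the sketch below. It assumes the standard riak_core_node_watcher:nodes/1 call, which reports the nodes on which a given service is currently up; a down node simply drops out of that list rather than producing an error, which is presumably the failure case above where the max throttle would no longer kick in.

%% Sketch only: the node watcher's view of the cluster.
-module(aae_throttle_watcher_sketch).
-export([riak_kv_nodes_up/0]).

riak_kv_nodes_up() ->
    %% Nodes on which the riak_kv service is currently registered as up.
    %% A node that goes down just disappears from this list -- there is no
    %% badrpc to react to.
    riak_core_node_watcher:nodes(riak_kv).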
riak_core_ring:all_members/1 may be better?
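A hedged sketch of that alternative, assuming the usual riak_core_ring_manager:get_my_ring/0 call to obtain the ring: ring membership includes nodes that are joined but currently unreachable, so a down member would still produce a bad RPC result and the max-throttle fallback would remain reachable.

%% Sketch only: the ring's view of cluster membership.
-module(aae_throttle_ring_sketch).
-export([cluster_members/0]).

cluster_members() ->
    %% All nodes that are members of the cluster according to the ring,
    %% whether or not they are currently reachable. An unreachable member
    %% still fails an RPC, so the bad-result handling is still exercised.
    {ok, Ring} = riak_core_ring_manager:get_my_ring(),
    riak_core_ring:all_members(Ring).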