EPICS Home

Experimental Physics and Industrial Control System


 

Subject: Re: EPICS Archiver Appliance does not transfer PVs to "Being archived"
From: Michael Davidsaver <[email protected]>
To: Abdalla Ahmad <[email protected]>, "[email protected]" <[email protected]>
Date: Mon, 24 Apr 2017 15:18:27 -0400
On 04/24/2017 10:21 AM, Michael Davidsaver wrote:
> On 04/24/2017 05:27 AM, Abdalla  Ahmad wrote:
>> Hi
>>
>>  
>>
>> I used a single machine install for testing the archiver appliance as a
>> proof-of-concept on a local machine. I added 2 PVs to the system using
>> the management console, the status goes from "initial sampling" to
>> "Appliance assigned" and it just stops here where it should go to "Being
>> archived". Why is the appliance not archiving the PVs?
> 
> I suspect that the policy hook script is rejecting these PVs for some
> reason.  Have you customized this script at all?
> 

FYI, to keep track of these policy decisions I added logging to the
policy script.  This way PV accept/reject decisions (and any script
errors) are recorded.


> # -*- coding: utf-8 -*-
> 
> import logging
> from logging import handlers
> def _log():
>     L = logging.getLogger('policy')
>     if L.handlers:
>         # already configured; avoid adding duplicate handlers on re-import
>         return L
> 
>     F = logging.Formatter('%(asctime)s\t%(levelname)s\t%(message)s')
>     H = handlers.RotatingFileHandler('/var/lib/tomcat7-archappl/engine/logs/appl-policy.log',
>                                      maxBytes=1024*1024*10, backupCount=10)
>     H.setFormatter(F)
>     L.addHandler(H)
>     L.setLevel(logging.DEBUG)
>     L.propagate = False
>     return L
> _log = _log()
> 
> def getPolicyList():
>     return {
>         'default':'Keep all data',
>         'week':'Keep past 7 days',
>         'fast':'High update rate',
>     }
> 
> _level_alarm = ['HIHI','HIGH','LOW','LOLO','LOPR','HOPR','ADEL','EGU']
> _level_setting = ['DRVH','DRVL']
> _binary = ['ZNAM', 'ONAM']
> 
> _rec_files = {
>     'calc':_level_alarm,
>     'calcout':_level_alarm,
>     'ai':_level_alarm,
>     'ao':_level_alarm+_level_setting,
>     'longin':_level_alarm,
>     'longout':_level_alarm+_level_setting,
>     'bi':_binary,
>     'bo':_binary,
> }
> 
> _all_fields = list(set(_level_alarm+_level_setting+_binary))
> def getFieldsArchivedAsPartOfStream():
>     return _all_fields
> 
> _default_stores = [
>     'pb://localhost?name=STS&rootFolder=${ARCHAPPL_SHORT_TERM_FOLDER}&partitionGranularity=PARTITION_HOUR&consolidateOnShutdown=true',
>     'pb://localhost?name=MTS&rootFolder=${ARCHAPPL_MEDIUM_TERM_FOLDER}&partitionGranularity=PARTITION_DAY&hold=8&gather=1',
>     'pb://localhost?name=LTS&rootFolder=${ARCHAPPL_LONG_TERM_FOLDER}&partitionGranularity=PARTITION_YEAR',
> ]
> 
> _week_stores = [
>     'pb://localhost?name=STS&rootFolder=${ARCHAPPL_SHORT_TERM_FOLDER}&partitionGranularity=PARTITION_HOUR&consolidateOnShutdown=true',
>     'pb://localhost?name=MTS&rootFolder=${ARCHAPPL_MEDIUM_TERM_FOLDER}&partitionGranularity=PARTITION_DAY&hold=8&gather=1',
>     'blackhole://localhost',
> ]
> 
> # Inputs
> # info['pvName']
> # info['dbrtype'] - DBR type
> # info['eventRate'] - monitor updates per second
> # info['storageRate'] - bytes per second
> # info['aliasName'] -
> # info['policyName'] -
> # info['RTYP'] - Record type name (i.e. DBR_CLASS_NAME)
> #
> # Outputs
> # D['samplingPeriod'] -
> # D['samplingMethod'] - 'MONITOR' or 'SCAN' or 'DONT_ARCHIVE'
> # D['policyName'] -
> # D['dataStores'] -
> # D['archiveFields'] -
> def determinePolicy(info):
>     try:
>         return _determinePolicy(info)
>     except:
>         _log.exception('Failure in _determinePolicy')
>         return {'samplingMethod':'DONT_ARCHIVE'}
> 
> def _determinePolicy(info):
>     # guard against eventRate==0: an idle PV would otherwise raise
>     # ZeroDivisionError here and be refused by the except clause above
>     rate = max(info['eventRate'], 1e-3)
>     D = {
>         'samplingPeriod':min(1.0, 1.0/(rate*1.5)),
>         'samplingMethod':'MONITOR',
>         'policyName': info.get('policyName', 'default'),
>         'dataStores': _default_stores,
>         'archiveFields':_rec_files.get(info.get('RTYP',''), [])
>     }
> 
>     if D['policyName']=='week':
>         D['dataStores'] = _week_stores
> 
>     wf = info.get('elementCount',1)>1
>     #wf = info.get('RTYP','') in ['waveform','aai','aao','aCalcout','aSub','histogram']
>     # 2 Hz max for our 1 Hz machine; use the 'fast' policy if faster is needed:
>     threshold = 0.22 if wf else 2.2
> 
>     if info['eventRate']>threshold and D['policyName'] not in ['fast', 'week']:
>         D['samplingMethod'] = 'DONT_ARCHIVE'
>         _log.warning('Refuse to log fast %s %s', info['pvName'], info)
>         return D
> 
>     _log.info('Default log %s %s', info['pvName'], info)
> 
>     D['policyName'] = 'default'
> 
>     return D
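For anyone adapting this, the rate-threshold decision above can be exercised
standalone.  Below is a minimal sketch of just that branch; the `info` dicts
are made-up examples following the Inputs comment in the script, and logging
is omitted:

```python
# Minimal standalone sketch of the rate-threshold branch of the policy
# script quoted above; the info dicts passed in are hypothetical examples.

def sampling_method(info):
    # treat the PV as a waveform if it has more than one element
    wf = info.get('elementCount', 1) > 1
    # 2 Hz max for scalars, 0.22 Hz for waveforms under the default policy
    threshold = 0.22 if wf else 2.2
    if info['eventRate'] > threshold and info.get('policyName') not in ('fast', 'week'):
        return 'DONT_ARCHIVE'
    return 'MONITOR'

# A 10 Hz scalar PV is refused under the default policy...
print(sampling_method({'eventRate': 10.0}))                        # DONT_ARCHIVE
# ...but accepted when assigned the 'fast' policy,
print(sampling_method({'eventRate': 10.0, 'policyName': 'fast'}))  # MONITOR
# and a 1 Hz waveform exceeds the 0.22 Hz waveform threshold.
print(sampling_method({'eventRate': 1.0, 'elementCount': 1024}))   # DONT_ARCHIVE
```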


References:
EPICS Archiver Appliance does not transfer PVs to "Being archived" Abdalla Ahmad
Re: EPICS Archiver Appliance does not transfer PVs to "Being archived" Michael Davidsaver
