DIRAC Grid
Home Page: http://diracgrid.org
License: GNU General Public License v3.0
[vanessa@mardirac3 ~]$ proxy-init -g dirac_user -d
Enter Certificate password:
Contacting CS...
New connection -> 127.0.0.1:9135
Checking DN /O=GRID-FR/C=FR/O=CNRS/OU=CPPM/CN=Vanessa Hamar
Username is vhamar
Creating proxy for vhamar@dirac_user (/O=GRID-FR/C=FR/O=CNRS/OU=CPPM/CN=Vanessa Hamar)
Traceback (most recent call last):
File "/home/vanessa/DIRAC-v6r0-pre3/DIRAC/FrameworkSystem/scripts/proxy-init.py", line 135, in
success = uploadProxyToDIRACProxyManager( cliParams )
File "/home/vanessa/DIRAC-v6r0-pre3/DIRAC/FrameworkSystem/scripts/proxy-init.py", line 87, in uploadProxyToDIRACProxyManager
params.debugMsg( "Uploading user pilot proxy with group %s..." % ( params.getDIRACGroup() ) )
AttributeError: CLIParams instance has no attribute 'debugMsg'
openssl >= 1.0 requires the OPENSSL_CONF environment variable to exist; otherwise it looks in the built-in directory (which does not exist when openssl is installed somewhere other than where it was compiled).
This will allow the Monitoring pages to report it, and Stalled jobs can then be properly accounted for.
It could be useful to have an agent that cancels pilots whose DIRAC version differs from the one in the CS. It would cancel only those that are scheduled or waiting. It could be refined to delete only those running with a certain role (e.g. Role=production). That would save some recovery time when plenty of pilots with the wrong version are scheduled to a given site.
Some notes about proxies in this version:
#1) proxy-init -g <dirac_group> fails to upload the proxy
proxy-init -g dirac_user -d
Enter Certificate password:
Contacting CS...
New connection -> 127.0.0.1:9135
Checking DN /O=GRID-FR/C=FR/O=CNRS/OU=CPPM/CN=Vanessa Hamar
Username is vhamar
Creating proxy for vhamar@dirac_user (/O=GRID-FR/C=FR/O=CNRS/OU=CPPM/CN=Vanessa Hamar)
Proxy will be uploaded to ProxyManager
Deprecation warning: DIRAC.Core.Security.Misc will not be available in next release,
use DIRAC.Core.Security.ProxyInfo instead.
Uploading user pilot proxy with group dirac_pilot...
Traceback (most recent call last):
File "/home/vanessa/DIRAC-v6r0-pre4/DIRAC/FrameworkSystem/scripts/proxy-init.py", line 135, in
success = uploadProxyToDIRACProxyManager( cliParams )
File "/home/vanessa/DIRAC-v6r0-pre4/DIRAC/FrameworkSystem/scripts/proxy-init.py", line 89, in uploadProxyToDIRACProxyManager
retVal = uploadProxy( params )
File "/home/vanessa/DIRAC-v6r0-pre4/DIRAC/FrameworkSystem/Client/ProxyUpload.py", line 106, in uploadProxy
params.debugMsg( "Loading user proxy" )
2011-06-11 09:49:53 UTC WorkloadManagement/TaskQueueDirector DEBUG: Authenticated peer (/O=GRID-FR/C=FR/O=CNRS/OU=CPPM/CN=mardirac3.in2p3.fr)
2011-06-11 09:49:53 UTC WorkloadManagement/TaskQueueDirector DEBUG: New session connecting to server at ('mardirac3.in2p3.fr', 9152)
2011-06-11 09:49:53 UTC WorkloadManagement/TaskQueueDirector VERB: New connection -> 127.0.0.1:9152
2011-06-11 09:49:53 UTC WorkloadManagement/TaskQueueDirector DEBUG: Closing socket
2011-06-11 09:49:53 UTC WorkloadManagement/TaskQueueDirector/gLitePilotDirector ERROR: Can't get a proxy for 432000 seconds: myproxy is disabled
2011-06-11 09:49:53 UTC WorkloadManagement/TaskQueueDirector/gLitePilotDirector ERROR: No proxy Available User "/O=GRID-FR/C=FR/O=CNRS/OU=CPPM/CN=Vanessa Hamar", Group "dirac_user"
2011-06-11 09:49:54 UTC WorkloadManagement/TaskQueueDirector INFO: Number of pilots to be Submitted 22
2011-06-11 09:49:54 UTC WorkloadManagement/TaskQueueDirector ERROR: submitPilot Failed: No proxy Available
2011-06-11 09:49:54 UTC WorkloadManagement/TaskQueueDirector INFO: Number of pilots Submitted 0
2011-06-11 09:49:54 UTC WorkloadManagement/TaskQueueDirector/Monitoring DEBUG: Adding mark to CPU
To resolve the problem, I created a proxy with a validity of 720 hours (dirac-proxy-init -g dirac_user -v 720:00) and uploaded it to the server again, and the pilot jobs were submitted.
The error message is:
{'Message': 'Server error while serving getSiteSummaryWeb: too many values to unpack', 'OK': False, 'rpcStub':(('WorkloadManagement/WMSAdministrator', {'delegatedDN': 'SKIP', 'timeout': 600, 'skipCACheck': False, 'setup': u'LHCb-Production', 'delegatedGroup': 'lhcb_prod'}), 'getSiteSummaryWeb', ({}, [], 0, 500))
Although on the service side the number of arguments is correct:
def export_getSiteSummaryWeb(self, selectDict, sortList, startItem, maxItems):
result = jobDB.getSiteSummaryWeb(selectDict, sortList, startItem, maxItems)
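For the record, "too many values to unpack" is raised by a tuple assignment whose right-hand side has more elements than expected, so the mismatch is more likely inside jobDB.getSiteSummaryWeb (e.g. a database row with an extra column) than in the RPC argument count. A minimal reproduction, with hypothetical names:

```python
def site_summary(rows):
    summary = []
    for row in rows:
        # each row must have exactly three columns; a row with more
        # raises "ValueError: too many values to unpack"
        site, status, count = row
        summary.append((site, status, count))
    return summary
```
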
It would be nice if, during the DIRAC installation procedure, one could choose between the externals shipped with DIRAC and their native counterparts already installed on the box.
Up to now DIRAC uses only the first set, which is OK if you want to install it on a grid box. But for development we don't need to install several dozen more MB of binaries duplicating those already in the system. Hence the installation script (on the user's request, not by default) should be clever enough to check whether a library/application/python module of a compliant, required version exists on the system, and build only the missing ones.
Again a feature request: for queries returning a very large number of entries, it could be useful to return only a subset of entries (user parameter, defaulting to all), as sometimes one does not need all the entries. Also, a method to return the next set would be useful. The point is to gain in packet size (transmit less data per call).
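The requested behaviour can be sketched as a simple slice over the result set; the (start_item, max_items) convention mirrors the web queries seen elsewhere in this page, but the names are assumptions:

```python
def paged(entries, start_item=0, max_items=0):
    # max_items <= 0 means "all remaining entries" (assumed default);
    # the caller fetches the next set by advancing start_item
    if max_items <= 0:
        return entries[start_item:]
    return entries[start_item:start_item + max_items]
```
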
The SiteDirector should not blindly take the max queue length; it should cap it at, say, 3 days, because the BDII can contain crazy values for it.
The connections initiated by the LCGFileCatalogClient in DIRAC are not thread safe, because the lfc module it uses is not thread safe.
The fix put into the current production system is a global lock in the LCGFileCatalogClient, acquired before each call and released just after execution. A better solution is foreseen.
When we use the RequestDBFile, the requests are not taken in chronological order. Chronological ordering is mandatory to treat requests in the LHCb online context.
In the file DIRAC/RequestManagementSystem/DB/RequestDBFile.py, one solution to fix this issue is to add, after line 119:
119 requestNames = os.listdir( reqDir )
requestNames.sort()
120 for requestName in requestNames:
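Note that requestNames.sort() orders the requests alphabetically, which is only chronological if the file names encode a timestamp. A variant (a sketch, not the actual patch) that orders by file modification time regardless of naming:

```python
import os

def list_requests_chronologically(req_dir):
    # oldest request first, based on the file modification time
    names = os.listdir(req_dir)
    names.sort(key=lambda name: os.path.getmtime(os.path.join(req_dir, name)))
    return names
```
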
It would be helpful to have better error notification: at the moment there are a lot of services sending their errors to [email protected]. It would be useful to have a notification area for admins within the web portal. Another solution is that, instead of sending e-mail, the Notification service could publish to an RSS feed. This should be marked as a feature request.
current integration version reports:
majorVersion = 5
minorVersion = 12
patchLevel = 25
preVersion = 0
That way the extensions can overwrite stuff from DIRAC in case they need to.
I made a mistake in the definition of my metadata tags for one directory: I set a tag that should belong to a daughter directory. So I want to remove that tag from that directory, but I cannot, as there is no method for that (at least not in the CLI).
FC:/vo.formation.idgrilles.fr/user/v/vhamar>mkdir newDir
Successfully created directory: /vo.formation.idgrilles.fr/user/v/vhamar/newDir
FC:/vo.formation.idgrilles.fr/user/v/vhamar>ls -la
-rwxrwxr-x 0 vhamar user 1126 2011-06-11 12:08:46 bashrc
drwxrwxr-x 0 vhamar user 0 2011-06-11 12:36:53 newDir
-rwxrwxr-x 0 vhamar dirac_user 30 2011-06-11 09:30:53 test.txt
FC:/vo.formation.idgrilles.fr/user/v/vhamar>rmdir newDir
lfn: /vo.formation.idgrilles.fr/user/v/vhamar/newDir
Failed to remove directory from the catalog
Server error while serving removeDirectory: string indices must be integers, not str
FC:/vo.formation.idgrilles.fr/user/v/vhamar>ls -la
-rwxrwxr-x 0 vhamar user 1126 2011-06-11 12:08:46 bashrc
-rwxrwxr-x 0 vhamar dirac_user 30 2011-06-11 09:30:53 test.txt
dirac-dms-add-file LFN:/vo.formation.idgrilles.fr/user/v/vhamar/test1.txt test.txt CPPM-disk
Deprecation warning: DIRAC.Core.Security.Misc will not be available in next release,
use DIRAC.Core.Security.ProxyInfo instead.
ReplicaManager.putAndRegister: Failed to register file. /vo.formation.idgrilles.fr/user/v/vhamar/test1.txt {'FileCatalog': 'Failed to create directory for file'}
{'Failed': {'/vo.formation.idgrilles.fr/user/v/vhamar/test1.txt': {'register': {'Addler': '97b75b83',
'GUID': 'ED690EED-DCBD-9EDD-12E4-7EE43A19B6D5',
'LFN': '/vo.formation.idgrilles.fr/user/v/vhamar/test1.txt',
'PFN': 'srm://marsedpm.in2p3.fr/dpm/in2p3.fr/home//vo.formation.idgrilles.fr/user/v/vhamar/test1.txt',
'Size': 247,
'TargetSE': 'CPPM-disk'}}},
2011-06-08 07:53:38 UTC DataManagement/FileCatalog/MySQL DEBUG: _query: SELECT LEVEL,LPATH1,LPATH2,LPATH3,LPATH4,LPATH5,LPATH6,LPATH7,LPATH8,LPATH9,LPATH10,LPATH11,LPATH12,LPATH13,LPATH14,LPATH15 FROM FC_DirectoryLevelTree WHERE DirID=1
2011-06-08 07:53:38 UTC DataManagement/FileCatalog/MySQL DEBUG: _query: Excution failed. 1054: Unknown column 'LPATH11' in 'field list'
[vanessa@mardirac3 DMS]$ dirac-dms-filecatalog-cli
Deprecation warning: DIRAC.Core.Security.Misc will not be available in next release,
use DIRAC.Core.Security.ProxyInfo instead.
Starting DIRAC FileCatalog client
File Catalog Client $Revision: 1.17 $Date:
FC:/> ls
FC:/> mkdir vo.formation.idgrilles.fr
Failed to create directory: Excution failed.: ( 1054: Unknown column 'LPATH11' in 'field list' )
The functionality now available in proxy-init, dirac-proxy-init and dirac-upload-proxy should be collected into a single command, dirac-proxy-init. The other commands will be dropped. The following is to be implemented:
In multi-VO installations a user might belong to more than one VO, and thus will have as many homes as VOs it belongs to.
How do we want to handle that?
Should this not be solved in the base FileCatalog class for all catalogs?
On the server side we have:
types_updateSoftware = [ StringTypes ]
def export_updateSoftware( self, version, rootPath = "", gridVersion = "2009-08-13" ):
and on the client:
print "Software update can take a while, please wait ..."
result = client.updateSoftware( version )
this means in particular that there is no way to update a server to a newer version of the lcgBundle via the SystemAdministrator.
I would suggest that the default is set to "": in this case whatever is in the dirac.cfg (from the initial server installation) will be taken, and the client should be fixed to pass the same arguments that the server accepts.
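The suggested fix could look roughly like the following sketch; the fallback constant and the simplified return shape are assumptions standing in for the real dirac.cfg lookup and RPC machinery:

```python
INSTALLED_GRID_VERSION = "2009-08-13"  # stand-in for the value in dirac.cfg

def export_updateSoftware(version, rootPath="", gridVersion=""):
    # with "" as the default, fall back to whatever the server was
    # installed with, instead of a hard-coded bundle date
    if not gridVersion:
        gridVersion = INSTALLED_GRID_VERSION
    return {"OK": True, "Value": (version, rootPath, gridVersion)}

def client_updateSoftware(version, rootPath="", gridVersion=""):
    # the client should expose the same signature the server accepts,
    # so that a newer lcgBundle can actually be requested remotely
    return export_updateSoftware(version, rootPath, gridVersion)
```
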
The file replicate operation is now failing in the FC CLI, while it works fine with the dirac-dms-replicate-lfn command.
2011-06-08 23:13:33 UTC dirac-install [ERROR] Post installation script /opt/dirac/versions/v6r0-pre3_1307574810/Web/dirac-postInstall.py failed. Check /opt/dirac/versions/v6r0-pre3_1307574810/Web/dirac-postInstall.py.err
[volhcb13] /home/dirac/scripts > cat /opt/dirac/versions/v6r0-pre3_1307574810/Web/dirac-postInstall.py.err
Traceback (most recent call last):
File "/opt/dirac/versions/v6r0-pre3_1307574810/Web/dirac-postInstall.py", line 60, in ?
zFile.extractall( publicDir )
AttributeError: ZipFile instance has no attribute 'extractall'
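ZipFile.extractall only exists from Python 2.6 on, which suggests the post-install script ran under an older interpreter. A portable sketch of the workaround (assumed helper name) extracts member by member when extractall is missing:

```python
import os
import zipfile

def extract_all(zip_path, dest_dir):
    # ZipFile.extractall() appeared in Python 2.6; on older
    # interpreters, fall back to extracting member by member
    z = zipfile.ZipFile(zip_path)
    try:
        if hasattr(z, "extractall"):
            z.extractall(dest_dir)
        else:
            for name in z.namelist():
                target = os.path.join(dest_dir, name)
                if name.endswith("/"):
                    if not os.path.isdir(target):
                        os.makedirs(target)
                else:
                    parent = os.path.dirname(target)
                    if parent and not os.path.isdir(parent):
                        os.makedirs(parent)
                    f = open(target, "wb")
                    f.write(z.read(name))
                    f.close()
    finally:
        z.close()
```
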
I created a ConfigTemplate.cfg in LHCbDIRAC/DataManagementSystem to hold the information about a new agent, but when I tried to install the new agent with dirac-admin-sysadmin-cli, my ConfigTemplate.cfg was not read. Does this mean that the ConfigTemplate.cfg can only be in DIRAC/DataManagementSystem, or is it a bug?
[lxplus314] x86_64-slc5-gcc43-opt /afs/cern.ch/lhcb/software/DEV/LHCBDIRAC> dirac-admin-sysadmin-cli --host volhcb03.cern.ch
DIRAC Root Path = /afs/cern.ch/lhcb/software/DEV/DIRAC/DIRAC_v5r13p9
volhcb03.cern.ch >install agent DataManagement StorageHistoryAgent
AT >>> agent DataManagement StorageHistoryAgent Certification ['LHCbWeb', 'LHCb']
Loading configuration template /afs/cern.ch/lhcb/software/DEV/DIRAC/DIRAC_v5r13p9/DIRAC/DataManagementSystem/ConfigTemplate.cfg
Can not find Agents/StorageHistoryAgent in template
{'Message': 'Can not find Agents/StorageHistoryAgent in template', 'OK': False}
ERROR: Can not find Agents/StorageHistoryAgent in template
[lxplus314] x86_64-slc5-gcc43-opt /afs/cern.ch/lhcb/software/DEV/LHCBDIRAC> grep StorageHistoryAgent /afs/cern.ch/lhcb/software/DEV/LHCBDIRAC/LHCBDIRAC_v6r5-pre10/LHCbDIRAC/DataManagementSystem/ConfigTemplate.cfg
StorageHistoryAgent
Adding methods to retrieve the VO associated with a given proxy. This will be clearer, since 'Misc' means nothing.
A list of "TransformationTypes" (sometimes called "TransType") is used by the agents:
Allow negative selections in the accounting pages:
AND not IN ( [List] )
instead of the current
AND in ( [List] )
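A sketch of how the condition builder could support both forms; the function name and string layout are assumptions, not the actual accounting code:

```python
def selection_clause(field, values, negate=False):
    # builds the SQL fragment; negate=True gives the requested
    # "NOT IN" form for negative selections
    value_list = ", ".join("'%s'" % v for v in values)
    op = "NOT IN" if negate else "IN"
    return "%s %s ( %s )" % (field, op, value_list)
```
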
When deleting files (and folders), the metadata associated with those directories is not deleted; therefore, when querying for metadata, the system fails to find a directory.
Also, the query seems to completely ignore the other metadata tags in that case.
Hi Folks,
During re-factoring of TransferAgent I found a suspicious line in RequestClient:
where the server URL is added to the returned S_OK["Server"]. I believe it should rather land under S_OK["Value"]["Server"], like here:
This bug is present in master and integration, so you'd better fix it asap. ;)
Cheers,
Krzysztof
When installing a new agent or service, if that agent or service is defined in the CS but not in the ConfigTemplate.cfg, we get an error. I think this should be corrected.
The bug manifests itself in the following traceback:
from DIRAC.Core.Utilities.Mail import Mail
m = Mail()
m._subject ='subject'
m._message = 'body'
m._mailAddress = '[email protected]'
m._send()
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "DIRAC/Core/Utilities/Mail.py", line 25, in _send
self.connect()
File "/opt/dirac/pro/Linux_x86_64_glibc-2.5/lib/python2.6/smtplib.py",
line 293, in connect
if not port: port = self.default_port
AttributeError: Mail instance has no attribute 'default_port'
2011-06-07 08:24:11 UTC WorkloadManagement/TaskQueueDirector EXCEPT: == EXCEPTION ==
2011-06-07 08:24:11 UTC WorkloadManagement/TaskQueueDirector EXCEPT: <type 'exceptions.NameError'>:global name 'vo' is not defined
2011-06-07 08:24:11 UTC WorkloadManagement/TaskQueueDirector EXCEPT: File "/opt/dirac/pro/DIRAC/Core/Base/AgentModule.py", line 307, in am_secureCall
2011-06-07 08:24:11 UTC WorkloadManagement/TaskQueueDirector EXCEPT: result = functor( *args )
2011-06-07 08:24:11 UTC WorkloadManagement/TaskQueueDirector EXCEPT:
2011-06-07 08:24:11 UTC WorkloadManagement/TaskQueueDirector EXCEPT: File "/opt/dirac/pro/DIRAC/WorkloadManagementSystem/Agent/TaskQueueDirector.py", line 196, in execute
2011-06-07 08:24:11 UTC WorkloadManagement/TaskQueueDirector EXCEPT: self.__checkSubmitPools()
2011-06-07 08:24:11 UTC WorkloadManagement/TaskQueueDirector EXCEPT:
2011-06-07 08:24:11 UTC WorkloadManagement/TaskQueueDirector EXCEPT: File "/opt/dirac/pro/DIRAC/WorkloadManagementSystem/Agent/TaskQueueDirector.py", line 350, in __checkSubmitPools
2011-06-07 08:24:11 UTC WorkloadManagement/TaskQueueDirector EXCEPT: self.__configureDirector( submitPool )
2011-06-07 08:24:11 UTC WorkloadManagement/TaskQueueDirector EXCEPT:
2011-06-07 08:24:11 UTC WorkloadManagement/TaskQueueDirector EXCEPT: File "/opt/dirac/pro/DIRAC/WorkloadManagementSystem/Agent/TaskQueueDirector.py", line 443, in __configureDirector
2011-06-07 08:24:11 UTC WorkloadManagement/TaskQueueDirector EXCEPT: director.configure( self.am_getModuleParam( 'section' ), submitPool )
2011-06-07 08:24:11 UTC WorkloadManagement/TaskQueueDirector EXCEPT:
2011-06-07 08:24:11 UTC WorkloadManagement/TaskQueueDirector EXCEPT: File "/opt/dirac/pro/DIRAC/WorkloadManagementSystem/private/gLitePilotDirector.py", line 48, in configure
2011-06-07 08:24:11 UTC WorkloadManagement/TaskQueueDirector EXCEPT: GridPilotDirector.configure( self, csSection, submitPool )
2011-06-07 08:24:11 UTC WorkloadManagement/TaskQueueDirector EXCEPT:
2011-06-07 08:24:11 UTC WorkloadManagement/TaskQueueDirector EXCEPT: File "/opt/dirac/pro/DIRAC/WorkloadManagementSystem/private/GridPilotDirector.py", line 71, in configure
2011-06-07 08:24:11 UTC WorkloadManagement/TaskQueueDirector EXCEPT: PilotDirector.configure( self, csSection, submitPool )
2011-06-07 08:24:11 UTC WorkloadManagement/TaskQueueDirector EXCEPT:
2011-06-07 08:24:11 UTC WorkloadManagement/TaskQueueDirector EXCEPT: File "/opt/dirac/pro/DIRAC/WorkloadManagementSystem/private/PilotDirector.py", line 138, in configure
2011-06-07 08:24:11 UTC WorkloadManagement/TaskQueueDirector EXCEPT: self.installInstallation = gConfig.getValue( '/Operations/%s/%s/Versions/PilotInstallation' % ( vo, setup ),
There are jobs with associated sites defined as "{LCG.SARA.nl, LCG.NIKHEF.nl }", which makes the job monitoring tools fail. This happens when the site is defined in the JDL as a vector.
When integrated into master, the URL of the dirac_install.py download has to be updated to point to master instead of integration, and the documentation at AdministratorGuide/InstallingDIRACService/index.html also has to be updated.
Please check whether this data member can be removed (it is not used) or should be taken from the Registry; the latter might require checking the proxy first.
Let's say that meta2 has several values (one, two, three), and meta1 is a metadata tag on a parent directory. Now one wants to get all files corresponding to meta1 but exclude those with meta2=two. This is not possible now (or I missed something).
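The requested selection amounts to combining a positive match with an exclusion; a sketch of the behaviour over an in-memory catalog (the data layout and function name are assumptions, not the FileCatalog API):

```python
def select_files(files, include, exclude):
    # files: {lfn: {tag: value}}; include: tags that must match;
    # exclude: tag values that disqualify a file
    selected = []
    for lfn in sorted(files):
        meta = files[lfn]
        if any(meta.get(tag) != value for tag, value in include.items()):
            continue
        if any(meta.get(tag) == value for tag, value in exclude.items()):
            continue
        selected.append(lfn)
    return selected
```
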
The script crashes because of a bug in ResourceStatusHandler.export_extendToken: the variable tokenNewExpiration is not properly declared. It must be something like a datetime
(but which datetime by default?)
[vanessa@mardirac3 APIs]$ python
Python 2.6.6 (r266:84292, Mar 24 2011, 16:35:10)
[GCC 4.1.2 20080704 (Red Hat 4.1.2-50)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
from DIRAC.Interfaces.API.Dirac import Dirac
Deprecation warning: DIRAC.Core.Security.Misc will not be available in next release,
use DIRAC.Core.Security.ProxyInfo instead.
from DIRAC.Interfaces.API.Job import Job
j = Job()
j.setCPUTime(500)
{'OK': True, 'Value': ''}
j.setExecutable('/bin/echo hello')
{'OK': True, 'Value': ''}
j.setExecutable('/bin/ls',arguments='-l')
{'OK': True, 'Value': ''}
j.setExecutable('/bin/echo hello again')
{'OK': True, 'Value': ''}
j.setName('API')
{'OK': True, 'Value': ''}
dirac = Dirac()
result = dirac.submit(j)
2011-06-17 10:17:57 UTC /DiracAPI ERROR: Job submission failure No value for key "
2011-06-17 10:17:57 UTC /DiracAPI ERROR: }
print 'Submission Result: ',result
Submission Result: {'Message': 'No value for key "\n }', 'OK': False, 'rpcStub': (('WorkloadManagement/JobManager', {'skipCACheck': False, 'keepAliveLapse': 150, 'delegatedGroup': 'dirac_user', 'delegatedDN': '/O=GRID-FR/C=FR/O=CNRS/OU=CPPM/CN=Vanessa Hamar', 'timeout': 120}), 'submitJob', ('[ \n Origin = "DIRAC";\n ParametricInputData = "";\n Executable = "$DIRACROOT/scripts/dirac-jobexec";\n OutputSandbox = \n {\n \n };\n "\n };\n JobName = "API";\n StdError = "std.err";\n LogLevel = "info";\n Site = "ANY";\n SystemConfig = "ANY";\n Priority = "1";\n InputSandbox = \n {\n \n };\n "\n };\n ParametricInputSandbox = "";\n Arguments = "jobDescription.xml -o LogLevel=info";\n JobGroup = "vo.formation.idgrilles.fr";\n MaxCPUTime = "500";\n StdOutput = "std.out";\n InputData = "";\n JobType = "User"\n]',))}
It would be good practice for reported issues to have the DIRAC version concerned in the title. That would help the devs to prioritize, and users to look up their own version more easily. Maybe a tag could be designed for that.
Check that returned SE is active.
The removal transformation 10410 seems stuck with many files still to be deleted. Looking at the associated tasks, I find several of them in the following state:
TaskID: 48 (created 2011-05-04 12:42:30, updated 2011-05-04 13:51:24) - 2 files - Request: 1453087 - Status: Waiting - TargetSE: CERN_M-DST,CNAF_M-DST,IN2P3-DST
Request status: {'SubRequestStatus': 'Done', 'RequestStatus': 'Waiting'}
Request info: (1453087L, 'Waiting', '00010410_00000048', None, '', '', '', None, None, datetime.datetime(2011, 5, 4, 13, 4, 54), datetime.datetime(2011, 5, 4, 13, 4, 54))
Waiting: 2 files
The 2 associated files are not deleted, and the request seems in a Waiting state...
Firstly, why is SubRequestStatus "Done"? Which subrequest? :(((
Secondly, this request appears in the DIRAC portal as Assigned and not Waiting:
What can we do, and why this (apparent) inconsistency?
It sometimes happens (especially in tutorials) that users submit identical test jobs while changing their DIRAC group. If a job has already been submitted with one group, its input sandbox is registered with that group. The attempt to submit the same job with another group then results in an error saying that the sandbox exists but belongs to another group. A solution could be to make the group name part of the sandbox contents, so that the two hashes differ.
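The proposed solution can be sketched by mixing the group into the sandbox hash; this is a hypothetical helper, not the actual SandboxStore code:

```python
import hashlib

def sandbox_id(content, group):
    # mixing the DIRAC group into the hash makes two sandboxes with
    # identical content but different owner groups distinct
    h = hashlib.md5()
    h.update(group.encode("utf-8"))
    h.update(content)
    return h.hexdigest()
```
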
File "/scratch/ctagrid2361.ccwl0924/tmp/https_3a_2f_2flapp-lb01.in2p3.fr_3a9000_2fFwehStqM7ad5RyNt5yD0gA/DIRAC/WorkloadManagementSystem/Client/DownloadInputData.py", line 234, in __getLFN
fileName = os.path.basename( result['Value'] )
This is already the case in LHCb: files can be written in different formats even if they have the same extension. It would be nice to have the possibility to make this another default metadata field for any file, like size, owner, ...
Deprecation warning: DIRAC.Core.Security.Misc will not be available in next release,
use DIRAC.Core.Security.ProxyInfo instead.
2011-06-14 08:22:49 UTC Configuration/CE2CSAgent EXCEPT: Can't load agent Configuration/CE2CSAgent
2011-06-14 08:22:49 UTC Configuration/CE2CSAgent EXCEPT: == EXCEPTION ==
2011-06-14 08:22:49 UTC Configuration/CE2CSAgent EXCEPT: <type 'exceptions.TypeError'>:init() takes exactly 4 arguments (3 given)
2011-06-14 08:22:49 UTC Configuration/CE2CSAgent EXCEPT: File "/opt/dirac/pro/DIRAC/Core/Base/AgentReactor.py", line 126, in loadAgentModule
2011-06-14 08:22:49 UTC Configuration/CE2CSAgent EXCEPT: agent = agentClass( fullName, self.__baseAgentName )
2011-06-14 08:22:49 UTC Configuration/CE2CSAgent EXCEPT: ===============
2011-06-14 08:22:49 UTC Configuration/CE2CSAgent ERROR: Error while loading agent module Can't load agent Configuration/CE2CSAgent in root modules DIRAC
26204: old priority 0, new priority 19
2011-06-14 08:22:51 UTC Configuration/CE2CSAgent INFO: Loading Configuration/CE2CSAgent
2011-06-14 08:22:51 UTC Configuration/CE2CSAgent VERB: Trying to load from root module DIRAC
2011-06-14 08:22:51 UTC Configuration/CE2CSAgent VERB: Looking for file /opt/dirac/versions/v6r0-pre4_1307882881/DIRAC/ConfigurationSystem/Agent/CE2CSAgent.py
2011-06-14 08:22:51 UTC Configuration/CE2CSAgent DEBUG: Trying to load DIRAC.ConfigurationSystem.Agent.CE2CSAgent
Deprecation warning: DIRAC.Core.Security.Misc will not be available in next release,
use DIRAC.Core.Security.ProxyInfo instead.
If I run the dirac-admin-sysadmin-cli command with the wrong credentials, I get the crash below with DIRAC v5r13p9. I suspect we should get an error message instead of a crash.
[lxplus314] x86_64-slc5-gcc43-opt /afs/cern.ch/lhcb/software/DEV/LHCBDIRAC/LHCBDIRAC_v6r5-pre10> dirac-admin-sysadmin-cli --host volhcb03.cern.ch
DIRAC Root Path = /afs/cern.ch/lhcb/software/DEV/DIRAC/DIRAC_v5r13p9
volhcb03.cern.ch >
volhcb03.cern.ch >
volhcb03.cern.ch >install agent DataManagementSystem StorageUsageAgent
Error: Unauthorized query to Framework/SystemAdministrator:getInfo
Traceback (most recent call last):
File "/afs/cern.ch/lhcb/software/DEV/DIRAC/DIRAC_v5r13p9/InstallArea/scripts/dirac-admin-sysadmin-cli", line 18, in
cli.cmdloop()
File "/afs/cern.ch/sw/lcg/external/Python/2.6.5/x86_64-slc5-gcc43-opt/lib/python2.6/cmd.py", line 142, in cmdloop
stop = self.onecmd(line)
File "/afs/cern.ch/sw/lcg/external/Python/2.6.5/x86_64-slc5-gcc43-opt/lib/python2.6/cmd.py", line 219, in onecmd
return func(arg)
File "/afs/cern.ch/lhcb/software/DEV/DIRAC/DIRAC_v5r13p9/InstallArea/python/DIRAC/FrameworkSystem/Client/SystemAdministratorClientCLI.py", line 380, in do_install
hostSetup = result['Value']['Setup']
KeyError: 'Value'
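The KeyError arises because the CLI indexes result['Value'] without checking result['OK'] first; on an authorization failure the service returns an S_ERROR-style dict carrying 'Message' instead. A minimal guard (sketch, assumed helper name):

```python
def host_setup(result):
    # check the S_OK flag before touching 'Value'; S_ERROR dicts
    # carry 'Message' instead, which is what caused the crash
    if not result.get("OK"):
        return "ERROR: %s" % result.get("Message", "unknown error")
    return result["Value"]["Setup"]
```
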
The dirac-install command should have the following switches defined (among others):
-l specifies that the software Project is to be installed at the version specified by the -r switch
alternatively
-V specifies that dirac-install should install software packages as specified in the community defaults.
The -l and -V switches are not compatible; giving both on the command line must result in a meaningful error message.
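The exclusivity requirement can be sketched with a mutually exclusive option group; the switch letters come from the request above, but the destination names and help texts are assumptions (the real dirac-install has its own option parser):

```python
import argparse

def parse_install_args(argv):
    # -l and -V are declared mutually exclusive, so passing both
    # produces a clear error instead of silently picking one
    p = argparse.ArgumentParser(prog="dirac-install")
    g = p.add_mutually_exclusive_group()
    g.add_argument("-l", dest="project",
                   help="install the given Project at the version given by -r")
    g.add_argument("-V", dest="community",
                   help="install packages as specified in the community defaults")
    return p.parse_args(argv)
```
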
It would be nice if, when clicking on a pilot in the pilot monitor page to show the job, it opened a new tab instead of replacing the current one.
[vanessa@mardirac3 DMS]$ dirac-dms-add-file LFN:/vo.formation.idgrilles.fr/user/v/vhamar/110611.txt test.txt DIRAC-USER --debug
New connection -> 127.0.0.1:9135
Deprecation warning: DIRAC.Core.Security.Misc will not be available in next release,
use DIRAC.Core.Security.ProxyInfo instead.
New connection -> 127.0.0.1:9152
New connection -> 127.0.0.1:9152
Trying to load from root path DIRAC
Looking for file /home/vanessa/DIRAC-v6r0-pre4/DIRAC/Resources/Catalog/FileCatalogClient.py
New connection -> 127.0.0.1:9197
Trying to load from root path DIRAC
Looking for file /home/vanessa/DIRAC-v6r0-pre4/DIRAC/Resources/Catalog/FileCatalogClient.py
New connection -> 127.0.0.1:9197
New connection -> 127.0.0.1:9197
Trying to load from root path DIRAC
Looking for file /home/vanessa/DIRAC-v6r0-pre4/DIRAC/Resources/Catalog/FileCatalogClient.py
New connection -> 127.0.0.1:9197
ReplicaManager.putAndRegister: Checksum information not provided. Calculating adler32.
ReplicaManager.putAndRegister: Checksum calculated to be 97b75b83.
New connection -> 127.0.0.1:9197
Trying to load from root path DIRAC
Looking for file /home/vanessa/DIRAC-v6r0-pre4/DIRAC/Resources/Storage/DIPStorage.py
StorageElement.isValid: Determining whether the StorageElement DIRAC-USER is valid for use.
StorageElement.isValid: The 'operation' argument is not supplied. It should be supplied in the future.
StorageElement.getStorageElementName: The Storage Element name is DIRAC-USER.
StorageElement.__executeFunction: Attempting to perform 'putFile' operation with 1 pfns.
StorageElement.isValid: Determining whether the StorageElement DIRAC-USER is valid for use.
StorageElement.isLocalSE: Determining whether DIRAC-USER is a local SE.
StorageElement.__executeFunction: Generating 1 protocol PFNs for DIP.
StorageElement.__executeFunction: Attempting to perform 'putFile' for 1 physical files.
New connection -> 127.0.0.1:9148
New connection -> 127.0.0.1:9133
ReplicaManager.putAndRegister: Sending accounting took 0.5 seconds
ReplicaManager.putAndRegister: Failed to put file to Storage Element. test.txt: Server error while serving fromClient: 'Port'
Problem during putAndRegister call
ERROR ReplicaManager.putAndRegister: Failed to put file to Storage Element. Server error while serving fromClient: 'Port'
It does not support an environment variable in the executable. It probably needs other fixes, and we should create other modules as examples.