Passed
Push — master ( d8e2ec...90ae0b ) by Jordi
10:07 (queued 04:19)

bika.lims.content.arimport (overall rating: F)

Complexity

Total Complexity 174

Size/Duplication

Total Lines 952
Duplicated Lines 2.84 % (27 lines)

Importance

Changes 0

Metric  Value
wmc     174
eloc    698
dl      27
loc     952
rs      1.902
c       0
b       0
f       0

25 Methods

Rating   Name   Duplication   Size   Complexity  
A build.bika.lims.content.arimport.ARImport.guard_validate_transition() 0 6 3
A build.bika.lims.content.arimport.ARImport.Vocabulary_SamplePoint() 0 7 2
A build.bika.lims.content.arimport.ARImport.workflow_before_validate() 0 25 2
C build.bika.lims.content.arimport.ARImport.munge_field_value() 12 39 11
A build.bika.lims.content.arimport.ARImport.error() 0 4 1
A build.bika.lims.content.arimport.ARImport.Vocabulary_ContainerType() 0 4 1
A build.bika.lims.content.arimport.ARImport.lookup() 0 12 3
C build.bika.lims.content.arimport.ARImport.save_header_data() 0 59 9
B build.bika.lims.content.arimport.ARImport.create_or_reference_batch() 0 29 7
C build.bika.lims.content.arimport.ARImport.get_batch_header_values() 0 23 9
F build.bika.lims.content.arimport.ARImport.validate_samples() 0 51 16
A build.bika.lims.content.arimport.ARImport.get_row_container() 0 14 4
A build.bika.lims.content.arimport.ARImport.get_row_services() 0 17 5
C build.bika.lims.content.arimport.ARImport.get_header_values() 0 26 10
C build.bika.lims.content.arimport.ARImport.get_sample_values() 0 38 9
A build.bika.lims.content.arimport.ARImport.at_post_edit_script() 0 5 2
A build.bika.lims.content.arimport.ARImport.get_row_profile_services() 0 16 4
A build.bika.lims.content.arimport.ARImport._renameAfterCreation() 0 2 1
D build.bika.lims.content.arimport.ARImport.validate_against_schema() 15 30 12
A build.bika.lims.content.arimport.ARImport.get_row_profiles() 0 7 2
B build.bika.lims.content.arimport.ARImport.workflow_script_import() 0 57 6
F build.bika.lims.content.arimport.ARImport.save_sample_data() 0 135 32
F build.bika.lims.content.arimport.ARImport.validate_headers() 0 65 20
A build.bika.lims.content.arimport.ARImport.Vocabulary_SampleType() 0 7 2
A build.bika.lims.content.arimport.ARImport.Vocabulary_SampleMatrix() 0 4 1

How to fix

Duplicated Code

Duplicate code is one of the most pungent code smells. A rule that is often used is to restructure code once it is duplicated in three or more places.

The usual fix is to extract the duplicated logic into a single method or class that all call sites share.
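As an illustration only (not code from the module): the reference-field handling that is flagged as duplicated in munge_field_value() and validate_against_schema() below could be pulled into one shared helper on ARImport. The method name is hypothetical, and the sketch mirrors the munge_field_value() variant; the two blocks differ slightly, so the shared core would need a small amount of per-caller glue.

    def _resolve_reference(self, field, row_nr, fieldname, value):
        # Hypothetical shared helper (name illustrative): resolve a Title
        # or UID to catalog brains via self.lookup() and return one UID,
        # or a list of UIDs for multiValued fields. Raises ValueError
        # carrying the offending row number, as both call sites do today.
        value = str(value).strip()
        brains = self.lookup(field.allowed_types, Title=value)
        if not brains:
            brains = self.lookup(field.allowed_types, UID=value)
        if not brains:
            raise ValueError('Row %s: value is invalid (%s=%s)' % (
                row_nr, fieldname, value))
        if field.multiValued:
            return [b.UID for b in brains]
        return brains[0].UID

Both call sites would then reduce to a single call, e.g. gridrow[k] = self._resolve_reference(field, row_nr, k, v), which is exactly the kind of change that shrinks the duplicated-lines figure reported above.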

Complexity

Tip: Before tackling complexity, make sure you eliminate any duplication first. This can often reduce class sizes significantly.

Complex classes like bika.lims.content.arimport often do a lot of different things. To break such a class down, we need to identify a cohesive component within that class. A common approach to finding such a component is to look for fields/methods that share the same prefixes or suffixes.

Once you have determined the fields that belong together, you can apply the Extract Class refactoring. If the component makes sense as a sub-class, Extract Subclass is also a candidate, and is often faster.
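For example (a purely illustrative sketch, not part of the module): the four Vocabulary_* methods of ARImport share a prefix and only need the context to build catalog vocabularies, so they are one candidate for Extract Class. The collaborator name below is hypothetical.

from bika.lims.vocabularies import CatalogVocabulary


class ARImportVocabularies(object):
    # Hypothetical collaborator extracted from ARImport: it owns the
    # Vocabulary_* lookups so the importer class itself shrinks.

    def __init__(self, context):
        self.context = context

    def _vocabulary(self, portal_type):
        # Same construction the existing Vocabulary_* methods use.
        vocabulary = CatalogVocabulary(self.context)
        vocabulary.catalog = 'bika_setup_catalog'
        return vocabulary(allow_blank=True, portal_type=portal_type)

    def sample_matrix(self):
        return self._vocabulary('SampleMatrix')

    def container_type(self):
        return self._vocabulary('ContainerType')

ARImport.Vocabulary_SampleMatrix() would then simply delegate, e.g. return ARImportVocabularies(self).sample_matrix(); the same grouping-by-prefix argument applies to the get_row_* helpers.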

Module source (bika.lims.content.arimport):

# -*- coding: utf-8 -*-
#
# This file is part of SENAITE.CORE
#
# Copyright 2018 by its authors.
# Some rights reserved. See LICENSE.rst, CONTRIBUTORS.rst.

from AccessControl import ClassSecurityInfo
import csv
from copy import deepcopy
from DateTime.DateTime import DateTime
from Products.Archetypes.event import ObjectInitializedEvent
from Products.CMFCore.WorkflowCore import WorkflowException
from bika.lims import bikaMessageFactory as _
from bika.lims.browser import ulocalized_time
from bika.lims.config import PROJECTNAME
from bika.lims.content.bikaschema import BikaSchema
from bika.lims.content.analysisrequest import schema as ar_schema
from bika.lims.content.sample import schema as sample_schema
from bika.lims.idserver import renameAfterCreation
from bika.lims.interfaces import IARImport, IClient
from bika.lims.utils import tmpID
from bika.lims.utils.analysisrequest import create_analysisrequest
from bika.lims.vocabularies import CatalogVocabulary
from bika.lims.workflow import doActionFor
from collective.progressbar.events import InitialiseProgressBar
from collective.progressbar.events import ProgressBar
from collective.progressbar.events import ProgressState
from collective.progressbar.events import UpdateProgressEvent
from Products.Archetypes import atapi
# The wildcard import below supplies the Archetypes names used in this
# module (BaseFolder, Schema, StringField, StringWidget, ReferenceField,
# ReferenceWidget, ComputedWidget, LinesField, LinesWidget); the static
# analyser reports these names as undefined because it cannot resolve
# star imports.
from Products.Archetypes.public import *
from plone.app.blob.field import FileField as BlobFileField
from Products.Archetypes.references import HoldingReference
from Products.Archetypes.utils import addStatusMessage
from Products.CMFCore.utils import getToolByName
from Products.CMFPlone.utils import _createObjectByType
from Products.DataGridField import CheckboxColumn
from Products.DataGridField import Column
from Products.DataGridField import DataGridField
from Products.DataGridField import DataGridWidget
from Products.DataGridField import DateColumn
from Products.DataGridField import LinesColumn
from Products.DataGridField import SelectColumn
from zope import event
from zope.event import notify
from zope.i18nmessageid import MessageFactory
from zope.interface import implements

from bika.lims.browser.widgets import ReferenceWidget as bReferenceWidget

import sys
import transaction

_p = MessageFactory(u"plone")

OriginalFile = BlobFileField(
    'OriginalFile',
    widget=ComputedWidget(
        visible=False
    ),
)

Filename = StringField(
    'Filename',
    widget=StringWidget(
        label=_('Original Filename'),
        visible=True
    ),
)

NrSamples = StringField(
    'NrSamples',
    widget=StringWidget(
        label=_('Number of samples'),
        visible=True
    ),
)

ClientName = StringField(
    'ClientName',
    searchable=True,
    widget=StringWidget(
        label=_("Client Name"),
    ),
)

ClientID = StringField(
    'ClientID',
    searchable=True,
    widget=StringWidget(
        label=_('Client ID'),
    ),
)

ClientOrderNumber = StringField(
    'ClientOrderNumber',
    searchable=True,
    widget=StringWidget(
        label=_('Client Order Number'),
    ),
)

ClientReference = StringField(
    'ClientReference',
    searchable=True,
    widget=StringWidget(
        label=_('Client Reference'),
    ),
)

Contact = ReferenceField(
    'Contact',
    allowed_types=('Contact',),
    relationship='ARImportContact',
    default_method='getContactUIDForUser',
    referenceClass=HoldingReference,
    vocabulary_display_path_bound=sys.maxint,
    widget=ReferenceWidget(
        label=_('Primary Contact'),
        size=20,
        visible=True,
        base_query={'is_active': True},
        showOn=True,
        popup_width='300px',
        colModel=[{'columnName': 'UID', 'hidden': True},
                  {'columnName': 'Fullname', 'width': '100',
                   'label': _('Name')}],
    ),
)

Batch = ReferenceField(
    'Batch',
    allowed_types=('Batch',),
    relationship='ARImportBatch',
    widget=bReferenceWidget(
        label=_('Batch'),
        visible=True,
        catalog_name='bika_catalog',
        base_query={'review_state': 'open'},
        showOn=True,
    ),
)

CCContacts = DataGridField(
    'CCContacts',
    allow_insert=False,
    allow_delete=False,
    allow_reorder=False,
    allow_empty_rows=False,
    columns=('CCNamesReport',
             'CCEmailsReport',
             'CCNamesInvoice',
             'CCEmailsInvoice'),
    default=[{'CCNamesReport': [],
              'CCEmailsReport': [],
              'CCNamesInvoice': [],
              'CCEmailsInvoice': []
              }],
    widget=DataGridWidget(
        columns={
            'CCNamesReport': LinesColumn('Report CC Contacts'),
            'CCEmailsReport': LinesColumn('Report CC Emails'),
            'CCNamesInvoice': LinesColumn('Invoice CC Contacts'),
            'CCEmailsInvoice': LinesColumn('Invoice CC Emails')
        }
    )
)

SampleData = DataGridField(
    'SampleData',
    allow_insert=True,
    allow_delete=True,
    allow_reorder=False,
    allow_empty_rows=False,
    allow_oddeven=True,
    columns=('ClientSampleID',
             'SamplingDate',
             'DateSampled',
             'SamplePoint',
             'SampleMatrix',
             'SampleType',  # not a schema field!
             'ContainerType',  # not a schema field!
             'Analyses',  # not a schema field!
             'Profiles'  # not a schema field!
             ),
    widget=DataGridWidget(
        label=_('Samples'),
        columns={
            'ClientSampleID': Column('Sample ID'),
            'SamplingDate': DateColumn('Sampling Date'),
            'DateSampled': DateColumn('Date Sampled'),
            'SamplePoint': SelectColumn(
                'Sample Point', vocabulary='Vocabulary_SamplePoint'),
            'SampleMatrix': SelectColumn(
                'Sample Matrix', vocabulary='Vocabulary_SampleMatrix'),
            'SampleType': SelectColumn(
                'Sample Type', vocabulary='Vocabulary_SampleType'),
            'ContainerType': SelectColumn(
                'Container', vocabulary='Vocabulary_ContainerType'),
            'Analyses': LinesColumn('Analyses'),
            'Profiles': LinesColumn('Profiles'),
        }
    )
)

Errors = LinesField(
    'Errors',
    widget=LinesWidget(
        label=_('Errors'),
        rows=10,
    )
)

schema = BikaSchema.copy() + Schema((
    OriginalFile,
    Filename,
    NrSamples,
    ClientName,
    ClientID,
    ClientOrderNumber,
    ClientReference,
    Contact,
    CCContacts,
    Batch,
    SampleData,
    Errors,
))

schema['title'].validators = ()
# Update the validation layer after changing the validator at runtime
schema['title']._validationLayer()


class ARImport(BaseFolder):
    security = ClassSecurityInfo()
    schema = schema
    displayContentsTab = False
    implements(IARImport)

    _at_rename_after_creation = True

    def _renameAfterCreation(self, check_auto_id=False):
        renameAfterCreation(self)

    def guard_validate_transition(self):
        """We may only attempt validation if file data has been uploaded.
        """
        data = self.getOriginalFile()
        if data and len(data):
            return True

    # TODO Workflow - ARImport - Remove
    def workflow_before_validate(self):
        """This function transposes values from the provided file into the
        ARImport object's fields, and checks for invalid values.

        If errors are found:
            - Validation transition is aborted.
            - Errors are stored on object and displayed to user.

        """
        # Re-set the errors on this ARImport each time validation is attempted.
        # When errors are detected they are immediately appended to this field.
        self.setErrors([])

        self.validate_headers()
        self.validate_samples()

        if self.getErrors():
            addStatusMessage(self.REQUEST, _p('Validation errors.'), 'error')
            transaction.commit()
            self.REQUEST.response.write(
                '<script>document.location.href="%s/edit"</script>' % (
                    self.absolute_url()))
        self.REQUEST.response.write(
            '<script>document.location.href="%s/view"</script>' % (
                self.absolute_url()))

    def at_post_edit_script(self):
        workflow = getToolByName(self, 'portal_workflow')
        trans_ids = [t['id'] for t in workflow.getTransitionsFor(self)]
        if 'validate' in trans_ids:
            workflow.doActionFor(self, 'validate')

    def workflow_script_import(self):
        """Create objects from valid ARImport
        """
        bsc = getToolByName(self, 'bika_setup_catalog')
        client = self.aq_parent

        title = _('Submitting Sample Import')
        description = _('Creating and initialising objects')
        bar = ProgressBar(self, self.REQUEST, title, description)
        notify(InitialiseProgressBar(bar))

        profiles = [x.getObject() for x in bsc(portal_type='AnalysisProfile')]

        gridrows = self.schema['SampleData'].get(self)
        row_cnt = 0
        for therow in gridrows:
            row = deepcopy(therow)
            row_cnt += 1

            # Profiles are titles, profile keys, or UIDS: convert them to UIDs.
            newprofiles = []
            for title in row['Profiles']:
                objects = [x for x in profiles
                           if title in (x.getProfileKey(), x.UID(), x.Title())]
                for obj in objects:
                    newprofiles.append(obj.UID())
            row['Profiles'] = newprofiles

            # Same for analyses
            newanalyses = set(self.get_row_services(row) +
                              self.get_row_profile_services(row))
            # get batch
            batch = self.schema['Batch'].get(self)
            if batch:
                row['Batch'] = batch.UID()
            # Add AR fields from schema into this row's data
            row['ClientReference'] = self.getClientReference()
            row['ClientOrderNumber'] = self.getClientOrderNumber()
            contact_uid =\
                self.getContact().UID() if self.getContact() else None
            row['Contact'] = contact_uid
            # Creating analysis request from gathered data
            ar = create_analysisrequest(
                client,
                self.REQUEST,
                row,
                analyses=list(newanalyses),)

            # progress marker update
            progress_index = float(row_cnt) / len(gridrows) * 100
            progress = ProgressState(self.REQUEST, progress_index)
            notify(UpdateProgressEvent(progress))

        # document has been written to, and redirect() fails here
        self.REQUEST.response.write(
            '<script>document.location.href="%s"</script>' % (
                self.absolute_url()))

    def get_header_values(self):
        """Scrape the "Header" values from the original input file
        """
        lines = self.getOriginalFile().data.splitlines()
        reader = csv.reader(lines)
        header_fields = header_data = []
        for row in reader:
            if not any(row):
                continue
            if row[0].strip().lower() == 'header':
                header_fields = [x.strip() for x in row][1:]
                continue
            if row[0].strip().lower() == 'header data':
                header_data = [x.strip() for x in row][1:]
                break
        if not (header_data or header_fields):
            return None
        if not (header_data and header_fields):
            self.error("File is missing header row or header data")
            return None
        # inject us out of here
        values = dict(zip(header_fields, header_data))
        # blank cell from sheet will probably make it in here:
        if '' in values:
            del (values[''])
        return values

    def save_header_data(self):
        """Save values from the file's header row into their schema fields.
        """
        client = self.aq_parent

        headers = self.get_header_values()
        if not headers:
            return False

        # Plain header fields that can be set into plain schema fields:
        for h, f in [
            ('File name', 'Filename'),
            ('No of Samples', 'NrSamples'),
            ('Client name', 'ClientName'),
            ('Client ID', 'ClientID'),
            ('Client Order Number', 'ClientOrderNumber'),
            ('Client Reference', 'ClientReference')
        ]:
            v = headers.get(h, None)
            if v:
                field = self.schema[f]
                field.set(self, v)
            del (headers[h])

        # Primary Contact
        v = headers.get('Contact', None)
        contacts = [x for x in client.objectValues('Contact')]
        contact = [c for c in contacts if c.Title() == v]
        if contact:
            self.schema['Contact'].set(self, contact)
        else:
            self.error("Specified contact '%s' does not exist; using '%s'"%
                       (v, contacts[0].Title()))
            self.schema['Contact'].set(self, contacts[0])
        del (headers['Contact'])

        # CCContacts
        field_value = {
            'CCNamesReport': '',
            'CCEmailsReport': '',
            'CCNamesInvoice': '',
            'CCEmailsInvoice': ''
        }
        for h, f in [
            # csv header name      DataGrid Column ID
            ('CC Names - Report', 'CCNamesReport'),
            ('CC Emails - Report', 'CCEmailsReport'),
            ('CC Names - Invoice', 'CCNamesInvoice'),
            ('CC Emails - Invoice', 'CCEmailsInvoice'),
        ]:
            if h in headers:
                values = [x.strip() for x in headers.get(h, '').split(",")]
                field_value[f] = values if values else ''
                del (headers[h])
        self.schema['CCContacts'].set(self, [field_value])

        if headers:
            unexpected = ','.join(headers.keys())
            self.error("Unexpected header fields: %s" % unexpected)

    def get_sample_values(self):
        """Read the rows specifying Samples and return a dictionary with
        related data.

        keys are:
            headers - row with "Samples" in column 0.  These headers are
               used as dictionary keys in the rows below.
            prices - Row with "Analysis Price" in column 0.
            total_analyses - Row with "Total analyses" in column 0
            price_totals - Row with "Total price excl Tax" in column 0
            samples - All other sample rows.

        """
        res = {'samples': []}
        lines = self.getOriginalFile().data.splitlines()
        reader = csv.reader(lines)
        next_rows_are_sample_rows = False
        for row in reader:
            if not any(row):
                continue
            if next_rows_are_sample_rows:
                vals = [x.strip() for x in row]
                if not any(vals):
                    continue
                res['samples'].append(zip(res['headers'], vals))
            elif row[0].strip().lower() == 'samples':
                res['headers'] = [x.strip() for x in row]
            elif row[0].strip().lower() == 'analysis price':
                res['prices'] = \
                    zip(res['headers'], [x.strip() for x in row])
            elif row[0].strip().lower() == 'total analyses':
                res['total_analyses'] = \
                    zip(res['headers'], [x.strip() for x in row])
            elif row[0].strip().lower() == 'total price excl tax':
                res['price_totals'] = \
                    zip(res['headers'], [x.strip() for x in row])
                next_rows_are_sample_rows = True
        return res

    def save_sample_data(self):
        """Save values from the file's header row into the DataGrid columns
        after doing some very basic validation
        """
        bsc = getToolByName(self, 'bika_setup_catalog')
        keywords = self.bika_setup_catalog.uniqueValuesFor('getKeyword')
        profiles = []
        for p in bsc(portal_type='AnalysisProfile'):
            p = p.getObject()
            profiles.append(p.Title())
            profiles.append(p.getProfileKey())

        sample_data = self.get_sample_values()
        if not sample_data:
            return False

        # columns that we expect, but do not find, are listed here.
        # we report on them only once, after looping through sample rows.
        missing = set()

        # This contains all sample header rows that were not handled
        # by this code
        unexpected = set()

        # Save other errors here instead of sticking them directly into
        # the field, so that they show up after MISSING and before EXPECTED
        errors = []

        # This will be the new sample-data field value, when we are done.
        grid_rows = []

        row_nr = 0
        for row in sample_data['samples']:
            row = dict(row)
            row_nr += 1

            # sid is just for referring the user back to row X in their
            # input spreadsheet
            gridrow = {'sid': row['Samples']}
            del (row['Samples'])

            # We'll use this later to verify the number against selections
            if 'Total number of Analyses or Profiles' in row:
                nr_an = row['Total number of Analyses or Profiles']
                del (row['Total number of Analyses or Profiles'])
            else:
                nr_an = 0
            try:
                nr_an = int(nr_an)
            except ValueError:
                nr_an = 0

            # TODO this is ignored and is probably meant to serve some purpose.
            del (row['Price excl Tax'])

            # ContainerType - not part of sample or AR schema
            if 'ContainerType' in row:
                title = row['ContainerType']
                if title:
                    obj = self.lookup(('ContainerType',),
                                      Title=row['ContainerType'])
                    if obj:
                        gridrow['ContainerType'] = obj[0].UID
                del (row['ContainerType'])

            if 'SampleMatrix' in row:
                # SampleMatrix - not part of sample or AR schema
                title = row['SampleMatrix']
                if title:
                    obj = self.lookup(('SampleMatrix',),
                                      Title=row['SampleMatrix'])
                    if obj:
                        gridrow['SampleMatrix'] = obj[0].UID
                del (row['SampleMatrix'])

            # match against sample schema
            for k, v in row.items():
                if k in ['Analyses', 'Profiles']:
                    continue
                if k in sample_schema:
                    del (row[k])
                    if v:
                        try:
                            value = self.munge_field_value(
                                sample_schema, row_nr, k, v)
                            gridrow[k] = value
                        except ValueError as e:
                            errors.append(e.message)

            # match against ar schema
            for k, v in row.items():
                if k in ['Analyses', 'Profiles']:
                    continue
                if k in ar_schema:
                    del (row[k])
                    if v:
                        try:
                            value = self.munge_field_value(
                                ar_schema, row_nr, k, v)
                            gridrow[k] = value
                        except ValueError as e:
                            errors.append(e.message)

            # Count and remove Keywords and Profiles from the list
            gridrow['Analyses'] = []
            for k, v in row.items():
                if k in keywords:
                    del (row[k])
                    if str(v).strip().lower() not in ('', '0', 'false'):
                        gridrow['Analyses'].append(k)
            gridrow['Profiles'] = []
            for k, v in row.items():
                if k in profiles:
                    del (row[k])
                    if str(v).strip().lower() not in ('', '0', 'false'):
                        gridrow['Profiles'].append(k)
            if len(gridrow['Analyses']) + len(gridrow['Profiles']) != nr_an:
                errors.append(
                    "Row %s: Number of analyses does not match provided value" %
                    row_nr)

            grid_rows.append(gridrow)

        self.setSampleData(grid_rows)

        if missing:
            self.error("SAMPLES: Missing expected fields: %s" %
                       ','.join(missing))

        for thing in errors:
            self.error(thing)

        if unexpected:
            self.error("Unexpected header fields: %s" %
                       ','.join(unexpected))

    def get_batch_header_values(self):
        """Scrape the "Batch Header" values from the original input file
        """
        lines = self.getOriginalFile().data.splitlines()
        reader = csv.reader(lines)
        batch_headers = batch_data = []
        for row in reader:
            if not any(row):
                continue
            if row[0].strip().lower() == 'batch header':
                batch_headers = [x.strip() for x in row][1:]
                continue
            if row[0].strip().lower() == 'batch data':
                batch_data = [x.strip() for x in row][1:]
                break
        if not (batch_data or batch_headers):
            return None
        if not (batch_data and batch_headers):
            self.error("Missing batch headers or data")
            return None
        # Inject us out of here
        values = dict(zip(batch_headers, batch_data))
        return values

    def create_or_reference_batch(self):
        """Save reference to batch, if existing batch specified
        Create new batch, if possible with specified values
        """
        client = self.aq_parent
        batch_headers = self.get_batch_header_values()
        if not batch_headers:
            return False
        # if the Batch's Title is specified and exists, no further
        # action is required. We will just set the Batch field to
        # use the existing object.
        batch_title = batch_headers.get('title', False)
        if batch_title:
            existing_batch = [x for x in client.objectValues('Batch')
                              if x.title == batch_title]
            if existing_batch:
                self.setBatch(existing_batch[0])
                return existing_batch[0]
        # If the batch title is specified but does not exist,
        # we will attempt to create the batch now.
        if 'title' in batch_headers:
            if 'id' in batch_headers:
                del (batch_headers['id'])
            if '' in batch_headers:
                del (batch_headers[''])
            batch = _createObjectByType('Batch', client, tmpID())
            batch.processForm()
            batch.edit(**batch_headers)
            self.setBatch(batch)

    def munge_field_value(self, schema, row_nr, fieldname, value):
        """Convert a spreadsheet value into a field value that fits in
        the corresponding schema field.
        - boolean: All values are true except '', 'false', or '0'.
        - reference: The title of an object in field.allowed_types;
            returns a UID or list of UIDs
        - datetime: returns a string value from ulocalized_time

        Though this is only used during "Saving" of csv data into schema
        fields, it will flag 'validation' errors, as this is the only chance
        we will get to complain about these field values.

        """
        field = schema[fieldname]
        if field.type == 'boolean':
            value = str(value).strip().lower()
            value = '' if value in ['0', 'no', 'false', 'none'] else '1'
            return value
        # View Code Duplication flagged here: this block seems to be
        # duplicated elsewhere in the project (cf. validate_against_schema).
        if field.type == 'reference':
            value = str(value).strip()
            brains = self.lookup(field.allowed_types, Title=value)
            if not brains:
                brains = self.lookup(field.allowed_types, UID=value)
            if not brains:
                raise ValueError('Row %s: value is invalid (%s=%s)' % (
                    row_nr, fieldname, value))
            if field.multiValued:
                return [b.UID for b in brains] if brains else []
            else:
                return brains[0].UID if brains else None
        if field.type == 'datetime':
            try:
                value = DateTime(value)
                return ulocalized_time(
                    value, long_format=True, time_only=False, context=self)
            except:
                raise ValueError('Row %s: value is invalid (%s=%s)' % (
                    row_nr, fieldname, value))
        return str(value)

    def validate_headers(self):
        """Validate header fields from schema
        """

        pc = getToolByName(self, 'portal_catalog')
        pu = getToolByName(self, "plone_utils")

        client = self.aq_parent

        # Verify Client Name
        if self.getClientName() != client.Title():
            self.error("%s: value is invalid (%s)." % (
                'Client name', self.getClientName()))

        # Verify Client ID
        if self.getClientID() != client.getClientID():
            self.error("%s: value is invalid (%s)." % (
                'Client ID', self.getClientID()))

        existing_arimports = pc(portal_type='ARImport',
                                review_state=['valid', 'imported'])
        # Verify Client Order Number
        for arimport in existing_arimports:
            if arimport.UID == self.UID() \
                    or not arimport.getClientOrderNumber():
                continue
            arimport = arimport.getObject()

            if arimport.getClientOrderNumber() == self.getClientOrderNumber():
                self.error('%s: already used by existing ARImport.' %
                           'ClientOrderNumber')
                break

        # Verify Client Reference
        for arimport in existing_arimports:
            if arimport.UID == self.UID() \
                    or not arimport.getClientReference():
                continue
            arimport = arimport.getObject()
            if arimport.getClientReference() == self.getClientReference():
                self.error('%s: already used by existing ARImport.' %
                           'ClientReference')
                break

        # getCCContacts has no value if object is not complete (eg during test)
        if self.getCCContacts():
            cc_contacts = self.getCCContacts()[0]
            contacts = [x for x in client.objectValues('Contact')]
            contact_names = [c.Title() for c in contacts]
            # validate Contact existence in this Client
            for k in ['CCNamesReport', 'CCNamesInvoice']:
                for val in cc_contacts[k]:
                    if val and val not in contact_names:
                        self.error('%s: value is invalid (%s)' % (k, val))
        else:
            cc_contacts = {'CCNamesReport': [],
                           'CCEmailsReport': [],
                           'CCNamesInvoice': [],
                           'CCEmailsInvoice': []
                           }
            # validate Contact existence in this Client
            for k in ['CCEmailsReport', 'CCEmailsInvoice']:
                for val in cc_contacts.get(k, []):
                    if val and not pu.validateSingleNormalizedEmailAddress(val):
                        self.error('%s: value is invalid (%s)' % (k, val))

    def validate_samples(self):
        """Scan through the SampleData values and make sure
        that each one is correct
        """

        bsc = getToolByName(self, 'bika_setup_catalog')
        keywords = bsc.uniqueValuesFor('getKeyword')
        profiles = []
        for p in bsc(portal_type='AnalysisProfile'):
            p = p.getObject()
            profiles.append(p.Title())
            profiles.append(p.getProfileKey())

        row_nr = 0
        for gridrow in self.getSampleData():
            row_nr += 1

            # validate against sample and ar schemas
            for k, v in gridrow.items():
                if k in ['Analysis', 'Profiles']:
                    break
                if k in sample_schema:
                    try:
                        self.validate_against_schema(
                            sample_schema, row_nr, k, v)
                        continue
                    except ValueError as e:
                        self.error(e.message)
                        break
                if k in ar_schema:
                    try:
                        self.validate_against_schema(
                            ar_schema, row_nr, k, v)
                    except ValueError as e:
                        self.error(e.message)

            an_cnt = 0
            for v in gridrow['Analyses']:
                if v and v not in keywords:
                    self.error("Row %s: value is invalid (%s=%s)" %
                               ('Analysis keyword', row_nr, v))
                else:
                    an_cnt += 1
            for v in gridrow['Profiles']:
                if v and v not in profiles:
                    self.error("Row %s: value is invalid (%s=%s)" %
                               ('Profile Title', row_nr, v))
                else:
                    an_cnt += 1
            if not an_cnt:
                self.error("Row %s: No valid analyses or profiles" % row_nr)

    def validate_against_schema(self, schema, row_nr, fieldname, value):
        """Validate a single spreadsheet value against its schema field,
        raising ValueError (with the row number) if it is invalid.
        """
        field = schema[fieldname]
        if field.type == 'boolean':
            value = str(value).strip().lower()
            return value
        # View Code Duplication flagged here: this block seems to be
        # duplicated elsewhere in the project (cf. munge_field_value).
        if field.type == 'reference':
            value = str(value).strip()
            if field.required and not value:
                raise ValueError("Row %s: %s field requires a value" % (
                    row_nr, fieldname))
            if not value:
                return value
            brains = self.lookup(field.allowed_types, UID=value)
            if not brains:
                raise ValueError("Row %s: value is invalid (%s=%s)" % (
                    row_nr, fieldname, value))
            if field.multiValued:
                return [b.UID for b in brains] if brains else []
            else:
                return brains[0].UID if brains else None
        if field.type == 'datetime':
            try:
                ulocalized_time(DateTime(value), long_format=True,
                                time_only=False, context=self)
            except:
                raise ValueError('Row %s: value is invalid (%s=%s)' % (
                    row_nr, fieldname, value))
        return value

    def lookup(self, allowed_types, **kwargs):
        """Lookup an object of type (allowed_types).  kwargs is sent
        directly to the catalog.
        """
        at = getToolByName(self, 'archetype_tool')
        for portal_type in allowed_types:
            catalog = at.catalog_map.get(portal_type, [None])[0]
            catalog = getToolByName(self, catalog)
            kwargs['portal_type'] = portal_type
            brains = catalog(**kwargs)
            if brains:
                return brains

    def get_row_services(self, row):
        """Return a list of services which are referenced in Analyses.
        values may be UID, Title or Keyword.
        """
        bsc = getToolByName(self, 'bika_setup_catalog')
        services = set()
        for val in row.get('Analyses', []):
            brains = bsc(portal_type='AnalysisService', getKeyword=val)
            if not brains:
                brains = bsc(portal_type='AnalysisService', title=val)
            if not brains:
                brains = bsc(portal_type='AnalysisService', UID=val)
            if brains:
                services.add(brains[0].UID)
            else:
                self.error("Invalid analysis specified: %s" % val)
        return list(services)

    def get_row_profile_services(self, row):
        """Return a list of services which are referenced in profiles
        values may be UID, Title or ProfileKey.
        """
        bsc = getToolByName(self, 'bika_setup_catalog')
        services = set()
        profiles = [x.getObject() for x in bsc(portal_type='AnalysisProfile')]
        for val in row.get('Profiles', []):
            objects = [x for x in profiles
                       if val in (x.getProfileKey(), x.UID(), x.Title())]
            if objects:
                for service in objects[0].getService():
                    services.add(service.UID())
            else:
                self.error("Invalid profile specified: %s" % val)
        return list(services)

    def get_row_container(self, row):
        """Return a sample container
        """
        bsc = getToolByName(self, 'bika_setup_catalog')
        val = row.get('Container', False)
        if val:
            brains = bsc(portal_type='Container', UID=row['Container'])
            if brains:
                brains[0].getObject()
            brains = bsc(portal_type='ContainerType', UID=row['Container'])
            if brains:
                # XXX Cheating.  The calculation of capacity vs. volume is not done.
                return brains[0].getObject()
        return None

    def get_row_profiles(self, row):
        bsc = getToolByName(self, 'bika_setup_catalog')
        profiles = []
        for profile_title in row.get('Profiles', []):
            profile = bsc(portal_type='AnalysisProfile', title=profile_title)
            profiles.append(profile)
        return profiles

    def Vocabulary_SamplePoint(self):
        vocabulary = CatalogVocabulary(self)
        vocabulary.catalog = 'bika_setup_catalog'
        folders = [self.bika_setup.bika_samplepoints]
        if IClient.providedBy(self.aq_parent):
            folders.append(self.aq_parent)
        return vocabulary(allow_blank=True, portal_type='SamplePoint')

    def Vocabulary_SampleMatrix(self):
        vocabulary = CatalogVocabulary(self)
        vocabulary.catalog = 'bika_setup_catalog'
        return vocabulary(allow_blank=True, portal_type='SampleMatrix')

    def Vocabulary_SampleType(self):
        vocabulary = CatalogVocabulary(self)
        vocabulary.catalog = 'bika_setup_catalog'
        folders = [self.bika_setup.bika_sampletypes]
        if IClient.providedBy(self.aq_parent):
            folders.append(self.aq_parent)
        return vocabulary(allow_blank=True, portal_type='SampleType')

    def Vocabulary_ContainerType(self):
        vocabulary = CatalogVocabulary(self)
        vocabulary.catalog = 'bika_setup_catalog'
        return vocabulary(allow_blank=True, portal_type='ContainerType')

    def error(self, msg):
        errors = list(self.getErrors())
        errors.append(msg)
        self.setErrors(errors)


atapi.registerType(ARImport, PROJECTNAME)