Bug #8568

Various fields are duplicated on csv:import if 2 rows have the same legacyId

Added by Mike Gale over 6 years ago. Updated over 5 years ago.

Status: New
Start date: 06/17/2015
Priority: Low
Due date:
Assignee: Mike Gale
% Done: 0%
Category: CSV import
Target version: -
Google Code Legacy ID:
Tested version: 2.3
Sponsored: No
Requires documentation:

Description

This is sort of an edge case, so I'm marking it low priority.

To reproduce:
1. Create a new CSV with all the various columns filled in (especially legacyId) for an archival description on the first row
2. Duplicate this row for the 2nd row
3. Run csv:import on the file (see the sketch below for a minimal example file)
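
For reference, a reproduction file can be generated with a short script like the sketch below. The column names are assumptions based on the standard AtoM archival description CSV template and are not taken from this report; the only thing that matters for triggering the bug is that both rows carry the same legacyId.

# A minimal sketch (not from the report) for building a reproduction CSV.
# Column names are assumed to match the standard AtoM archival description
# CSV import template; only a few are shown here.
import csv

columns = ["legacyId", "title", "scopeAndContent"]  # assumed template columns
row = {
    "legacyId": "1001",  # the same legacyId on both rows triggers the bug
    "title": "Example description",
    "scopeAndContent": "Example scope and content note",
}

with open("duplicate-legacyid.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=columns)
    writer.writeheader()
    writer.writerow(row)  # first row
    writer.writerow(row)  # identical second row with the same legacyId

The resulting duplicate-legacyid.csv is then passed to the csv:import task as usual.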

Result: the csv:import task complains about a duplicate legacyId when it gets to the second row and claims it is skipping that row as a result. In actuality, the record imported from the first row now ends up with duplicates of various fields.

Expected result: when the task says it is skipping a row, it should actually skip it rather than duplicating fields in records that have already been imported.
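
Until the importer genuinely skips the offending row, a rough pre-flight check like the sketch below can flag duplicate legacyIds before csv:import is run. This is a workaround sketch, not part of AtoM, and it assumes the file uses the standard legacyId column.

# Flags rows that share a legacyId so the file can be cleaned up before import.
import csv
from collections import defaultdict

def find_duplicate_legacy_ids(path):
    """Return {legacyId: [line numbers]} for ids that appear more than once."""
    seen = defaultdict(list)
    with open(path, newline="") as f:
        for line_no, row in enumerate(csv.DictReader(f), start=2):  # line 1 is the header
            legacy_id = (row.get("legacyId") or "").strip()
            if legacy_id:
                seen[legacy_id].append(line_no)
    return {lid: lines for lid, lines in seen.items() if len(lines) > 1}

if __name__ == "__main__":
    for legacy_id, lines in find_duplicate_legacy_ids("duplicate-legacyid.csv").items():
        print(f"legacyId {legacy_id} appears on lines {lines}")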

duplicatefields.png (48.5 KB) Mike Gale, 06/17/2015 03:35 PM

History

#1 Updated by Tim Hutchinson over 6 years ago

The same thing happens on a subsequent import with the same source-name and legacyID. In that case I was hoping the record would be replaced rather than the row being skipped.

In addition to the fields in the earlier screenshot, it affects RAD notes and, not surprisingly, basically anything where the field is repeatable. The one exception I noticed was language of description, which seems to have a check for duplicates already.

#2 Updated by Dan Gillean over 5 years ago

  • Target version deleted (Release 2.3.0)
