Access Duplicate Records

MHamid
Hello,

I need some suggestions.

I have a Form that will add records to a table.
These are the fields that will need to be filled using the form:
Entered By
Date Entered
Cycle Month
Report Type
Date Reviewed
Reviewer Type
Reviewer
Reviewer Report Area
Main Section
Topic Section
Ownership-Individual
Ownership-Team
Count
Priority
Approved By
L1 (Quality Category L1)
L2 (Quality Category L2)
L3 (Quality Category L3)
Other (If Other is selected in L3)
Repeat Ask (check box)
Exception (check box)
Notes
Outliers
Additional Outliers

I have a RefID column that is an auto-generated number.
I need to ensure that when a record is added it is not a duplicate. However, among the columns listed above there is no customer name or account number that would make checking for duplicates easy.
Each column on its own can contain repeated values, but the entire row must not be repeated.
How can I prevent duplicates from being added to the table by checking whether all columns match an existing record?


Thank you
 
I have 21 fields. Will each suggestion accommodate 21 values?

Also, how can I go about doing what you're suggesting in SQL?
 

Sorry, is this what you are referring to? A query, created by going into SQL View and typing the code below?
If that is what you mean, how would I incorporate this query into the database?

Code:
select * from QAMaster
where EnteredBy = Me.EnteredBy
and DateEntered = Me.DateEntered
and CycleMonth = Me.CycleMonth
and ReportType = Me.ReportType
and DateReviewed = Me.DateReviewed
and ReviewerType = Me.ReviewerType
and Reviewer = Me.Reviewer
and ReviewerReportArea = Me.ReviewerReportArea
and MainSection = Me.MainSection
and TopicSection = Me.TopicSection
and OwnershipIndividual = Me.Individual
and OwnershipTeam = Me.Team
and Count = Me.txtCount
and Priority = Me.PriorityLevel
and ApprovedBy = Me.Approved
and L1 = Me.L1
and L2 = Me.L2
and L3 = Me.L3
and RepeatAsk = Me.Repeat
and Exception = Me.Exception
and Hyperlink = Me.txtHyperlink
 
That's a very large number of fields to check. Are you sure you will have duplicates across so many fields? And if so, why not just add the duplicate, since you can always filter duplicates out later?


Note: as for the specific answer to your question, you'd have to turn that into runnable SQL text (which means adding quotes around character data, hashes around dates, and so on):

Code:
strSQL = "select * from Table1 where Field1 = " & Me.Ctrl1 & _
         " and Field2 = '" & Me.Ctrl2 & "'" & _
         " and Field3 = #" & Me.Ctrl3 & "#"

That is error-prone at best, so you have to be really careful; with so many fields I think you will have problems.
 
That's the problem I'm facing: each field on its own can be duplicated, but the full record should not be.
It's being requested that we trap the duplicate so that it will not be added to the table. I did read about a query that can remove duplicates, but I don't think my requestor will like that, because we want to run reports on valid data, and any duplicates would give us an inaccurate count. That's why I have to figure out a way to trap it, even if it means writing complicated code.

I will try rewriting the SQL as runnable SQL text. Once that's done, would I need to call it in one of the form's events?
 
To the last question, yes.
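For example, here is a rough sketch of what such a check could look like in the form's BeforeUpdate event, using DCount rather than opening a recordset. The table and control names come from the query you posted; the choice of event and the criteria-building are just an illustration, and only three of the 21 fields are shown - you would extend the criteria string with the rest in the same pattern (single quotes around text, # around dates, nothing around numbers):

Code:
Private Sub Form_BeforeUpdate(Cancel As Integer)
    Dim strCriteria As String

    ' Build the WHERE criteria from the form's controls (only a few fields shown here)
    strCriteria = "EnteredBy = '" & Me.EnteredBy & "'" & _
                  " AND DateEntered = #" & Me.DateEntered & "#" & _
                  " AND [Count] = " & Me.txtCount

    ' DCount returns how many existing rows match; anything above 0 is a duplicate
    If DCount("*", "QAMaster", strCriteria) > 0 Then
        MsgBox "This record already exists.", vbInformation, "Duplicate Entry!"
        Cancel = True   ' cancel the update so the duplicate is not saved
    End If
End Sub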

A bit easier would be to index the table on the fields that cannot be duplicated: you just create a unique index on those fields. Then you only have to trap the error when an insertion fails (which you can test by attempting to insert a duplicate).
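A unique index can be created once with a DDL query (the index name here is just illustrative, and only a handful of fields are listed - note that Access limits an index to a maximum of 10 fields, so a single index cannot cover all 21):

Code:
CREATE UNIQUE INDEX uqQAMasterNoDupes
ON QAMaster (EnteredBy, DateEntered, CycleMonth, ReportType, Reviewer, MainSection, TopicSection);

Once the index exists, a duplicate insert fails with a trappable run-time error (3022, in my experience), so the form only has to handle that error rather than search for matches itself.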

Note that if DateEntered holds the date and time (rather than just the date), then duplicates are very unlikely.

Also, it might be better to think in terms of keys. What is the primary key of this table? It should not be duplicated, and the rest shouldn't matter, since by definition the other fields depend on the primary key, if the key is properly defined anyway (a big if, since you probably won't know whether that's true). If all of these fields make up your primary key, you should create that primary key, which also solves the problem, since a primary key is by definition a unique index.
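If you go the primary-key route, it can also be declared with DDL. This sketch assumes the table does not already have a primary key on RefID (if it does, you would keep that and use a unique index instead), and the same 10-field limit applies:

Code:
ALTER TABLE QAMaster
ADD CONSTRAINT pkQAMaster PRIMARY KEY (EnteredBy, DateEntered, CycleMonth, ReportType, Reviewer);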
 
Below is a sample of the data to be recorded. The columns in red will need to be updated for every record; the columns not in red can be duplicated (ignore RefID). The DateEntered column is just a short date with no time stamp.

If I make the fields in red part of the primary key, doesn't that mean those fields can never repeat within their own columns at all?
RefID | EnteredBy | DateEntered | CycleMonth | ReportType | DateReviewed | ReviewerType | Reviewer | ReviewerReportArea | MainSection | TopicSection | OwnershipIndividual | OwnershipTeam | Count | Priority | ApprovedBy | L1 | L2 | L3 | Other | RepeatAsk | Exception | Notes | Outliers | AdditionalOutlier | Hyperlink
113 | mh15601 | 1/9/2018 | November | CBNA MOR | 12/22/2017 | Mgr Feedback | Nick Woody | Regulatory | Appendix Section - O&T Risk and Control – CBNA Technology Risk Appetite – Key Indicators (KIs) Additional Operational Risk Metrics | Technology Risk Appetite – Key Indicators (KIs) - KIs Monitored via Trending | David Harrington | ACM-ILC MGR | 3 | P1 | | Data_Accuracy | Misclassification_of_Category | Issue Severity Level | | FALSE | FALSE | | | | \\Namicgdfs\gco_home$\RUT\mh15601\CAP_Profile\Desktop\Projects\QA Database\Section Levels.xlsx#\\Namicgdfs\gco_home$\RUT\mh15601\CAP_Profile\Desktop\Projects\QA Database\Section Levels.xlsx#
115 | mh15601 | 1/9/2018 | March | CBNA MOR | 1/1/2018 | Mgr Feedback | Andre Batista | Regulatory | Appendix Section - O&T Risk and Control – CBNA Technology Risk Appetite – Key Indicators (KIs) Additional Operational Risk Metrics | Technology Risk Appetite – Key Indicators (KIs) - KIs Monitored via Trending | David Harrington | ACM-ILC MGR | 1 | P3 | | Formatting | Reporting_Enhancement_Request | Reporting Enhancement Request - New Bullet Category | | FALSE | FALSE | | | | \\Namicgdfs\gco_home$\RUT\mh15601\CAP_Profile\Desktop\Projects\QA Database\Section Levels.xlsx#\\Namicgdfs\gco_home$\RUT\mh15601\CAP_Profile\Desktop\Projects\QA Database\Section Levels.xlsx#
122 | mh15601 | 1/9/2018 | November | CIOC | 1/1/2018 | Mgr Feedback | Nick Woody | Self Identified | Appendices - Regulatory Examinations / Activities Impacting Technology | Regulatory Examinations / Activities Impacting Technology - Report Pending | Michiko | REG-RAM | 2 | P3 | | Formatting | Reporting_Enhancement_Request | Retire Slide | | FALSE | FALSE | | | | \\Namicgdfs\gco_home$\RUT\mh15601\CAP_Profile\Desktop\Projects\QA Database\Outline - Breakdown.xlsx#\\Namicgdfs\gco_home$\RUT\mh15601\CAP_Profile\Desktop\Projects\QA Database\Outline - Breakdown.xlsx#
125 | mh15601 | 1/10/2018 | December | OTRC MOR | 1/1/2018 | Mgr Feedback | Nick Woody | Regulatory | Appendices - Regulatory – Additional Topics | Global Regulatory Issues – Aging Issues* / MRAs | Mike Billups | ACM-Metrics | 1 | P3 | | Consistency | Prior_Current_Comparison | Other | testing | FALSE | FALSE | | | | \\Namicgdfs\gco_home$\RUT\mh15601\CAP_Profile\Desktop\Projects\QA Database\Category levels.xlsx#\\Namicgdfs\gco_home$\RUT\mh15601\CAP_Profile\Desktop\Projects\QA Database\Category levels.xlsx#
127 | mh15601 | 1/10/2018 | December | OTRC MOR | 1/1/2018 | Mgr Feedback | Nick Woody | Regulatory | Appendices - Regulatory – Additional Topics | Global Regulatory Issues – Aging Issues* / MRAs | Mike Billups | ACM-Metrics | 2 | P3 | | Formatting | Reporting_Enhancement_Request | Reporting Enhancement Request - Table/Chart Modification | | FALSE | FALSE | | | | \\Namicgdfs\gco_home$\RUT\mh15601\CAP_Profile\Desktop\Projects\QA Database\Category levels.xlsx#\\Namicgdfs\gco_home$\RUT\mh15601\CAP_Profile\Desktop\Projects\QA Database\Category levels.xlsx#
128 | mh15601 | 1/10/2018 | November | OTRC MOR | 1/1/2018 | Mgr Feedback | Nick Woody | Regulatory | Appendices - Regulatory – Additional Topics | Global Regulatory Issues – Aging Issues* / MRAs | Mike Billups | ACM-Metrics | 2 | P1 | | Consistency | Other_Inconsistency | Metric - Detail Mismatch | | FALSE | FALSE | | | | \\Namicgdfs\gco_home$\RUT\mh15601\CAP_Profile\Desktop\Projects\QA Database\Section Levels.xlsx#\\Namicgdfs\gco_home$\RUT\mh15601\CAP_Profile\Desktop\Projects\QA Database\Section Levels.xlsx#
130 | mh15601 | 1/10/2018 | December | CIOC | 1/1/2018 | Mgr Feedback | Nick Woody | Self Identified | Appendices - Regulatory Examinations / Activities Impacting Technology | Regulatory Examinations / Activities Impacting Technology - In Progress Exams | Michiko | REG-RAM | 2 | P3 | | Formatting | Grammar | Bold | | FALSE | FALSE | | | | \\Namicgdfs\gco_home$\RUT\mh15601\CAP_Profile\Desktop\Projects\QA Database\Outline - Breakdown.xlsx#\\Namicgdfs\gco_home$\RUT\mh15601\CAP_Profile\Desktop\Projects\QA Database\Outline - Breakdown.xlsx#
133 | mh15601 | 1/11/2018 | January | CIOC | 1/1/2018 | Mgr Feedback | Nick Woody | Self Identified | Appendices - Regulatory Examinations / Activities Impacting Technology | Regulatory Examinations / Activities Impacting Technology - In Progress Exams | Michiko | REG-RAM | 1 | P1 | | Data_Accuracy | Misclassification_of_Category | Risk_Designation - No Longer At Risk | | FALSE | FALSE | | | | \\Namicgdfs\gco_home$\RUT\mh15601\CAP_Profile\Desktop\Projects\QA Database\Category levels.xlsx#\\Namicgdfs\gco_home$\RUT\mh15601\CAP_Profile\Desktop\Projects\QA Database\Category levels.xlsx#
 
If you create a primary key, then that key (i.e., the combination of whatever fields are in it, whether one or more) will always be unique and cannot be duplicated - the individual fields can still repeat, as long as the full combination does not.

That said, notes should never be part of a primary key - especially since notes could often be blank or null. Primary keys can never be left null.

You can also create unique indexes, which are not technically primary keys but would work similarly. I think that with unique indexes you can allow values to be null, however, which would be a crucial difference.
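For example, Access SQL lets you say how Nulls should be treated when you create the index (names here are illustrative; WITH IGNORE NULL simply leaves rows that have a Null in any of the indexed fields out of the index, so those rows are not checked for uniqueness):

Code:
CREATE UNIQUE INDEX uqQAMasterNoDupes
ON QAMaster (EnteredBy, DateEntered, CycleMonth, ReportType, Reviewer)
WITH IGNORE NULL;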

The only thing that is strange about this is why you would have so many fields that could potentially be duplicated. On a practical level, it would take a lot of work on a user's part to duplicate all of that information - though it all comes back to your data.
 
This is just the type of data that is to be recorded for each QA Finding.
I'm pretty certain that primary key is not going to work for me in this case.

I'll check out unique indexes and see if that will work.
 
Hello,

So I decided to concatenate my fields. I created a Concatenation column in my table and added that field to the form; the txtConcatenation control is bound to it.
I am using the code below in the txtHyperlink field, which is the last field the user is required to update. So far I have had no issues using this method to prevent duplicates.

Code:
Dim DupRecord As String
Dim stLinkCriteria As String

' Value of the bound concatenation control for the record being entered
DupRecord = Me.txtConcatenation.Value
' Criteria used to look for an existing record with the same concatenated value
stLinkCriteria = "Concatenation='" & DupRecord & "'"

' DLookup returns Null when no match exists, so the comparison is only True for a duplicate
If Me.txtConcatenation = DLookup("[Concatenation]", "QAMaster", stLinkCriteria) Then
    Call ClearForm_Click
    MsgBox "Record has already been added." _
        & vbCrLf & "Please verify data and try again.", vbInformation, "Duplicate Entry!"
End If

Does anyone see any issues using this method to prevent duplicates?
 
The principle is sound, so it should work as long as you have done everything properly. At a minimum, you should test your code by attempting to enter a known duplicate record.
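If you want the intent to be a bit more explicit, an equivalent variant of your test (same table, field, and criteria string as your code) is to check whether DLookup found anything at all, since it returns Null when there is no match:

Code:
' Same criteria as in your code; Not IsNull(...) is True only when a matching record exists
If Not IsNull(DLookup("[Concatenation]", "QAMaster", stLinkCriteria)) Then
    Call ClearForm_Click
    MsgBox "Record has already been added." & vbCrLf & _
           "Please verify data and try again.", vbInformation, "Duplicate Entry!"
End If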
 
