Thanks. Not sure what prevents me from arriving at the conclusion you did, but I've been told that I'm somewhat 'black and white' in my approach or analysis of things. OK, so the term was not actually "somewhat". Sometimes it's a blessing...
Anyway, this really looks like we're looking for a way to put lipstick on a pig as the saying goes, because normalization went out the window (assuming it was in the room in the first place). What comes to mind is this:
A 'master' query could call an outer aggregate query (using FIRST as the aggregate function) to get the first B record. An aggregate subquery of that query could get the FIRST A record, and the outer query could add the two values together, using an IIf expression to deal with records that have no pairs. This query would also have to set a flag so that on the next pass it ignores any FIRST record WHERE the flag is True. That record would then be passed up to the 'master' query. However, I'm pretty sure the query/subquery combination would only return one record (a message to that effect would probably pop up on each execution of the main query), so that might blow up the whole idea.
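Just to make the idea concrete, here's a rough sketch of that pairing logic in plain Python rather than Access SQL (which is where it would actually have to live). The function name, record shapes, and values are all invented for illustration; this only shows the FIRST-plus-flag-plus-IIf behavior I'm describing:

```python
def pair_first_records(a_values, b_values):
    """Pair each A record with the FIRST unflagged B record, summing the pair;
    records with no pair keep their own value (the IIf fallback)."""
    results = []
    used_b = [False] * len(b_values)  # the 'flag' so a B record isn't reused
    for a in a_values:
        # find the first B record not yet flagged (the FIRST ... WHERE flag = False idea)
        match = next((i for i, flagged in enumerate(used_b) if not flagged), None)
        if match is None:
            results.append(a)            # no pair left: keep A's value alone
        else:
            used_b[match] = True         # set the flag so the next pass skips it
            results.append(a + b_values[match])
    # any B records that never got a pair are also emitted on their own
    results.extend(b for b, flagged in zip(b_values, used_b) if not flagged)
    return results
```

In a procedural language this is trivial; the problem is that a single Access query has no "next pass", which is exactly why I suspect the query-only version falls apart.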
My focus would first be on attempting to somehow normalize the data. If that isn't possible, the only solution the OP might have a chance of creating and maintaining (and maintainability is a HUGE consideration here) would be a form that lists A records in one listbox and B records in another. The user selects one from each and clicks a button to pair those records. The code could do the math at the same time this "concatenated" record is written to a table. However, neither solution is something I would be willing to create as a volunteer (sorry).
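The button-click handler for that form would be short. Again a hedged sketch in Python standing in for the VBA, with invented field names and a list standing in for the destination table:

```python
paired_table = []  # stands in for the Access table the code would append to

def pair_selected(record_a, record_b):
    """What the button click would do: combine the two selected records,
    do the math, and write the 'concatenated' record out."""
    combined = {
        "a_id": record_a["id"],
        "b_id": record_b["id"],
        "total": record_a["value"] + record_b["value"],  # the math step
    }
    paired_table.append(combined)  # in Access this would be an append to the table
    return combined
```

The real work isn't this handler; it's keeping the two listboxes showing only the not-yet-paired records, which is the maintenance burden I'm warning about.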
BTW, did I ever mention that my wife says I always look at things while wearing my 'complicated' glasses? Hopefully, someone has a far simpler solution that I tend not to see.