Record caching?
Algae
Senior Member Joined: 08 January 2007 Location: United States Status: Offline Points: 217
Posted: 17 November 2010 at 8:08pm
Does anyone know the best way to cache report records in memory so they can be read back into a report control at a later time? Or is it better to write them to disk? Or is either just a bad idea?
I'd like to use virtual mode, but it seems you give up a lot of flexibility doing so (sorting, grouping, etc.). Thanks in advance to anyone who has a suggestion.
Source
Senior Member Joined: 19 June 2006 Status: Offline Points: 103
That is the problem with Virtual Mode... you lose some of the good stuff.
Working with large recordsets is fine, but you don't have those features. Working in "normal mode" with the same large recordsets keeps the good stuff but is too slow. The best of both worlds - not!
Product: Xtreme SuitePro (ActiveX) version 13.1.0
Platform: Windows XP (32bit) - SP 3 Language: Visual Basic 6.0
Source
That is why I did not renew my subscription...
- Virtual mode does not have the same functionality as "normal mode";
- a grid control does not exist;
- report control databinding is very, very slow with large recordsets;
- FlatEdit, date, and text controls are ages behind the ComponentOne ActiveX controls designed so many years ago (those are so simple to use and implement) - e.g. decimal behaviour in a simple control.
... I will probably buy it again :)
Algae
There may be better ways to do this, but since no one is saying, I constructed a fairly simple system using CMemFile, serializing the records through a CArchive with property exchange. It's a lot faster than dredging the records up from SQL again and again. It's the closest thing I've got to virtual records without clobbering the nice sort, group, etc. options.
|
Source
Can you post an example? It would be good for all!
|
Algae
While I can't give out the entire project, the archive construct is fairly simple. "Cache" is a bit of a misnomer because it's not really acting like a true cache.
Note that if you have an enormous number of records, it's going to eat memory proportionally.

// .h file stuff
// oldDB is a string naming the old records file
// newDB is a string naming the new records file
// aSQL is the query command string: SELECT * FROM etc.
void UpdateRecords(CString oldDB, CString newDB, CString aSQL);
CArray<CMemFile*, CMemFile*> cache_files;
void _cache(CMemFile *pFile, BOOL bLoad = FALSE);

// .cpp stuff
// Cache archive function. Note CArchive::store == 0 and CArchive::load == 1,
// so the callers can pass those enum values as the BOOL directly.
void CReportView::_cache(CMemFile *pFile, BOOL bLoad)
{
    CArchive archive(pFile, bLoad);
    CXTPPropExchangeArchive px(archive);
    GetReportCtrl().GetRecords()->DoPropExchange(&px);
    archive.Close();
    pFile->SeekToBegin();
}

// When the program changes database tables, do this.
// For performance testing, a timer is included in the debug build.
// CDataHandler is an instance of your database engine; change it to whatever you use.
void CReportView::UpdateRecords(CString oldDB, CString newDB, CString aSQL)
{
    DWORD nCount = GetReportCtrl().GetRecords()->GetCount();
    CDataHandler dh;

#ifdef _DEBUG
    DWORD dwTime0 = ::GetTickCount();
    DWORD dwTime1 = 0;
#endif

    // If there are no records, we assume the control has never been loaded,
    // so load from the database.
    if (nCount == 0)
    {
        // Using your database engine, send the query and put the results
        // in the report control.
        dh.AddRecords(aSQL, GetReportCtrl());
#ifdef _DEBUG
        dwTime1 = ::GetTickCount();
        TRACE(_T("Load New %s, Operation %.3f sec\n"), newDB, (dwTime1 - dwTime0) / 1000.0);
#endif
        return;
    }

    // If we never change tables, this never happens!
    CMemFile *pCache = NULL;
    BOOL bFound = FALSE;

    // Find the cache file for the old db so we can save its records.
    for (int i = 0; i < cache_files.GetCount(); i++)
    {
        pCache = cache_files.GetAt(i);
        if (pCache->GetFilePath() == oldDB)
        {
            bFound = TRUE;
            break;
        }
    }

    // Make a new memfile if we need one.
    if (!bFound)
    {
        pCache = new CMemFile();
        pCache->SetFilePath(oldDB);
        cache_files.Add(pCache);
    }

    // Clobber the existing memfile and store the current data to it.
    if (pCache != NULL)
    {
        pCache->SetLength(0);
        _cache(pCache, CArchive::store);
    }

    GetReportCtrl().ResetContent(FALSE);
    bFound = FALSE;

    // Find the cache file for the new db and load it up with a batch of records.
    for (int i = 0; i < cache_files.GetCount(); i++)
    {
        pCache = cache_files.GetAt(i);
        if (pCache->GetFilePath() == newDB)
        {
            bFound = TRUE;
            break;
        }
    }

    // No cache found; we have to go to the database again.
    if (!bFound)
    {
        dh.AddRecords(aSQL, GetReportCtrl());
#ifdef _DEBUG
        dwTime1 = ::GetTickCount();
        TRACE(_T("Load New %s, Operation %.3f sec\n"), newDB, (dwTime1 - dwTime0) / 1000.0);
#endif
        return;
    }

    _cache(pCache, CArchive::load);

#ifdef _DEBUG
    dwTime1 = ::GetTickCount();
    TRACE(_T("Load from Cache %s, Operation %.3f sec\n"), newDB, (dwTime1 - dwTime0) / 1000.0);
#endif

    // Get modified or new records from everyone else; deleted records are
    // handled by the storing operation. Set the time stamp for the update.
    CString sTime = m_lastTime.Format();
    aSQL.AppendFormat(_T(" WHERE Created > '%s' OR Modified > '%s'"), sTime, sTime);
    dh.RefreshRecords(aSQL, GetReportCtrl());
}
Source
Thanks.
VB6?
|
Algae
Back to square one with caching.
The property exchange code is too slow to be effective for large record sets; performance degraded significantly as the sets grew. Don't bother with this method.
Algae
I ended up with another solution which is pretty fast.
I load the data into a CXTPReportRecords collection; when I need to set it aside, I put that collection into a CMap keyed by the database table name, then do a lookup when I need it again. I suppose you could use an STL map if you prefer.

// Set a record collection on the control. Add this method to
// cxtpreportcontrol.h (public!).
void SetRecords(CXTPReportRecords* pRecords) { m_pRecords = pRecords; }

// View class stuff

// Record collection map in the .h file.
CMap<CString, LPCTSTR, CXTPReportRecords*, CXTPReportRecords*> recordMap;

// Accessor for a record collection. If it returns FALSE, use your loader
// to get the data straight from the database.
BOOL GetMappedRecords(CString db)
{
    CXTPReportRecords* pRecords;
    if (recordMap.Lookup(db, pRecords))
    {
        GetReportCtrl().SetRecords(pRecords);
        return TRUE;
    }
    return FALSE;
}

// In your view code somewhere:
// Set the data into the map for subsequent retrieval. This is only necessary
// the first time a new table is encountered and you need to save off the
// existing records.
CXTPReportRecords* pRecords = GetReportCtrl().GetRecords();

// Save the old table's data to the map for later lookup.
recordMap.SetAt(oldDB, pRecords);

// Get the existing "new" records.
if (!GetMappedRecords(newDB))
{
    // ... if not in the map, use your loader to get the records here,
    //     and make sure to populate the control (slow!)
}

// Tear down when done, typically in the view destructor.
POSITION pos = recordMap.GetStartPosition();
while (pos != NULL)
{
    CXTPReportRecords* pRecords;
    CString str;
    recordMap.GetNextAssoc(pos, str, pRecords);
    // The report control will clean up its current data collection on its
    // own, so we skip that one.
    if (GetReportCtrl().GetRecords() != pRecords)
    {
        pRecords->RemoveAll();
        delete pRecords;
    }
}
recordMap.RemoveAll();

*Note: In a multi-user environment you have to track record changes and update the records as usual.
*This trades memory for speed.