
Thread: FSAE Data Swap Consortium

  1. #1
    Senior Member
    Join Date
    Aug 2011
    Brisbane, Australia

    FSAE Data Swap Consortium

    Here at UQ another team member and I have been discussing the idea of a data swap consortium. The idea is that participating teams would provide data captured by their DAQ system during competition weekends and any relevant testing, and in return gain access to all the data uploaded by the other participating teams. In its simplest form it would be a read-only Dropbox of data from all participating teams, with teams invited to the folder as they handed their data to whoever was managing it at the time. I'd like to think that in the spirit of collaboration people would be willing to share data with other teams in the hope of being able to benchmark different concepts against each other and gain a better picture of the performance of the field and certain trends within it, e.g. 10" vs 13" wheels and tyres.

    I think a new Dropbox folder each year would be advisable, as it requires teams to contribute a new dataset to continue to receive new data from other teams. I envisage that certain requirements would need to be met in order to join the consortium, not to exclude younger teams but to make each team's data useful to an outsider. For example, each team must submit a "spec sheet" and a series of photos of the car the data was captured from along with their data. This way we have some idea of the car the data came from, and the data has some relevance and usefulness. They would also provide the data in an organised fashion, with notes describing the type of testing, weather, surface conditions (dusty, bumpy, etc.) and any quirks that need to be known about a set of test data (e.g. a certain sensor was calibrated incorrectly but can be corrected by doing x and y).

    One of the challenges I can see is the difference in each team's DAQ capability. Some teams run extensive data acquisition setups whilst other teams don't have the budget to do so. I'm not keen on the idea of setting a minimum required set of channels, as I'd like the teams with less capable DAQ systems to be able to participate as well. However, I would like to avoid a situation where a team tries to gain access to everyone else's data by providing only a very limited subset of what they actually capture. I would hope that the more developed teams could see this opportunity as a great way to help younger teams develop rather than a place where their secrets may be revealed. I'm not entirely sure of the best way to tackle this at the moment, except by having faith in people to do the right thing.

    The other challenge I can see is the different formats provided by different DAQ systems. MoTeC is seemingly very common amongst teams at my competition (FSAE Australasia) but I have no doubt there are plenty of teams out there on other systems such as AIM, Bosch, Racepak or even their own systems. Most of the time this isn't an issue as the software to view the data is free but it would become an issue if there were teams using formats that only work with paid software.

    The aim of this post is to gauge feedback. I'm curious to hear other people's ideas about how something like this could run and whether their teams would be interested in participating in such a consortium.

  2. #2
    Solid idea.

    Formats are workable - just export everything to CSV.
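
    A minimal sketch of what a common CSV export could look like, with metadata rows so units and sample rate travel with the channels. The channel names, values, and the `# sample_rate_hz` convention here are all assumptions for illustration, not an existing MoTeC/AIM export format:

    ```python
    import csv

    # Hypothetical logged channels: name -> (unit, samples at a fixed rate).
    # Names and values are made up for illustration.
    channels = {
        "Ground Speed": ("km/h", [62.1, 63.4, 64.0]),
        "Lat Accel":    ("g",    [1.21, 1.25, 1.23]),
    }
    sample_rate_hz = 100

    with open("run_data.csv", "w", newline="") as f:
        w = csv.writer(f)
        # Metadata first, so the file is self-describing.
        w.writerow(["# sample_rate_hz", sample_rate_hz])
        w.writerow(list(channels))                            # channel names
        w.writerow([unit for unit, _ in channels.values()])   # units row
        # One row per sample instant, columns in channel order.
        for row in zip(*(samples for _, samples in channels.values())):
            w.writerow(row)
    ```

    Any logger's native export could be converted to a layout like this once, after which every team reads the same thing regardless of DAQ vendor.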

    You'll probably want to put together a list of checkout tests as a metadata requirement: things like line noise testing etc. This would also add to knowledge by defining/refining good processes in addition to data quality.

    A small database of sensors and sensor performance should similarly go with this: teams should be required to upload specs for every sensor and acquisition unit used. Many students don't understand when they need resolution, accuracy, response or any combination thereof. Contrary to believers that data is everything - data is useless without context.
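
    As a sketch of the kind of per-channel record such a sensor database might hold (the field names are illustrative, not a proposed standard):

    ```python
    from dataclasses import dataclass

    @dataclass
    class SensorSpec:
        """Minimal metadata a consortium could require per logged channel.
        Field names and the example values are illustrative only."""
        channel: str          # logged channel name
        make_model: str       # sensor part number
        meas_range: str       # measurement range, e.g. "+/-5 g"
        resolution: str       # smallest distinguishable step
        accuracy: str         # worst-case error over the range
        bandwidth_hz: float   # usable frequency response
        sample_rate_hz: float # logging rate
        notes: str = ""       # calibration quirks, mounting location, etc.

    specs = [
        SensorSpec("Lat Accel", "ExampleCo AXL-5", "+/-5 g", "0.005 g",
                   "+/-1% FS", 50.0, 200.0,
                   notes="Mounted 150 mm aft of CG on chassis floor"),
    ]
    ```

    Forcing teams to fill in every field is itself educational: anyone who can't state their sensor's bandwidth or mounting location learns what they've been ignoring.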

    You'll need an administrator and I'd suggest that only contributing teams can view, or that not-contributing teams contribute in other ways ($ for site maintenance, etc).

    Bonus points for those thinking a good idea might involve expanding this to collaborative MATLAB libraries for further processing; this could be expressed as a code repository under SVN or similar. Super bonus points for expanding that project into a collaborative vehicle simulator.

    Fully supportive. Go for it.

  3. #3
    Senior Member
    Join Date
    Mar 2008
    Brighton, MI

    Shared Data Library

    Although this may seem like a good or even great idea, let me offer you some advice on why it's not very practical or useful.

    1) Professional companies have trouble with shared data libraries even within their own house. There are generations of procedures, transducers, transducer locations, units, signal conditioners, sample rates, sign conventions, channels, purposes, titles and labelling. Even channel names.

    2) Test data is generally considered to be proprietary. Findings or weaknesses in vehicle characteristics are not something to be found out by other, better, or smarter teams.

    3) Testing usually involves some type of procedure (let's say an ISO procedure, for example). Some drivers or operators, or special input requirements, may or may not be satisfactory for producing the metrics the procedure is intended to produce. ("You call that a step input??" or "You are supposed to be on the racing line.")

    4) Sometimes data is broadcast out of the cars in real time. This allows you to sniff out developing issues as they happen. Do you want your competitors to learn more about your car than you are staffed or equipped to learn ?

    5) Who will have the keys to this data and where will it reside ? You'll need a good, reliable inexpensive relational database system to hold it. I'm sure there will be some team that feels that 1024 channels at 1000 scans per second seems reasonable.

    6) The process will snowball into a DAQ fest as teams beg for more information because their competitors have it. Think driver's seat butt temperature is useful ? Well the U of $ has it, so it must be necessary...

    7) Testing and tracking usually aligns itself to simulation. Do teams have simulations of the track or testing events so they can be compared? You'll need to store sim data, too, because that's what the pros do. That's also how you learn and fine-tune the models.

    I'm sure there are more reasons for and against. I only offer my advice based on experience. The proprietary nature of findings (steering efforts, steering gains, understeer, yaw damping, ride frequencies, roll rates, tire pressures, fuel usage rate, driver steer velocity and many others) means these are not parameters you want to share with your competitors.

  4. #4
    Senior Member
    Join Date
    Sep 2002
    Perth, Western Australia

    About the only thing I dislike about running FSAE as a competition is that while encouraging excellence it also promotes a proprietary mindset. Something that is very important for business, but not too helpful when trying to educate. Ideally we would increase the sharing of design information and outcomes for the benefit of the industry that will employ these students.


  5. #5

    I don't share client data.

    FSAE is education however. Very different scenario.

    Anything that raises awareness of proper validation and appropriate procedures would be one of the most significant and educational things a community-based effort could offer those involved.

    Resource limitations will restrict much of what you've suggested might happen. I've no doubt there's room for the fools suggested in your advice; however, there'd be hope that smarter players would do the opposite.

    As mentioned earlier agree with the idea of a collaborative simulator.

    Would remind all that this needs some strict constructs for access, limitations and reproduction. Usual stuff for open resources.

  6. #6
    In short, Bill, your concerns have some merit - but the questions you raise are precisely why there's value in an effort that brings these issues to light.

  7. #7
    I like this idea

    I understand about proprietary data - but I'm pretty sure the benefit of this exercise isn't in the data sharing itself, it's in the collaborative model building. BillCobb, I would guess you aren't opposed to that exactly - you make some huge contributions to the private TTC forum with regard to modelling.

    Personally, I don't find sorting through mountains of other teams' data all that helpful. Their car is either good or bad already, and there's nothing you can do about it. BUT, having a couple of sets of reference data for practice setting up models is an excellent help (Claude used to give out some clean datasets after his seminar; I learned a ton from writing MoTeC channels to play with that data). So, for instance, let's say we think we can estimate slip angle from a gyroscope and a lap beacon. Just because you don't have an optical slip sensor, maybe one of the collected reference datasets has it, and so you can check your filter design on that data before implementing it on your car. It's kind of like trying to copy a paper that validated a model against logged data - except now you have the data too!
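
    A sketch of that kind of check - scoring a filter's estimate against a reference channel from a donated dataset. The numbers and the plain RMS metric are made up for illustration; real validation would use your own data and acceptance criteria:

    ```python
    import math

    def rms_error(estimate, reference):
        """RMS difference between an estimated channel and a reference
        channel sampled at the same instants."""
        return math.sqrt(
            sum((e - r) ** 2 for e, r in zip(estimate, reference))
            / len(reference)
        )

    # Made-up numbers standing in for an optical slip sensor (reference)
    # and a gyro-based estimate you want to validate before trusting it.
    reference = [0.0, 1.2, 2.5, 3.1, 2.8]   # slip angle, deg
    estimate  = [0.1, 1.0, 2.7, 3.0, 2.6]   # your filter's output, deg

    err = rms_error(estimate, reference)
    print(f"RMS error: {err:.3f} deg")
    ```

    The point is that the reference dataset, not the code, is the scarce resource: once one donated run includes the ground-truth channel, any team can score its estimator against it.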

    So maybe this is a shared library of MATLAB code, built around an agreed-upon set of reference datasets?

    Benefits to FS in general:

    1) Students involved in building the models learn a lot
    2) Students who have no data (new teams) can get a feel for the practical limits of FSAE cars
    3) Students start to set practical targets for data channels they need TO USE the models (there is no input for seat-temp in the model)
    4) Riding points 2 and 3, teams will gain an understanding of sensor quality required (slip example: what gyroscopes actually can be measurement-updated once per lap?)

    Unfortunately, very few teams are ever going to know how to steal your yaw damping rates, and fewer still would know what to do with them if they did. And if I shared data, and someone understood how to make use of my yaw damping rates in their design - that's awesome, and congratulations on design finals, ha. And perhaps next year everyone understands it; the competition is a little more competitive and a little smarter for it!

    Short story - I see the benefit is in the collaborative modelling, not necessarily the data-sharing itself.
    Austin G.
    Tech. Director of APEX Pro LLC
    Auburn University FSAE
    War Eagle Motorsports
    Chief Chassis Engineer 2013
    Vehicle Dynamics 2010-2012

  8. #8
    Senior Member
    Join Date
    Mar 2005
    Modena, Italy

    Couple of things to note...

    LatAcc data coming from a sensor in an unknown location on the car is useless for anything but steady state measurements

    Slip angle data coming from a sensor in an unknown location on the car is useless under all conditions

    Damper speed data without a known motion ratio is useless under all conditions
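
    The corrections implied above can be written down directly. A sketch in Python, with sign conventions assumed (check them against your own axis system, and note that motion-ratio definitions vary between teams - which is exactly why these numbers must ship with the data):

    ```python
    def lat_accel_at_cg(a_meas, yaw_rate, yaw_accel, x_fwd, y_left):
        """Translate lateral acceleration measured at a sensor mounted at
        (x_fwd forward, y_left left) of the CG, in metres, back to the CG
        using planar rigid-body kinematics:
            a_meas = a_cg + yaw_accel * x_fwd - yaw_rate**2 * y_left
        Sign conventions are an assumption; verify against your axes.
        Transient terms vanish at steady state (yaw_accel = 0), which is
        why an unknown sensor location still allows steady-state use."""
        return a_meas - yaw_accel * x_fwd + yaw_rate ** 2 * y_left

    def wheel_speed_from_damper(damper_speed, motion_ratio):
        """Convert damper velocity to wheel (vertical) velocity, defining
        motion_ratio = damper travel / wheel travel."""
        return damper_speed / motion_ratio
    ```

    Without `x_fwd`, `y_left`, and `motion_ratio` recorded alongside the logs, none of these conversions can be made by an outsider - hence the spec-sheet requirement.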

  9. #9
    Senior Member
    Join Date
    Mar 2008
    Brighton, MI

    Shared testing

    How would you feel if a test operations group were present at an FSAE meet where there were sufficient lanes and straightaway length so that some handling performance tests could be run for a team? Sure, share the data and it's free. Don't share, and it costs you some beverage funding.

    Since you (hopefully) already have some sort of common steering wheel disconnect(s), it would be very practical for a volunteer squad (maybe even from private industry) to install, calibrate, run a few test procedures, (let's say frequency response or constant radius, max lateral g, static steer effort and overall steer ratio) and deliver raw data with post-processed results in a nice and tidy summary report. Added to a database, teams would swarm all over the population of findings and do their own analysis of what works and what doesn't .

    It's usually the SWA adapter that causes the most problems in getting useful data (I repeat, USEFUL data) in race testing, so a combined common SWA and SWT transducer would be a huge feature. Add some cheap (sorry, I mean inexpensive) speed, yaw and roll velocity transducers and maybe a lateral accelerometer, and you can go play car-to-car with the data.

    Using the ChassisSim procedures to match the data to a reference car model is up to the teams, but that's a really high-value undertaking. All data processing done in Matlab and posted to an Access database while you wait. I'd guess it could be a 2-hour-or-less process per car (probably 1 hr if common power and transducer attachment fixturing is specified and in place). I'm pretty sure this would cover the Best of the Best (BOB) and the Worst of the Worst (WOW) design cases (those are real-world descriptions, folks). A couple of sets of equipment, one semi-pro driver (sorry, pretty shoes don't make you a good test driver) and a take-a-number waiting line is all it takes. Maybe do the top 5 cars or something like that. You decide. BTW: your cars have to run for this, folks.

    There could be some serious profit in this if a group of real race geeks could keep the bottle and can return deposit money for embarking on this undertaking. Maybe Shark Tank possibilities here, too.

    I've attached some sample result pics for a couple of different test procedures. This is sim data, but the processing accommodates actual road test data, too. All you then need is to know what it means or implies.
    Attached Images
    Last edited by BillCobb; 02-04-2015 at 11:02 AM. Reason: spilling korrectnivity

  10. #10
    Senior Member
    Join Date
    Mar 2008
    Palo Alto, CA
    I'd recommend starting with something manageable, like getting a repository of vehicle masses and CG locations together, or getting teams to volunteer sharing their spec sheets.

    Anything more is likely to take time away from you building your own car for the year. But hey, maybe that's ok with your team.
    Formula SAE: When you just can't get rid of a girlfriend.
