Sharing data between PULP and HWPE on HERO
(02-06-2019, 07:47 AM)Adi Wrote: How do I do the same thing in PULPissimo? I see that there is no such constant in PULPissimo. Do I change TCDM_END_ADDRESS in

I think that should work, but I am not very familiar with PULPissimo. If it does not work, please open a new thread on this so we can direct the attention of the responsible people there.
(01-08-2019, 09:57 AM)akurth Wrote: What second toolchain did you use? The PULP SDK deliberately only implements a subset of the standard C library, so unless the other toolchain and the Fortran library are similarly specialized, a lot of mismatches can happen. Do you need the functions that are missing, or could you reduce the Fortran library to the required minimum and thus work around the missing functions?

I don't think the missing functions are needed for BLAS, so in theory I could reduce the Fortran library so that it doesn't use functions that are missing from the standard C library.
The problem is that I'm not sure how to do either step: specialize the other toolchain or reduce the Fortran library.
I've also tried to add Fortran to the hero-sdk toolchain build, but I couldn't get it to produce the required libgfortran.a.

Could you help with one of the methods above?
How can I specialize the other toolchain and its Fortran library to prevent these mismatches?
Alternatively, how can I add the libgfortran.a compilation to the hero-sdk toolchain?


Edit: I found a workaround: I now use a library that doesn't include Fortran code. Thanks anyway!

Initially I used the TCDM to share data between PULP and a HWPE in bigpulp, but I couldn't make the TCDM large enough because of FPGA BRAM and routing limitations.
I understand that I can also use the rt_alloc.h API to share data between PULP and the HWPE, is that right?
If so, how can I do it? I tried using the API, but the HWPE doesn't read the expected data, so clearly I'm not using it correctly. Is there a working example somewhere?
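For reference, the pattern I tried looks roughly like the sketch below. This is a hedged sketch, not a verified working example: it assumes `rt_alloc()`/`rt_free()` and `RT_ALLOC_L2_CL_DATA` from pulp-rt's allocator, that the HWPE masters a port that sees the same address map as the core (so a raw pointer value is meaningful to it), and `HWPE_ADDR_REG` is a hypothetical placeholder for the accelerator's input-pointer config register.

```c
#include "rt/rt_api.h"

#define HWPE_BASE     0x1B201000          /* placeholder: your HWPE's config base   */
#define HWPE_ADDR_REG (HWPE_BASE + 0x20)  /* placeholder: input-pointer register    */

int run_hwpe_job(int n)
{
  /* Allocate the shared buffer in L2 instead of the (too small) TCDM. */
  uint32_t *buf = rt_alloc(RT_ALLOC_L2_CL_DATA, n * sizeof(uint32_t));
  if (buf == NULL) return -1;

  for (int i = 0; i < n; i++) buf[i] = i;  /* fill input data */

  /* Hand the buffer's address to the accelerator. Note: if the HWPE can only
   * reach the TCDM interconnect, an L2 pointer will read garbage, which would
   * match the symptom described above. */
  *(volatile uint32_t *)HWPE_ADDR_REG = (uint32_t)buf;

  /* ... trigger the job and wait for completion via the HWPE event ... */

  rt_free(RT_ALLOC_L2_CL_DATA, buf, n * sizeof(uint32_t));
  return 0;
}
```

Whether the HWPE can actually dereference L2 addresses depends on which interconnect ports it masters in your bigpulp configuration, so that is the first thing to check.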
