12-28-2018, 12:52 PM
(12-27-2018, 09:25 AM)lightning_fingers Wrote:
in "main()":
{
    set_csr(0x300, 0x00000009);   // enable interrupts?
    write_csr(0x305, my_isr);     // isr address?
    // big long delay
    for (int i = 0; i < 10000; i++) asm("NOP");
}

void __attribute__ ((interrupt())) my_isr ()
{
    *simresult = (unsigned short int) 0xF111;   // flag result
}
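For reference, set_csr() / write_csr() are presumably the usual encoding.h-style CSR helpers; a minimal sketch under that assumption (check your own csr.h/encoding.h for the real definitions):

/* Sketch of typical CSR helper macros (an assumption, not the exact
 * project code). CSR 0x300 = mstatus (0x9 sets interrupt-enable bits),
 * CSR 0x305 = mtvec (trap/interrupt vector base address). */
#define write_csr(reg, val)                                   \
    ({ asm volatile ("csrw " #reg ", %0" :: "r"(val)); })

#define set_csr(reg, bit)                                     \
    ({ unsigned long __tmp;                                   \
       asm volatile ("csrrs %0, " #reg ", %1"                 \
                     : "=r"(__tmp) : "r"(bit));               \
       __tmp; })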
Compiling the quoted code gives me:
00000300 <my_isr>:
300: ff010113 addi sp,sp,-16
304: 00f12423 sw a5,8(sp)
308: 00e12623 sw a4,12(sp)
30c: 40002703 lw a4,1024(zero) # 400 <simresult>
310: 0000f7b7 lui a5,0xf
314: 11178793 addi a5,a5,273 # f111 <_end+0xece5>
318: 00f72023 sw a5,0(a4)
31c: 00c12703 lw a4,12(sp)
320: 00812783 lw a5,8(sp)
324: 01010113 addi sp,sp,16
328: 10000073 eret
The compiler is inserting the correct ISR handling code. But when I simulate it with the latest RI5CY core I get:
# - DGB - write 0 into CTRL
# 2465870000 ps: Illegal instruction (core 0) at PC 0x00000328:
# 2469762000 ps: Illegal instruction (core 0) at PC 0x00000328:
I've managed to work around this with the version of the toolchain I'm using by hacking the memory image that I read into the memory models in my RTL stimulus environment,
i.e. replacing any eret opcodes (32'h10000073) with mret (32'h30200073).
(The older toolchain still emits the legacy eret encoding, while the current core only accepts the newer mret.)
This is obviously a very short-term fix, but it does at least allow me to continue this particular RTL simulation.
With this hack, the simulation executes as expected.
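In case it helps anyone else, here is a minimal sketch of that patch step, assuming the memory image is a plain $readmemh-style hex file with one 32-bit word per line (the file format and names here are assumptions, not my actual env):

/* patch_eret.c - rewrite a hex memory image, swapping eret for mret.
 * Assumes one 32-bit hex word per line; adapt to your image format. */
#include <stdio.h>

int main(int argc, char **argv)
{
    if (argc != 3) {
        fprintf(stderr, "usage: %s <in.hex> <out.hex>\n", argv[0]);
        return 1;
    }
    FILE *in  = fopen(argv[1], "r");
    FILE *out = fopen(argv[2], "w");
    if (!in || !out) {
        perror("fopen");
        return 1;
    }
    char line[128];
    while (fgets(line, sizeof line, in)) {
        unsigned word;
        if (sscanf(line, "%x", &word) == 1 && word == 0x10000073u)
            fprintf(out, "30200073\n");   /* eret -> mret */
        else
            fputs(line, out);             /* everything else unchanged */
    }
    fclose(in);
    fclose(out);
    return 0;
}

Something like "./patch_eret prog.hex prog_patched.hex" before loading the image does the job; once the toolchain emits mret natively this step goes away.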