Channel: EngineerZone : Discussion List - All Communities
Viewing all 51060 articles

Question about LTC3118 Single Power Supply


Hi

 

I have tested the LTC3118 with a single power supply from 3.6 V to 7.2 V; the output voltage is 5 V. I used the VIN1 pin as the voltage source for the converter. My question is where the unused VIN2 pin and the RUN2 pin should be connected. I have connected VIN2 to GND and RUN2 to RUN1 (= VIN1); is that OK?

 

Regards,

Kazu


Interfacing the AD7903 with LabVIEW or MATLAB


Hello,

 

I have assembled a direct-conversion receiver (CN0374 in the Design Center: "RF-to-Bits Solution Offers Precise Phase and Magnitude Data to 6 GHz") using the ADL5380 demodulator, ADA4940 low-noise differential amplifier, AD7903 16-bit ADC, and the SDP-B board for downconverting a 1.7 GHz signal to baseband.

 

I am currently using the AD7903 evaluation software as prescribed on the Design Center page. However, I would like to know if there is any way to interface it with either LabVIEW or MATLAB to automate controlling the ADC for data-acquisition purposes.

 

Thank you

How to use the SYNC pin to synchronize multiple ADCs


Hi,

I am using three dual-channel AD9652 ADCs. I want to know how to use the SYNC pin of these ADCs to synchronize the ADC data outputs. The sampling clock for all ADCs is the same, generated from a single crystal oscillator (same source).

 

Do I need to generate a sync pulse from the FPGA and drive each ADC's SYNC pin simultaneously through separate FPGA pins, or can I simply tie the SYNC pins of all the ADCs together?

 

If the SYNC pulse needs to be generated from the FPGA, how should it be generated?

 

Please let me know how to use the SYNC pin when multiple ADCs are used; I have never used multiple ADCs with the SYNC pin configuration.

ADAU1701 I2S slave input issue


I have finished a PCB application of the ADAU1701: an AUX input and an I2S input, with two I2S outputs to a DAC. A 12.288 MHz MCLK feeds the ADAU1701 MCLKI and the DAC MCLK input. A Wi-Fi module is the I2S master (BCLK, LRCLK, DATA) into the ADAU1701 I2S input. The ADAU1701 BCLK/LRCLK outputs are currently not connected to the input BCLK/LRCLK. I have two issues:

1. When the Wi-Fi I2S is inactive (BCLK/LRCLK/DATA all high), AUX-in mode works (both I2S outputs are active and the DAC has output). But when the Wi-Fi I2S into the ADAU1701 is active (44.1 kHz Fs, BCLK running, DATA present), the ADAU1701 I2S DATA0/DATA1 outputs turn off. If I toggle the Wi-Fi I2S off and on again, the I2S outputs from the ADAU1701 become active again. That is strange.

2. With the ADAU1701 as I2S slave, the incoming BCLK/LRCLK/DATA can be any configuration, e.g. 44.1/48/96 kHz Fs with a bit-clock rate of 128×Fs, 256×Fs, and so on. How do I synchronize the slave BCLK/LRCLK/DATA to the ADAU1701's MCLK?

 

If anybody has some comments, many thanks.

 

Meican.


Evaluating No-OS with ADRV9375-N/PCBZ board and KCU105 board


Hi team,

We are evaluating No-OS with the ADRV9375-N/PCBZ board and the KCU105 board.

Now we are trying to push data into/out of the AD9371/AD9375 according to the following page:

 

https://wiki.analog.com/resources/eval/user-guides/mykonos/no-os-setup

 

Regarding "Push data out", we confirmed a CW signal at 1981 MHz

(the Tx frequency setting is 1980 MHz).

Next we are trying to capture data. We use a signal generator connected to the Rx port.

But the captured data is all 0.

We also monitored with an ILA (Integrated Logic Analyzer) in the FPGA, but the monitored data is all 0.

 

Questions

1) Could you tell us how to solve this problem (captured data is all 0)?

2) In particular, could you confirm the headless.c file? Is there anything wrong with our usage?

 

The GPIO/SPI-modified sources are as follows:

myk.c

myk_ad9528init.c

platform_xilinx/platform_drivers.h

 

The software versions are as follows:

NO-OS:2018_R1

API:mykonos_api_source_1.5.2.3566

 

I have attached a file "src.zip" containing the files above.

 

Also, we would like to confirm the version of the no-OS source.

We are currently using the 2018_R1 no-OS source.

There is a difference between the files downloaded in June and August.

Were changes made on June 27?

Is there any possibility that it will be changed in the future?

 

 

Best regards,

Yoshihiro

Operating the AD9375 in DDS mode


We would like to confirm the operation of the AD9375 in DDS mode.

Is it enough to change "DAC_SRC_DMA" to "DAC_SRC_DDS" in the headless.c source to operate the AD9375 in DDS mode?

 

dac_channel ad9371_tx_channels[4] = {
 {
  3*1000*1000, // dds_frequency_tone0
  90000,  // dds_phase_tone0
  50*1000, // dds_scale_tone0 1.0*1000*1000
  0,  // dds_frequency_tone1
  0,  // dds_phase_tone1
  0,  // dds_scale_tone1
  1,  // dds_dual_tone
  0,  // pat_data
  DAC_SRC_DMA // sel    <===   Change to DAC_SRC_DDS
 },

(four places in this table in total)

 

Regards,

Yoshihiro

No output from DAC with AD9959 PCB Eval board


Hi

 

I am a newbie with the AD9959 and have recently purchased the PCBZ evaluation board.

 

I have powered up the board and I am using the AD9959 evaluation software. I can load and read the AD9959 through the tool (USB Status: CYusb-1, DUT Type: AD9959).

 

I am using a VNA as a function generator, providing a 40 MHz input signal to REF CLK. All is good, as I read a 10 MHz output signal from SYNC CLK on the scope.

 

Via the software I load the Single Tone Mode > All Channels @ 10_20_30_40 MHz configuration. The default reference clock is 500 MHz, so I adjust this to match the 40 MHz input signal, and the frequency on Channel 0 adjusts itself to 1.6 MHz. The evaluation software is set to Auto I/O Update. If I read the settings in the Channel Output Config, the frequency is indeed 1.6 MHz.

 

I am trying to view the output of DAC Channel 0 on the scope, but I don't get any signal. I must be missing something non-trivial. Any feedback is welcome.

 

Best regards

Charles 9H1Y


SC573 security problem



I set the public and private keys in OTP memory using the API functions, then locked the device with the 'adi_rom_lock' function. All functions returned 'true'. Then I reset the device, but the device is still OPEN: I can load unsecured programs to it and read OTP. I tried reading the lock bit at OTP memory location 0x48C: it is '1'.

I repeated this procedure on another chip with the same result. The chip revision is 0.0.

The same procedure on a Blackfin BF707 (which has the same security system) locks the part without any problem.

Custom IIO Plugin for AD9364


Hi everyone!

 

I would like to know the first steps in building my own GUI app (a radio configuration wizard for my AD9364 device). I have collected some insight about libiio, but I don't know how to interface libiio with my GUI application, or what I need to implement in order to build a basic radio configuration wizard. Could anyone please guide me in this direction?

 

I have gone through IIO Oscilloscope, but it seems a bit complicated to me; my requirement is a basic radio configuration wizard where I can control and configure my radio device.

 

Please also point out what existing resources I could use to build the application; I would prefer an easy and simple method.

 

 

Any help would be much appreciated.

Thanks & Regards

Issues attempting UART Slave boot in SC573


Hi there,

 

We are trying to load a UART slave boot by sending a .ldr file over a USB-to-UART cable as binary. We successfully receive the 4-byte autobaud response indicating that the chip is ready to receive the boot stream. However, after sending ~4500 bytes, the RTS line goes high and stays high, and we are unable to finish loading it into the chip. We have tried a variety of baud rates, from 1200 up to 115200, always with the same result.

 

We are using CCES to generate the .ldr file. It runs elfloader with the following script:

 

elfload.exe -proc ADSP-SC573 -si-revision none -b uartslave -f binary -width 8 -init "C:\Analog Devices\CrossCore Embedded Studio 2.6.0\SHARC\ldr\ezkitSC573_initcode_core0" -core0="(project directory)/Debug/SC573_UART_BootMode_Application_Core0"

 

Is this an issue with our .ldr file not being in the proper format?

 

Thanks

VisualDSP++ 3.5 elf2aexe crippled executables FIX!


TAPR.org (now defunct) DSPx kit for the ADSP-2185N

 

Imagine you downloaded VisualDSP++ 3.5 and got the trial license installed. You are hoping to get your 10-year-old EZ-KIT Lite or now-defunct TAPR DSPx running again, because you became nostalgic long ago about the way ADI does things in a very unique and logical way. Plus, you don't need the myriad of peripherals on the new kits, which are enough to make you lose your mind. You want something simple: DSP, SPORT, and codec. You want to rehash the experience of what makes simple DSP code run efficiently, and you love how ADI implemented it back in the day (approximately a decade ago) on the ADSP-2185N. Sounds great, doesn't it? With one small problem. That problem is caused by elf2aexe.exe, ADI's ELF loader utility, which will scramble the vector table entry point into a tangled mess. This will not work for you, and you have no idea how to fix it. Well, this article offers a glimmer of hope: dust off your kit, plug it in, and let's go on a little journey of fixing it... I have attached the utility I created to fix this mess, and I'm sharing it with you because it also helps me work this out in a jiffy...

 

 

Inside the VisualDSP++ 3.5 directory there is a utility that supposedly helps users of the ADSP-2185N kits bring their code into the new compiler without issues. Instead, if you search the 2185N-related forums, you will find strange excuses claiming that tech support does not understand the executable format of the file and has no idea why it does not work. I have taken some time to research this and have found the following issues with elf2aexe.

 

 

Firstly, it does not work correctly without refitting your sections to the Expert Linker naming conventions. By adding the section names you see above, everything inside Expert Linker should gel with your code much better. This means that your old code needs to be modified and rewritten from the legacy (-legacy) compiler option to the new format. That should at least make your jump table start to appear coherently inside the simulator. But will the converter utility now work correctly too? Not exactly! Here is why...

 

In the Post Build Step, or manually, you tried something as follows: elf2aexe -x debug\c1sin.dxe debug\c1sin

 

The utility creates an executable as you expect, but also does something unexpected: it scrambles the Expert Linker vector sections inside the executable! This makes the EXE file unusable unless you remove all the unnecessary, irrelevant garbage written into the file (obviously not understood by the original monitor) and reorganize all the vector entries so that they are ordered correctly. Why ADI would do this in a released product is beyond me, and I find it unacceptable. It seems some folks have discovered this very same issue and have gotten some head-scratching answers from tech support. Here is the issue that this utility creates in VisualDSP++ 3.5, in this discussion thread: https://ez.analog.com/message/26332?commentID=26332#comment-26332

 

I will also summarize this briefly below :

 

 

Here you can clearly see that the start vector is not zero and instead starts at 001C, which is incorrect. The only thing left for the hapless user is to unscramble the mess, so I decided to take on that task myself. In fact, the first thing I did was all this vector unscrambling manually, inside the same file created by the loader utility. It took me forever, but I finally got the executable to load correctly. The next week, I embarked on a journey to create a helper utility which fixes all this for me auto-magically... Well, almost. The issue I faced was using the content of the splitter-created .BNM file. My utility bnm2exe only understands the Motorola S-record format, so I wrote a C# parser application which does it all, as long as you point it to the .BNM file created by the splitter. This is easy to do; just go into your settings and choose as follows:

 

Make sure that the type is set to: Splitter file. Next, make sure that in the Load tab you select...

 

I know I said it does it all, but that is partially incorrect. You need to know the exact length of the PROM boot-loader section inside the .BNM file. I tried to estimate this from ADI documentation and got it wrong; then I started the copy-and-paste-into-Notepad++ method, copying and counting bytes, and still got it wrong. My counting abilities must be crippled also, pun intended, so I decided to do it visually, by decrementing the -h[bytes] flag until everything fell into alignment. As you would imagine, this method worked best! I just visually inspect the .BNM file for the startup byte sequence I see in the startup code inside the simulator and use that as my visual cue. Within a few minutes you get the hang of it, adjusting in either direction and navigating through the .BNM file until correct alignment is reached! Consider that you only have to get this alignment correct once for the lifetime of your development on these ancient kits. This part is unfortunately a necessary evil and something you have to get right, at least this first time!

 

Thus, in Post Build I add an option : bnm2exe C:\dspx\code2\c1sin\Debug\c1sin.bnm -h1027

 

 

Finally, I do a clean followed by a build and my parser executable bnm2exe messages me back with an encouraging message...

 

 

Well, there you have it: the replacement loader worked correctly, and the good news is you do not have to engage it from the Build Step either. You can do the very same thing manually. Just drop the executable into your directory of choice and engage the new loader via a terminal, as such:

 

 

If you engage my tool without parameters, by simply typing bnm2exe, you will get an error saying that you must follow the convention clearly stated in the usage text. The tool has a very simple syntax that you must always specify, or you will get nothing out of it. Make sure you read the instructions on the screen and understand them correctly; otherwise, contact me as a last resort...

 

 

As a last step, I load a tool such as ezload.exe and point it to the new exe location...

 

 

 

Here is what I see when ezload is all finished...

 

 

A further scope zoom-in on my messy desk reveals... this method works quite well!

 

 

Furthermore, if you do decide to use my tool, bnm2exe, please keep in mind that it has no mind of its own. It is your duty to drop it inside the ADI folder where VisualDSP++ 3.5 has been installed; in other words, navigate to:

 

C:\Program Files (x86)\Analog Devices\VisualDSP 3.5 16-Bit\

 

and drop it there...

 

 

That's about all there is to it, folks... The rest is history, as they say. I find the ADSP-2185N inspiring and a neat little chip to learn and exercise on. Whole DSP ham radios have been written on it, with complex FFT implementations. It is not the fastest at 80 MIPS, but what a great chip for your grandsons to practice on if you still have the old kits around...

 

Please note that bnm2exe.exe is a .NET C# application that requires the .NET runtime, so you might need to install that first if it is not already on your machine and you see complaints about it. Also, let me know if you run into trouble with it and I would be more than happy to assist, to a point. Please don't ask for the source, because I consider it part of my personal IP; particularly my own Motorola S-record file parser, which I wrote from scratch, and furthermore the communication protocol that makes it all possible. I'd like to retain that IP for future use...

 

Now, I'm going to push my luck over here and say that if you do like the tool, I will be enhancing it this week to include the RS-232 transport binding logic as well, so that the cumbersome ezload.exe invocation is skipped altogether and you never need to leave VisualDSP++ 3.5 to complete the entire process... That would be neat, wouldn't it?!

 

Happy DSP'ing...

Change JESD line rate in Tcl


Hi,

 

I am using only one channel of an AD9234 ADC, which requires a 5 Gbps lane rate. When building my system I got an error in implementation, because my design is based on the DAQ2 ZC706 template.

 

 

I think I need to change a setting in my build Tcl script, but which one?

 

Here is my script at the moment; is something else missing?

 

Is it also correct that I disabled the QPLL, since it was only used for TX in the DAQ2 design?

 

 

source $ad_hdl_dir/library/jesd204/scripts/jesd204.tcl

# adc peripherals

ad_ip_instance axi_adxcvr axi_ad9680_xcvr ;# call axi_adxcvr for setup
ad_ip_parameter axi_ad9680_xcvr CONFIG.NUM_OF_LANES 4 ;# number of rx lanes
ad_ip_parameter axi_ad9680_xcvr CONFIG.QPLL_ENABLE 0 ;# 0 to disable, 1 to enable
ad_ip_parameter axi_ad9680_xcvr CONFIG.TX_OR_RX_N 0 ;# 0 for RX, 1 for TX

adi_axi_jesd204_rx_create axi_ad9680_jesd 4 ;# number of rx lanes

ad_ip_instance axi_ad9680 axi_ad9680_core ;# add ad9680 IP core

ad_ip_instance util_cpack axi_ad9680_cpack ;# add cpack IP
ad_ip_parameter axi_ad9680_cpack CONFIG.CHANNEL_DATA_WIDTH 64 ;# cpack channel data width
ad_ip_parameter axi_ad9680_cpack CONFIG.NUM_OF_CHANNELS 2 ;# number of channels (1 ADC = 2 possible channels)

ad_ip_instance axi_dmac axi_ad9680_dma ;# add axi dma IP
ad_ip_parameter axi_ad9680_dma CONFIG.DMA_TYPE_SRC 1 ;#
ad_ip_parameter axi_ad9680_dma CONFIG.DMA_TYPE_DEST 0 ;#
ad_ip_parameter axi_ad9680_dma CONFIG.ID 0 ;#
ad_ip_parameter axi_ad9680_dma CONFIG.AXI_SLICE_SRC 0 ;#
ad_ip_parameter axi_ad9680_dma CONFIG.AXI_SLICE_DEST 0 ;#
ad_ip_parameter axi_ad9680_dma CONFIG.SYNC_TRANSFER_START 1 ;#
ad_ip_parameter axi_ad9680_dma CONFIG.DMA_LENGTH_WIDTH 24 ;#
ad_ip_parameter axi_ad9680_dma CONFIG.DMA_2D_TRANSFER 0 ;#
ad_ip_parameter axi_ad9680_dma CONFIG.CYCLIC 0 ;#
ad_ip_parameter axi_ad9680_dma CONFIG.DMA_DATA_WIDTH_SRC 64 ;#
ad_ip_parameter axi_ad9680_dma CONFIG.DMA_DATA_WIDTH_DEST 64 ;#

# shared transceiver core

ad_ip_instance util_adxcvr util_daq2_xcvr ;# add util_adxcvr IP core
ad_ip_parameter util_daq2_xcvr CONFIG.RX_NUM_OF_LANES 4 ;# number of RX lanes (4 per ADC)
ad_ip_parameter util_daq2_xcvr CONFIG.TX_NUM_OF_LANES 0 ;# number of TX lanes

 

ad_connect sys_cpu_resetn util_daq2_xcvr/up_rstn
ad_connect sys_cpu_clk util_daq2_xcvr/up_clk

# reference clocks & resets
create_bd_port -dir I rx_ref_clk_0 ;# create clock port
# The QPLL was used for TX only
ad_xcvrpll rx_ref_clk_0 util_daq2_xcvr/cpll_ref_clk_* ;# connect rx_ref_clk_0 >> cpll_ref_clk_*
ad_xcvrpll rx_ref_clk_0 util_daq2_xcvr/qpll_ref_clk_*
ad_xcvrpll axi_ad9680_xcvr/up_pll_rst util_daq2_xcvr/up_cpll_rst_* ;# connect (axi_ad9680_xcvr) up_pll_rst >> up_cpll_rst_* (util_daq2_xcvr)

# connections (adc)

ad_xcvrcon util_daq2_xcvr axi_ad9680_xcvr axi_ad9680_jesd ;# connect ?
ad_connect util_daq2_xcvr/rx_out_clk_0 axi_ad9680_core/rx_clk ;# connect (util_daq2_xcvr) rx_out_clk_0 >> rx_clk (axi_ad9680_core)
ad_connect util_daq2_xcvr/rx_out_clk_0 axi_ad9680_fifo/adc_clk ;# connect (util_daq2_xcvr) rx_out_clk_0 >> adc_clk (axi_ad9680_fifo)
ad_connect util_daq2_xcvr/rx_out_clk_0 axi_ad9680_cpack/adc_clk ;# connect (util_daq2_xcvr) rx_out_clk_0 >> adc_clk (axi_ad9680_cpack)

ad_connect axi_ad9680_jesd/rx_sof axi_ad9680_core/rx_sof
ad_connect axi_ad9680_jesd/rx_data_tdata axi_ad9680_core/rx_data
ad_connect axi_ad9680_jesd_rstgen/peripheral_reset axi_ad9680_cpack/adc_rst
ad_connect axi_ad9680_core/adc_enable_0 axi_ad9680_cpack/adc_enable_0
ad_connect axi_ad9680_core/adc_valid_0 axi_ad9680_cpack/adc_valid_0
ad_connect axi_ad9680_core/adc_data_0 axi_ad9680_cpack/adc_data_0
ad_connect axi_ad9680_core/adc_enable_1 axi_ad9680_cpack/adc_enable_1
ad_connect axi_ad9680_core/adc_valid_1 axi_ad9680_cpack/adc_valid_1
ad_connect axi_ad9680_core/adc_data_1 axi_ad9680_cpack/adc_data_1
ad_connect axi_ad9680_jesd_rstgen/peripheral_reset axi_ad9680_fifo/adc_rst
ad_connect axi_ad9680_cpack/adc_valid axi_ad9680_fifo/adc_wr
ad_connect axi_ad9680_cpack/adc_data axi_ad9680_fifo/adc_wdata
ad_connect sys_cpu_clk axi_ad9680_fifo/dma_clk
ad_connect sys_cpu_clk axi_ad9680_dma/s_axis_aclk
ad_connect sys_cpu_resetn axi_ad9680_dma/m_dest_axi_aresetn
ad_connect axi_ad9680_fifo/dma_wr axi_ad9680_dma/s_axis_valid
ad_connect axi_ad9680_fifo/dma_wdata axi_ad9680_dma/s_axis_data
ad_connect axi_ad9680_fifo/dma_wready axi_ad9680_dma/s_axis_ready
ad_connect axi_ad9680_fifo/dma_xfer_req axi_ad9680_dma/s_axis_xfer_req
ad_connect axi_ad9680_core/adc_dovf axi_ad9680_fifo/adc_wovf

# interconnect (cpu)
ad_cpu_interconnect 0x44A50000 axi_ad9680_xcvr
ad_cpu_interconnect 0x44A10000 axi_ad9680_core
ad_cpu_interconnect 0x44AA0000 axi_ad9680_jesd
ad_cpu_interconnect 0x7c400000 axi_ad9680_dma

# gt uses hp3, and 100MHz clock for both DRP and AXI4
ad_mem_hp3_interconnect sys_cpu_clk sys_ps7/S_AXI_HP3
ad_mem_hp3_interconnect sys_cpu_clk axi_ad9680_xcvr/m_axi

#ad_mem_hp1_interconnect sys_cpu_clk sys_ps7/S_AXI_HP1
ad_mem_hp2_interconnect sys_cpu_clk sys_ps7/S_AXI_HP2
ad_mem_hp2_interconnect sys_cpu_clk axi_ad9680_dma/m_dest_axi

# interrupts
ad_cpu_interrupt ps-11 mb-14 axi_ad9680_jesd/irq
ad_cpu_interrupt ps-13 mb-12 axi_ad9680_dma/irq

 

 

 

Thanks,

Nils

Best Li-Ion Battery and Solar and System Power


I am trying to figure out the best IC for a small IoT device that can:

1. Solar Energy Harvest

2. Charge a single cell Lithium Ion 4.2v

3. Provide a 3.3v regulated power supply

 

My system will be small, <20 mA, and very, very periodic: a sensor plus a BLE beacon.

 

It seems the ADP5091/ADP5092, LTC4121-4.2, or LTC3331 all fit the bill.

 

I was originally planning to use a 1 W, 6 V solar cell, though I could use something smaller.

 

My questions:

1. What is the difference between the ADP5091 and the ADP5092?

2. What is your recommendation?

 

Thanks,
Alan

 

SigmaStudio download link for 32-bit version


Hello,

I have lost my SigmaStudio 3.12 installer.

Please help me find a 32-bit version for Windows 7.

Thanks.


ADG1236 mux CANBUS


Hi,

 

I'm planning to use the ADG1236 to mux CAN bus signals at 1 Mbps. Would it be a good choice?

Thanks!

ADXL372 FIFO Read Problem


Hi,

I am just getting started with the ADXL372 IC. I am currently facing a few issues getting it to work according to my requirements.

I am using an EVAL-ADXL372Z development module with an ATSAMD21G18 microcontroller running an Arduino core to talk to the ADXL372. I started with the ADXL example code mentioned here and was able to get it working.

 

I plan to use the ADXL372 to measure an impact force for my experiments. My requirement is to keep the ADXL372 in a low-power mode until an impact occurs (wake-up or instant-on mode), and then analyze the data of the impact. For this, I need to store the data in the FIFO (360 samples before the impact and 150 samples after the impact).

 

I modified the example code to match the above requirements:

  • Configured the FIFO in Triggered mode.
  • Set the operation mode to Instant-On mode.
  • Configured the INT1 pin to trigger HIGH when the FIFO is full (360 samples).
  • Read the FIFO data and display it on the serial port.

 

I have attached the code which I am using.

What seems to be happening is that ADXL_INT1_PIN is always triggered, and not specifically at an impact exceeding the threshold (as it should be in instant-on mode).

 

Please note that in the code I changed the INT1, INT2, and CS pins to suit my development board.

 

Can someone help me with this problem? I may be making a mistake in my code.

 

jwang

LTC4121EUD charge when Vin


Only at an input voltage >12 V does PWM occur and the charging current appear (0.14 A input current). Otherwise, there are no signals. The charge status and fault pins read the same at <12 V as at >12 V input: nCHG = 0, nFLT = 1.

I need Vfloat = 4.0-4.2 V; the voltage of the battery I am trying to charge is V = 3.5 V. Vin = 6-8.4 V (two LiPo cells), Vout = 4 V (single-cell LiFePO4).

Maybe only the LTC4121-4.2 can work with Vin < 12 V? But I need an adjustable Vfloat.

What should I do to start PWM at low input voltage with the LTC4121EUD?

PS: When Vin < 12 V, the average Vfloat = 2.74 V, with a sawtooth-like voltage ripple with a period of about 280 ms and Vp-p = 2.7-2.8 V.

LTPowerCAD


LTPowerCad 2 - capability request

Maybe I've missed it, but it would be helpful to be able to add diode controllers, diode OR-ing controllers, or even just diodes. I am working on a complex multi-stage power-supply design that involves diode OR-ing in the early stages, and I would like to at least represent the OR-ing function graphically, but preferably have the real diode (or diode controller) functions represented for analysis. I've just started playing with LTPowerCAD II and had hoped I could do all of the high-level design, and even much of the detailed design, in this tool. It looks great and seems close to being able to do what I was hoping for.

EZBoard BF518 PG1 as an input


Hi all,

 

I am trying to use PG1 as an input on a BF518 EZ-Board with an Ethernet link.

 

The trouble is that when I activate the input driver with *pPORTGIO_INEN |= 0x0002, the Ethernet link stops receiving messages.

 

The attachments show the effect of setting and clearing INEN bit 1 on the EMAC_RX_STAT register.

 

PORTG_FER is not set for PG1, so I don't understand why the EMAC is affected.

 

Could you tell me where I am going wrong?

 

Thank you
