Swift: Sound-Output & Microphone-Input | using AudioKit |

Asked by Jonas0000 on 1/7/2018 · Last edited by Jonas0000 · Updated 1/7/2018 · Viewed 2,348 times

Q:


I am using >Xcode version 9.2<
I am using >AudioKit version 4.0.4<


I wrote some code, which you can find below, that should be able to

  • play a specific sound (frequency: 500.0 Hz)
  • "listen" to the microphone input and calculate its frequency in real time

Everything looks fine and really works as I expect if I call playSound() or receiveSound() separately. But calling playSound() and then receiveSound() afterwards? That is where I run into big problems.

This is how I want the code to work:

SystemClass.playSound() //play sound
DispatchQueue.main.asyncAfter(deadline: (DispatchTime.now() + 3.0)) {
   SystemClass.receiveSound() //get microphone input 3 seconds later
}

let SystemClass: System = System()
class System {
    public init() { }

    func playSound() {
        let sound = AKOscillator()
        AudioKit.output = sound
        AudioKit.start()
        sound.frequency = 500.0
        sound.amplitude = 0.5
        sound.start()
        DispatchQueue.main.asyncAfter(deadline: (DispatchTime.now() + 2.0)) {
            sound.stop()
        }
    }


    var tracker: AKFrequencyTracker!
    func receiveSound() {
        AudioKit.stop()
        AKSettings.audioInputEnabled = true
        let mic = AKMicrophone()
        tracker = AKFrequencyTracker(mic)
        let silence = AKBooster(tracker, gain: 0)
        AudioKit.output = silence
        AudioKit.start()
        Timer.scheduledTimer( timeInterval: 0.1, target: self, selector: #selector(SystemClass.outputFrequency), userInfo: nil, repeats: true)
    }

    @objc func outputFrequency() {
        print("Frequency: \(tracker.frequency)")
    }
}

These are some of the error messages I get every time I run the code (calling playSound() and then receiveSound() 3 seconds later):

AVAEInternal.h:103:_AVAE_CheckNoErr: [AVAudioEngineGraph.mm:1266:Initialize: (err = AUGraphParser::InitializeActiveNodesInOutputChain(ThisGraph, kOutputChainOptimizedTraversal, *GetOutputNode(), isOutputChainActive)): error -10875

AVAudioEngine.mm:149:-[AVAudioEngine prepare]: Engine@0x1c401bff0: could not initialize, error = -10875

[MediaRemote] [AVOutputContext] WARNING: AVF context unavailable for sharedSystemAudioContext

[AVAudioEngineGraph.mm:1266:Initialize: (err = AUGraphParser::InitializeActiveNodesInOutputChain(ThisGraph, kOutputChainOptimizedTraversal, *GetOutputNode(), isOutputChainActive)): error -10875

Fatal error: AudioKit: Could not start engine. error: Error 

Domain=com.apple.coreaudio.avfaudio Code=-10875 "(null)" UserInfo={failed call=err = AUGraphParser::InitializeActiveNodesInOutputChain(ThisGraph, kOutputChainOptimizedTraversal, *GetOutputNode(), isOutputChainActive)}.: file /Users/megastep/src/ak/AudioKit/AudioKit/Common/Internals/AudioKit.swift, line 243
iOS Swift Xcode Audio AudioKit

Comments


A:

3 upvotes · Aurelius Prochazka · 1/7/2018 · #1

I believe the biggest part of your problem comes from declaring the AKNodes locally inside the functions that use them:

   let sound = AKOscillator()
   let mic = AKMicrophone()        
   let silence = AKBooster(tracker, gain: 0)

Instead, declare these as instance variables, as described here.
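
For illustration only, here is a minimal sketch of that idea. It assumes the AudioKit 4.0.x API used in the question, where AudioKit.start() does not throw (newer AudioKit releases require try AudioKit.start()). The nodes are created once as instance properties and routed through a single AKMixer so the engine graph is built only once, and the class inherits from NSObject so the Timer selector can reach outputFrequency():

    import Foundation
    import AudioKit

    class System: NSObject {
        // Keep the nodes alive as instance properties so they are not
        // deallocated when playSound()/receiveSound() return.
        let sound = AKOscillator()
        var mic: AKMicrophone!
        var tracker: AKFrequencyTracker!
        var silence: AKBooster!

        override init() {
            super.init()
            AKSettings.audioInputEnabled = true
            mic = AKMicrophone()
            tracker = AKFrequencyTracker(mic)
            silence = AKBooster(tracker, gain: 0)
            // Route the oscillator and the muted tracker chain into one
            // mixer so the graph is never stopped and rebuilt between calls.
            AudioKit.output = AKMixer(sound, silence)
            AudioKit.start()
        }

        func playSound() {
            sound.frequency = 500.0
            sound.amplitude = 0.5
            sound.start()
            DispatchQueue.main.asyncAfter(deadline: .now() + 2.0) {
                self.sound.stop()
            }
        }

        func receiveSound() {
            Timer.scheduledTimer(timeInterval: 0.1, target: self,
                                 selector: #selector(outputFrequency),
                                 userInfo: nil, repeats: true)
        }

        @objc func outputFrequency() {
            print("Frequency: \(tracker.frequency)")
        }
    }

With this setup, the call sequence from the question (playSound(), then receiveSound() three seconds later) runs against a single engine that is started once, rather than being stopped and reconfigured between the two calls.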

Comments

0 upvotes · tsuyoski · 6/20/2019
Were you able to accomplish what you described in your question (play a sound and then receive sound)? I want to do the same thing, but it just crashes.