Converting Between AMR and WAV for iOS Audio Recording (64-bit Support)

A project at work involved voice recording, and we ran into a few small problems with the recording format that I'd like to share here.
As is well known, iOS audio recording does not support the AMR format, while Android seems to record AMR by default. With the two platforms using different formats, one side has to give way. This article briefly covers how to transcode on the iOS side.

1. Audio Recording Basics

In the AVFoundation framework, the AVAudioRecorder class handles recording and supports a variety of audio formats; here we record with AVAudioRecorder and play back with AVAudioPlayer. When creating a recorder you must supply not only a file path but also a settings dictionary, because the recorder needs to know the file format, sample rate, channel count, bits per sample, and so on. Not every key is required; a handful of common settings is usually enough. For the full list, see "AV Foundation Audio Settings Constants" in the documentation.

2. Importing the Required Library

After a lot of searching, the library that works for transcoding is a static build of opencore-amr; a prebuilt copy is linked here for download.
Once downloaded, just drag it into your project and you're done.


Note: new projects may fail to build with this library!
Hitting build errors is normal; read the error message carefully. The one error I ran into was a Bitcode conflict:

Select your target, open Build Settings, search for "Bitcode", and set Enable Bitcode to No.
You also need to import two system frameworks:
#import <AVFoundation/AVFoundation.h>
#import <AudioToolbox/AudioToolbox.h>
and the converter's header:
#import "VoiceConverter.h"

3. Recording Audio

  • Start recording
    // Build a file name from the current time
    self.recordFileName = [ViewController GetCurrentTimeString];

    // Build the output path in Documents from that name
    // (the original post built the path from a separate millisecond timestamp,
    // which did not match the file name used for conversion later)
    self.recordFilePath = [ViewController GetPathByFileName:self.recordFileName ofType:@"wav"];

    // Recording settings: 8 kHz, 16-bit, mono linear PCM
    NSDictionary *recordSetting = [[NSDictionary alloc] initWithObjectsAndKeys:
                                   [NSNumber numberWithFloat:8000.0], AVSampleRateKey,   // sample rate
                                   [NSNumber numberWithInt:kAudioFormatLinearPCM], AVFormatIDKey,
                                   [NSNumber numberWithInt:16], AVLinearPCMBitDepthKey,  // bits per sample (default 16)
                                   [NSNumber numberWithInt:1], AVNumberOfChannelsKey,    // channel count
                                   //[NSNumber numberWithBool:NO], AVLinearPCMIsFloatKey,  // integer vs. floating-point samples
                                   //[NSNumber numberWithInt:AVAudioQualityMedium], AVEncoderAudioQualityKey,  // encoder quality
                                   nil];

    // Create the recorder
    self.recorder = [[AVAudioRecorder alloc] initWithURL:[NSURL fileURLWithPath:self.recordFilePath]
                                                settings:recordSetting
                                                   error:nil];
    // Prepare and start
    if ([self.recorder prepareToRecord]) {
        [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:nil];
        [[AVAudioSession sharedInstance] setActive:YES error:nil];

        if ([self.recorder record]) {
            // update the UI
        }
    } else {
        NSLog(@"Bad audio settings, recorder---%@", self.recorder);
    }
  • Stop recording
    if (self.recorder.isRecording) {
        [self.recorder stop];
    }
  • Generate the current-time string
+(NSString *)GetCurrentTimeString{
    NSDateFormatter *dateformat = [[NSDateFormatter alloc] init];
    [dateformat setDateFormat:@"yyyyMMddHHmmss"];
    return [dateformat stringFromDate:[NSDate date]];
}
  • Build a file path in Documents
+(NSString *)GetPathByFileName:(NSString *)_fileName ofType:(NSString *)_type{
    NSString *directory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
    // Note: do not percent-escape a filesystem path. The original called the
    // deprecated stringByAddingPercentEscapesUsingEncoding: here, which
    // produces a path the file system cannot open.
    return [[directory stringByAppendingPathComponent:_fileName]
            stringByAppendingPathExtension:_type];
}

4. Format Conversion

  • WAV to AMR
    NSString *amrPath = [ViewController GetPathByFileName:self.recordFileName ofType:@"amr"];

    // Note: the original post mistakenly called ConvertAmrToWav here;
    // converting WAV to AMR uses ConvertWavToAmr.
    if ([VoiceConverter ConvertWavToAmr:self.recordFilePath amrSavePath:amrPath]) {
        // update the UI / play the audio
    } else {
        NSLog(@"WAV-to-AMR conversion failed");
    }
  • AMR to WAV
    NSString *convertedPath = [ViewController GetPathByFileName:[self.recordFileName stringByAppendingString:@"_AmrToWav"] ofType:@"wav"];
    if ([VoiceConverter ConvertAmrToWav:amrPath wavSavePath:convertedPath]) {
        // update the UI / play the audio
    } else {
        NSLog(@"AMR-to-WAV conversion failed");
    }

5. Summary

That's all I have time for now; I'll fill in the upload part later. (To be continued...)
Lately I've been tinkering with audio again and hit a few problems; I'm writing them down here for future reference.

The topics covered:

1. Recording on iOS.

2. Converting the recording to AMR (I won't rehash why AMR is used).

3. Playing audio while the device is in silent mode.

4. Overall volume dropping after playback.

5. Switching between earpiece and speaker playback.

1. Recording on iOS

AVAudioRecorder is all you need for recording.

            NSString *pcm_path = @"path";   // your output path here
            _recordedTmpFile = [[NSURL alloc] initFileURLWithPath:pcm_path];
            // Linear PCM is iOS's lossless encoding; the files are fairly large
            // Recording settings
            NSMutableDictionary *recordSettings = [[NSMutableDictionary alloc] init];
            // Recording format (AMR cannot be recorded directly, so record PCM)
            [recordSettings setValue:[NSNumber numberWithInt:kAudioFormatLinearPCM] forKey:AVFormatIDKey];
            // Sample rate
            [recordSettings setValue:[NSNumber numberWithFloat:8000.0] forKey:AVSampleRateKey]; // or 44100.0
            // Channel count
            [recordSettings setValue:[NSNumber numberWithInt:1] forKey:AVNumberOfChannelsKey];
            // Linear PCM bit depth
            //[recordSettings setValue:[NSNumber numberWithInt:16] forKey:AVLinearPCMBitDepthKey];
            // Encoder / sampling quality
            [recordSettings setValue:[NSNumber numberWithInt:AVAudioQualityMin] forKey:AVEncoderAudioQualityKey];

            // Set up the recorder to write to this file
            _recorder = [[AVAudioRecorder alloc] initWithURL:_recordedTmpFile settings:recordSettings error:nil];
            _recorder.delegate = self;
            _recorderTime = 0.0f;
        [_recorder prepareToRecord];
        [_recorder record];

2. Format conversion

This mainly uses the libopencore-amr library. Download it and compile it yourself; build scripts are easy to find online. A prebuilt copy is also provided for the lazy (extraction password: 89py). Once the library is built, just import it and use it normally.

One problem I hit in practice: Android had recorded a file that was not actually AMR, which made the conversion fail. When reading a file for conversion, check whether its header starts with "#!AMR\n".

Usage: simply call EncodeWAVEFileToAMRFile and DecodeAMRFileToWAVEFile, declared in amrFileCodec.h, to transcode in either direction.

3. Playback while the device is muted

    // Does not interrupt other apps' audio; silenced by the mute switch; stops on screen lock
    NSString *const AVAudioSessionCategoryAmbient = @"AVAudioSessionCategoryAmbient";
    // Interrupts other apps' audio; silenced by the mute switch
    NSString *const AVAudioSessionCategorySoloAmbient = @"AVAudioSessionCategorySoloAmbient";
    // Interrupts other apps' audio; keeps playing even when the device is muted
    NSString *const AVAudioSessionCategoryPlayback = @"AVAudioSessionCategoryPlayback";
    // Recording only
    NSString *const AVAudioSessionCategoryRecord = @"AVAudioSessionCategoryRecord";
    // Simultaneous recording and playback
    NSString *const AVAudioSessionCategoryPlayAndRecord = @"AVAudioSessionCategoryPlayAndRecord";
    // Audio processing only; no recording or playback
    NSString *const AVAudioSessionCategoryAudioProcessing = @"AVAudioSessionCategoryAudioProcessing";

        AVAudioSession *audioSession = [AVAudioSession sharedInstance];
        [audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:nil];

Pick whichever category fits your situation.

4. Volume

One extra setting is needed here, or playback volume may drop (with PlayAndRecord, output gets routed to the quiet earpiece instead of the speaker):

        AVAudioSession * audioSession = [AVAudioSession sharedInstance];
        [audioSession overrideOutputAudioPort:AVAudioSessionPortOverrideSpeaker error:nil];

5. Switching between earpiece and speaker

Monitor the proximity sensor next to the earpiece; when it is covered, the user is holding the phone to their ear as in a phone call.

    [[UIDevice currentDevice] setProximityMonitoringEnabled:YES];
    // Note: the original passed error:nil here; addObserver takes object:
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(senderStateChange:)
                                                 name:UIDeviceProximityStateDidChangeNotification
                                               object:nil];


    // The parameter is an NSNotification (the original declared NSNotificationCenter*)
    - (void)senderStateChange:(NSNotification *)not
    {
        if ([[UIDevice currentDevice] proximityState] == YES)
        {
            NSLog(@"Device is close to user");
            // Route playback to the earpiece
            [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:nil];
        }
        else
        {
            NSLog(@"Device is not close to user");
            // Route playback back to the speaker
            [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:nil];
        }
    }


Postscript:

The download contains .cpp files; since the code is plain C, you can rename them to .m, or rename any file that imports them to .mm.

Finally, upload the audio to your backend, then download it again for playback. To avoid sounding like an ad I won't name names, but there are third-party storage platforms you can upload to and download from, which greatly reduces the work on your own backend.
--------------------

Combining the two examples above, here is my own working code (no detailed walkthrough; the .h needs the frameworks imported and must adopt the AVAudioRecorderDelegate protocol).

#pragma mark -- test area --
-(void)testClick{
    NSLog(@"Button tapped, recording started");

    [_recorder record];
    [self performSelector:@selector(stopRecorder) withObject:nil afterDelay:5];
}
-(void)stopRecorder{
    [_recorder stop];
    //    NSString* path = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES).lastObject stringByAppendingPathComponent:@"tmp/aaa"];
    //    NSURL* url = [NSURL fileURLWithPath:path];
    //    NSData *data = [NSData dataWithContentsOfURL:url];
    //    [data writeToFile:[NSTemporaryDirectory() stringByAppendingPathComponent:@"tmp/aaa"] atomically:YES];
    NSLog(@"voicerecorder stop");
}
-(void)configRecorder{
    // Configure the audio session
    NSError *sessionError;
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryRecord error:&sessionError];
    if (sessionError){
        NSLog(@"Error creating session: %@",[sessionError description]);
    }else{
        [[AVAudioSession sharedInstance] setActive:YES error:&sessionError];
    }

    NSMutableDictionary *recordSetting = [[NSMutableDictionary alloc] init];
    [recordSetting setObject:[NSNumber numberWithInt:kAudioFormatLinearPCM] forKey:AVFormatIDKey];
    [recordSetting setObject:[NSNumber numberWithFloat:8000] forKey:AVSampleRateKey];
    [recordSetting setObject:[NSNumber numberWithInt:1] forKey:AVNumberOfChannelsKey];
//    [recordSetting setObject:[NSNumber numberWithInt:16] forKey:AVLinearPCMBitDepthKey];
    // Whether to use floating-point samples
    [recordSetting setObject:@(YES) forKey:AVLinearPCMIsFloatKey];
    [recordSetting setObject:[NSNumber numberWithInt:AVAudioQualityMin] forKey:AVEncoderAudioQualityKey];
    NSString *path = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES).lastObject stringByAppendingPathComponent:@"aaa.wav"];
    NSURL *url = [NSURL fileURLWithPath:path];
    _recorder = [[AVAudioRecorder alloc] initWithURL:url settings:recordSetting error:nil];
    _recorder.meteringEnabled = YES;
    _recorder.delegate = self;

    if ([_recorder prepareToRecord]) {
        [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:nil];
        [[AVAudioSession sharedInstance] setActive:YES error:nil];
    }
}
-(void)ChangeWithInfo:(NSDictionary *)dic{
    NSString *amrPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES).lastObject stringByAppendingPathComponent:@"bbb.amr"];
    NSString *wavPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES).lastObject stringByAppendingPathComponent:@"aaa.wav"];
    NSString *newWavPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES).lastObject stringByAppendingPathComponent:@"ccc.wav"];
    if ([VoiceConvert ConvertWavToAmr:wavPath amrSavePath:amrPath]) {
        // update the UI / play the audio, then upload
        NSLog(@"WAV-to-AMR succeeded");
        [self startUploadWithToken:dic localName:@"localTempAudio" cloudName:@"ios_audio" fileURL:amrPath];

        if ([VoiceConvert ConvertAmrToWav:amrPath wavSavePath:newWavPath]) {
            // update the UI / play the audio, then upload
            NSLog(@"AMR back to WAV succeeded");
            [self startUploadWithToken:dic localName:@"localTempAudio" cloudName:@"ios_audio" fileURL:amrPath];
        }else{
            NSLog(@"AMR back to WAV failed");
        }
    }else{
        NSLog(@"WAV-to-AMR failed");
    }
}
-(void)audioRecorderDidFinishRecording:(AVAudioRecorder *)recorder successfully:(BOOL)flag{
    NSLog(@"Recording finished");

    if (flag) {
        NSLog(@"Recording finished successfully; file written, ready to upload");

        NSString *path = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES).lastObject stringByAppendingPathComponent:@"aaa.wav"];

        NSDictionary *params = @{@"type":@"1",@"videoType":@"wav",@"anchorid":@"90602998"};

        [[AFRequestManager shareManager] postAudioPath:path params:params AndFlag0Php1WWW2java:1 WithDataName:@"audio" FileName:@"ios_audio" successPost:^(id  _Nonnull successPost) {
            NSLog(@"audio success:%@",successPost);

            // First check the status code
            if ([successPost isKindOfClass:[NSDictionary class]]&&successPost[@"data"]&&[[NSString stringWithFormat:@"%@",successPost[@"status"]] isEqualToString:@"100000"]) {
                NSDictionary *info = [NSDictionary dictionaryWithDictionary:successPost[@"data"]];
                // Then check the required parameters
                if (info[@"id"]&&info[@"uploadToken"]) {
                    [self ChangeWithInfo:info];
                }
            }
        } errorResponse:^(NSError * _Nonnull error) {
            NSLog(@"audio error:%@",error);
        }];
    }else{
        NSLog(@"Recording did not finish successfully; nothing to upload");
    }
}
- (void)startUploadWithToken:(NSDictionary *)tokenDict localName:(NSString *)localName cloudName:(NSString *)cloudName fileURL:(NSString *)path{
    
    NSString * token = @"";
    if (tokenDict[@"uploadToken"] != nil && ![tokenDict[@"uploadToken"] isKindOfClass:[NSNull class]]) {
        token = tokenDict[@"uploadToken"];
    }
    
    NSNumber * ID;
    if (tokenDict[@"id"] != nil && ![tokenDict[@"id"] isKindOfClass:[NSNull class]]) {
        ID = tokenDict[@"id"];
    }
    
    WCSUploadObjectRequest *uploadRequest = [[WCSUploadObjectRequest alloc] init];
    uploadRequest.token = token;
    uploadRequest.key = localName;
    uploadRequest.fileName = cloudName;
    uploadRequest.fileURL = [NSURL URLWithString:path];
    
    [uploadRequest setUploadProgress:^(int64_t bytesSent, int64_t totalBytesSent, int64_t totalBytesExpectedToSend) {
        CGFloat precent = (CGFloat)totalBytesSent/(CGFloat)totalBytesExpectedToSend;
        NSLog(@"progress - %@,%@,%.2f", @(totalBytesSent), @(totalBytesExpectedToSend),precent);
        dispatch_async(dispatch_get_main_queue(), ^{
            
            NSDictionary * dict = @{@"progress":[NSNumber numberWithFloat:precent]};
//            [[NSNotificationCenter defaultCenter] postNotificationName:kShareVideoUpdateProgressNotification object:nil userInfo:dict];
        });
    }];
    
    [[self.wcsClient uploadRequest:uploadRequest] continueWithBlock:^id _Nullable(WCSTask<WCSUploadObjectResult *> * _Nonnull task) {
        
        if (task.error) {
            NSLog(@"===ERROR===\r\n%@", task.error);
            dispatch_async(dispatch_get_main_queue(), ^{
                [RemindView showHUDWithText:@"Upload failed" onView:kYBKeyWindow];
            });
        } else {
            
            /*
             taskResult - {
             bucket = "11-vod";
             fsize = 1013859;
             hash = "FthrKaoXR-EWP5daJkF4fPv-KNWr";
             key = "2690104-20171117171405.mp4";
             mimeType = "video/mp4";
             url = "http://wsrecord.xxxx.com/2690104-20171117171405.mp4";
             }
             */
            NSLog(@"taskResult - %@", task.result.results);
            NSDictionary * dictionary;
            @try {
                dictionary = @{
                               @"results" : task.result.results,
                               @"ID" : ID
                               };
            } @catch (NSException *exception) {
#ifdef DEBUG
                NSLog(@"exception - %@",exception.description);
                NSAssert(!exception, exception.description);
#endif
                dictionary = [NSDictionary dictionary];
            } @finally {
                ///Request the H5 URL from the server.
                ///The AFN request below returns to the main thread and posts a notification.
                [[AFRequestManager shareManager] requestUploadResultWithDict:dictionary CompletionHandler:^(NSString *url) {
//                    [[NSNotificationCenter defaultCenter] postNotificationName:kShareVideoSetupShareStateNotification object:nil userInfo:@{@"url":url}];
                }];
            }
        }
        return nil;
    }];
}
- (WCSClient *)wcsClient {
    if (!_wcsClient) {
        _wcsClient = [[WCSClient alloc]initWithBaseURL:[NSURL URLWithString:@"http://jjhgame.up3.v1.wcsapi.com"] andTimeout:10];
    }
    return _wcsClient;
}




Reposted from blog.csdn.net/a787188834/article/details/83412556