
Hadoop 2.5.2 Study Notes 14


1. Algorithm steps

1. Deduplicate the raw data.
2. Build the preference matrix of every user for every item.
3. Build the co-occurrence matrix of all item pairs.
4. Multiply the two matrices to get a three-dimensional matrix.
5. Sum the three-dimensional matrix to get every user's recommendation score for every item (a two-dimensional matrix).
6. Sort by recommendation score in descending order.
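The six steps above can be sketched in plain Java before diving into the MapReduce jobs. This is an in-memory illustration with hypothetical class and method names (`RecommendSketch`, `preferences`, `cooccurrence`, `recommend`), not the Hadoop implementation itself:

```java
import java.util.*;

// In-memory sketch of steps 2-6: preference matrix, co-occurrence matrix,
// matrix multiplication, and sorting by recommendation score.
public class RecommendSketch {

    // Step 2: user -> (item -> preference score), summing repeated actions
    static Map<String, Map<String, Integer>> preferences(List<String[]> actions,
                                                         Map<String, Integer> weights) {
        Map<String, Map<String, Integer>> pref = new HashMap<>();
        for (String[] a : actions) { // a = {user, item, action}
            pref.computeIfAbsent(a[0], k -> new HashMap<>())
                .merge(a[1], weights.get(a[2]), Integer::sum);
        }
        return pref;
    }

    // Step 3: count how often each pair of items occurs for the same user
    static Map<String, Map<String, Integer>> cooccurrence(
            Map<String, Map<String, Integer>> pref) {
        Map<String, Map<String, Integer>> co = new HashMap<>();
        for (Map<String, Integer> items : pref.values()) {
            for (String i : items.keySet())
                for (String j : items.keySet())
                    co.computeIfAbsent(i, k -> new HashMap<>()).merge(j, 1, Integer::sum);
        }
        return co;
    }

    // Steps 4-6: multiply preference row by co-occurrence matrix, sum the
    // products per item, and sort descending by the resulting score
    static List<Map.Entry<String, Integer>> recommend(String user,
            Map<String, Map<String, Integer>> pref,
            Map<String, Map<String, Integer>> co) {
        Map<String, Integer> scores = new HashMap<>();
        for (Map.Entry<String, Integer> e : pref.get(user).entrySet()) {
            for (Map.Entry<String, Integer> c : co.getOrDefault(e.getKey(), Map.of()).entrySet()) {
                scores.merge(c.getKey(), e.getValue() * c.getValue(), Integer::sum);
            }
        }
        List<Map.Entry<String, Integer>> out = new ArrayList<>(scores.entrySet());
        out.sort((a, b) -> b.getValue() - a.getValue());
        return out;
    }
}
```

The MapReduce jobs below distribute exactly this computation: each step becomes one job whose shuffle phase does the grouping that the in-memory maps do here.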

2. Implementation

2.1 The first MapReduce: deduplication


The raw data may contain duplicate or corrupt records, so the first MapReduce deduplicates it.
The idea is simple: use each whole line as the map output key.
The shuffle phase groups records by key, so by default identical lines land in one group with multiple values.
The reducer then emits only the key, which guarantees uniqueness.

package com.chb.catTest;

import java.io.IOException;
import java.util.Map;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

/**
 * The first MapReduce: deduplication.
 * Each whole line is used as the map output key; the shuffle phase groups
 * identical keys into one group, and the reducer emits only the key,
 * so duplicates are removed.
 */
public class Step1 {

    public static boolean run(Configuration conf, Map<String, String> paths) throws Exception {
        FileSystem fs = FileSystem.get(conf);
        Job job = Job.getInstance(conf);
        job.setJarByClass(Step1.class);
        job.setJobName("Step1");
        job.setMapperClass(Step1Mapper.class);
        job.setReducerClass(Step1Reducer.class);
        job.setMapOutputKeyClass(Text.class);
        job.setMapOutputValueClass(NullWritable.class);
        FileInputFormat.addInputPath(job, new Path(paths.get("step1Input")));
        Path out = new Path(paths.get("step1Output"));
        if (fs.exists(out)) {
            fs.delete(out, true);
        }
        FileOutputFormat.setOutputPath(job, out);
        return job.waitForCompletion(true);
    }

    static class Step1Mapper extends Mapper<LongWritable, Text, Text, NullWritable> {
        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            if (key.get() != 0) { // skip the header line at byte offset 0
                context.write(value, NullWritable.get());
            }
        }
    }

    static class Step1Reducer extends Reducer<Text, NullWritable, Text, NullWritable> {
        @Override
        protected void reduce(Text key, Iterable<NullWritable> values, Context context)
                throws IOException, InterruptedException {
            // Emitting only the key removes duplicates
            context.write(key, NullWritable.get());
        }
    }
}
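The dedup trick in the job above has a simple in-memory analogue. Treating each whole line as the key means the shuffle's grouping collapses duplicates; locally, a `LinkedHashSet` plays the same role (the `DedupSketch` name is illustrative, not part of the original code):

```java
import java.util.*;

// In-memory analogue of Step1: using each whole line as a key means
// duplicates collapse during grouping, exactly like the shuffle phase.
public class DedupSketch {
    static List<String> dedup(List<String> lines) {
        // LinkedHashSet merges identical keys while keeping first-seen order
        return new ArrayList<>(new LinkedHashSet<>(lines));
    }
}
```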

2.2 The second MapReduce: compute every user's preference score for every item

This job groups the deduplicated output of the first MapReduce by user;
from each line we only need to extract the user, the item, and the action.
To keep things simple, each action type is assigned a fixed preference score:

    /**
     * Preference score assigned to each kind of user action on an item.
     */
    public static Map<String, Integer> R = new HashMap<String, Integer>();
    static {
        R.put("click", 1);   // click
        R.put("collect", 2); // add to favorites
        R.put("cart", 3);    // add to cart
        R.put("alipay", 4);  // pay
    }

The second MapReduce's Mapper processes each line and emits
user_id	item_id1:score

    Integer rv = RunJob.R.get(action);
    // Emit: user_id -> item_id1:score
    context.write(new Text(user_id), new Text(item_id + ":" + rv.intValue()));

In the reducer,

because the key is the user id and the shuffle groups records by key,
all of one user's records arrive together, so we only need to sum the scores the same user gave to the same item:

    // All records of one user arrive in one group; first merge the
    // preference scores for the same item.
    Map<String, Integer> maps = new HashMap<String, Integer>();
    for (Text text : values) {
        String[] vs = text.toString().split(":");
        String item_id = vs[0];
        int action = Integer.parseInt(vs[1]);
        if (maps.get(item_id) == null) {
            maps.put(item_id, action);
        } else {
            int av = maps.get(item_id);
            av += action;
            maps.put(item_id, av);
        }
    }

Then emit the user's scores for all of their items,

separating the different items' scores with commas:

    // Emit each user's preference score per item to the value, appending
    // item_id:score pairs separated by commas.
    StringBuffer sb = new StringBuffer();
    Set<String> keySet = maps.keySet();
    for (Iterator<String> iterator = keySet.iterator(); iterator.hasNext();) {
        String item_id = iterator.next();
        int action = maps.get(item_id);
        sb.append(item_id + ":" + action + ",");
    }
    context.write(key, new Text(sb.toString()));

The output looks like this:

u2722   i105:1,i79:2,i471:1,i489:1,i203:2,i485:1,i258:1,i63:2,i254:4,i444:1,i159:1,i283:3,i161:11,i175:1,i149:2,i335:3,i528:1,i36:2,i230:1,
u2723   i429:1,i87:2,i561:1,i427:2,i428:2,i364:3,i362:1,i83:1,i84:1,i562:1,i86:1,i130:1,i234:1,i236:2,i237:5,i239:1,i137:1,i421:1,i139:1,i134:1,i230:1,i490:1,i359:1,i79:1,i358:1,i550:3,i419:1,i77:1,i354:1,i9:3,i492:1,i71:1,i557:1,i74:1,i75:2,i72:1,i142:4,i143:1,i246:1,i8:1,i149:1,i241:1,i2:2,i145:1,i1:2,i119:1,i388:1,i407:1,i309:1,i383:1,i488:1,i29:1,i384:1,i27:1,i258:1,i25:1,i26:1,i254:1,i304:1,i303:1,i111:1,i250:1,i116:2,i118:1,i198:2,i376:1,i471:1,i194:1,i379:1,i476:1,i90:1,i371:1,i14:1,i266:5,i124:1,i263:1,i122:1,i262:1,i126:1,i524:1,i328:1,i273:1,i469:1,i178:1,i40:1,i274:1,i179:1,i46:2,i173:3,i327:2,i279:1,i175:2,i323:1,i172:2,i171:1,i321:1,i501:1,i514:1,i502:2,i106:1,i394:1,i101:1,i392:4,i102:1,i202:1,i203:2,i397:1,i282:1,i208:1,i281:5,i455:4,i458:2,i31:1,i283:1,i313:1,i33:1,i519:2,i185:1,i515:1,i109:3,i310:1,i517:1,i39:4,i506:1,i213:1,i295:1,i155:1,i157:1,i446:30,i60:1,i290:1,i344:1,i548:2,i342:2,i343:1,i348:1,i299:1,i152:1,i298:1,i534:1,i535:2,i225:1,i228:2,i227:1,i169:1,i435:2,i167:1,i53:13,i166:1,i433:1,i538:1,i160:1,i334:1,i335:2,i57:1,i163:1,
u2724   i429:2,i428:2,i366:1,i362:1,i83:1,i84:1,i237:1,i420:1,i425:1,i79:1,i77:1,i550:1,i419:2,i495:1,i9:2,i70:1,i496:1,i551:1,i142:1,i143:1,i144:1,i1:1,i240:1,i386:1,i409:1,i484:1,i406:2,i309:1,i381:1,i383:3,i488:1,i487:2,i27:1,i258:3,i28:1,i26:1,i254:1,i304:1,i257:1,i117:1,i472:1,i375:1,i194:1,i19:1,i474:1,i120:1,i121:1,i266:1,i125:1,i524:1,i523:1,i464:1,i41:1,i40:1,i46:2,i173:3,i44:1,i324:2,i43:1,i528:1,i461:1,i172:1,i320:1,i105:1,i190:1,i107:1,i394:2,i391:4,i392:6,i203:1,i206:1,i208:1,i281:2,i455:2,i285:1,i459:1,i314:1,i316:1,i183:1,i504:1,i517:1,i547:2,i214:11,i212:1,i448:1,i297:2,i445:1,i446:22,i60:2,i291:1,i442:1,i344:1,i345:1,i348:1,i152:1,i67:1,i66:1,i536:1,i535:1,i221:1,i223:1,i51:1,i53:7,i430:1,i160:1,i334:2,i165:1,i54:1,i56:1,
u2725   i281:1,i446:1,i77:1,i12:1,i192:1,i72:1,
u2726   i525:1,i542:1,i540:1,i483:1,i521:1,i385:1,i86:1,i300:1,i445:1,i446:1,i178:1,i291:1,i304:1,i293:1,i442:1,i421:1,i176:1,i342:1,i324:1,i549:1,i278:1,i348:1,i115:1,i151:1,i230:1,i153:1,i79:1,i394:1,i534:1,i353:1,i535:1,i192:2,i390:1,i352:1,i94:1,i479:1,i73:1,i435:1,i140:1,i52:1,i455:1,i559:1,i263:1,i313:1,i160:1,i334:1,i518:1,
u2727   i446:3,i468:2,
u2728   i502:1,i364:1,i194:2,i536:1,i191:1,i9:1,i192:1,i360:1,i280:1,i446:1,i455:2,i229:1,i431:1,i40:1,i247:1,i331:1,i250:1,i313:1,i160:1,i114:1,i151:1,
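Each output line is `user_id`, whitespace, then comma-separated `item_id:score` pairs with a trailing comma. The next job has to read this format back in, which can be sketched with a small parser (the `ScoreLineParser` name is illustrative, not part of the original code):

```java
import java.util.*;

// Parses one line of the step-2 output, e.g. "u2727\ti446:3,i468:2,",
// back into an item -> score map.
public class ScoreLineParser {
    static Map<String, Integer> parse(String line) {
        Map<String, Integer> scores = new LinkedHashMap<>();
        String[] parts = line.split("\\s+");      // parts[0] = user_id
        for (String pair : parts[1].split(",")) { // trailing comma yields no empty token
            if (pair.isEmpty()) continue;
            String[] kv = pair.split(":");
            scores.put(kv[0], Integer.parseInt(kv[1]));
        }
        return scores;
    }
}
```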
